
Limit maximum CLTV delta during routing. #1234

Merged
jkczyz merged 2 commits into lightningdevkit:main from tnull:limit_max_cltv_delta on Jan 20, 2022

Conversation

tnull
Contributor

@tnull commented Jan 11, 2022

This PR attempts to address #996. As this is my first contact with the ~950 lines of get_route, I hope that I didn't mess up too much. I also fixed some typos I found while going through the code.

Three remarks though:

  1. Currently, a lot of unit tests fail. This seems to be because many of the cltv_expiry_delta values supplied in the dummy graph data are chosen very high. If everyone agrees the failures don't stem from something else, I could change them to saner values, though I'm not entirely sure yet what those should be.
  2. How should we choose the MAX_TOTAL_CLTV_DELTA limit? The value of 1000 seems like a good starting point; however, a decent number of nodes use values close to or higher than that (see the attached data, taken from a recent graph snapshot). These nodes might be excluded from any payment paths, which could impair connectivity and payment success.
  3. In order to be minimally invasive, I currently set the MAX_TOTAL_CLTV_DELTA limit inline. But maybe this parameter should be configurable, e.g., by defining it as part of RouteParameters or a new RoutingConfig in config.rs? (A rough sketch of that idea follows below.)
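
For concreteness, here is a rough sketch of the configurable shape I have in mind. All names (`PaymentParameters`, `max_total_cltv_expiry_delta`, `hop_within_cltv_limit`) and the 1008-block default are placeholders for illustration, not the final API:

```rust
// Illustrative sketch only: names and the 1008-block default are assumptions,
// not the code actually added by this PR.

/// Example default cap on the cumulative CLTV delta across all hops of a path.
/// 1008 blocks is roughly one week of block time.
const DEFAULT_MAX_TOTAL_CLTV_EXPIRY_DELTA: u32 = 1008;

/// Hypothetical per-payment routing parameters carrying the configurable cap.
struct PaymentParameters {
    /// Upper bound on the sum of `cltv_expiry_delta` over all hops of a path.
    max_total_cltv_expiry_delta: u32,
}

impl Default for PaymentParameters {
    fn default() -> Self {
        Self { max_total_cltv_expiry_delta: DEFAULT_MAX_TOTAL_CLTV_EXPIRY_DELTA }
    }
}

/// Returns `true` if extending the current partial path by a hop contributing
/// `hop_cltv_expiry_delta` keeps the cumulative delta within the cap.
fn hop_within_cltv_limit(
    params: &PaymentParameters,
    accumulated_cltv_delta: u32,
    hop_cltv_expiry_delta: u32,
) -> bool {
    accumulated_cltv_delta
        .checked_add(hop_cltv_expiry_delta)
        .map_or(false, |total| total <= params.max_total_cltv_expiry_delta)
}

fn main() {
    let params = PaymentParameters::default();
    assert!(hop_within_cltv_limit(&params, 800, 144));   // 944 <= 1008: admissible
    assert!(!hop_within_cltv_limit(&params, 1000, 144)); // 1144 > 1008: rejected
    println!("CLTV limit checks passed");
}
```

Carrying the cap on the per-payment parameters keeps the default behavior unchanged while letting callers opt into stricter or looser limits per payment.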

Collaborator

@TheBlueMatt left a comment


This looks great! A few minor comments; it could also use a test demonstrating that the limit works (plus fixing the tests that are currently failing).

@TheBlueMatt
Collaborator

Basically looking good; needs to be configurable and needs a test for the max-expiry behavior specifically.

@jkczyz self-requested a review on January 12, 2022 at 22:43
@codecov-commenter

codecov-commenter commented Jan 13, 2022

Codecov Report

Merging #1234 (f4e57f9) into main (10204f7) will decrease coverage by 0.01%.
The diff coverage is 62.06%.

❗ Current head f4e57f9 differs from the pull request's most recent head 367ed31. Consider uploading reports for commit 367ed31 to get more accurate results.
Impacted file tree graph

@@            Coverage Diff             @@
##             main    #1234      +/-   ##
==========================================
- Coverage   90.41%   90.41%   -0.01%     
==========================================
  Files          70       70              
  Lines       38087    38116      +29     
==========================================
+ Hits        34437    34463      +26     
- Misses       3650     3653       +3     
Impacted Files Coverage Δ
lightning/src/routing/router.rs 91.89% <62.06%> (-0.17%) ⬇️
lightning/src/ln/channel.rs 89.35% <0.00%> (-0.03%) ⬇️
lightning/src/ln/functional_tests.rs 97.36% <0.00%> (+0.06%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 10204f7...367ed31. Read the comment docs.

@TheBlueMatt added this to the 0.0.105 milestone on Jan 13, 2022
@tnull
Contributor Author

tnull commented Jan 14, 2022

> Basically looking good; needs to be configurable and needs a test for the max-expiry behavior specifically.

In 0c93422, I added a simple test against both the default value and a limit of 0.
Let me know if you think more in-depth testing is desirable.
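
Sketched roughly, the two cases the test covers look like this (using the placeholder `PaymentParameters`/`hop_within_cltv_limit` names from the description above, not the actual `get_route` test harness from 0c93422):

```rust
// Hedged sketch against the placeholder helper from the earlier example,
// not the actual test added in 0c93422.
#[cfg(test)]
mod cltv_limit_tests {
    use super::*;

    #[test]
    fn zero_limit_rejects_any_nonzero_hop() {
        // With a cap of 0, any hop contributing a positive delta must be rejected.
        let params = PaymentParameters { max_total_cltv_expiry_delta: 0 };
        assert!(!hop_within_cltv_limit(&params, 0, 40));
    }

    #[test]
    fn default_limit_admits_typical_hop() {
        // A common per-hop delta (e.g. 40 blocks) fits well under the default cap.
        let params = PaymentParameters::default();
        assert!(hop_within_cltv_limit(&params, 0, 40));
    }
}
```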

Contributor

@jkczyz left a comment


Looks pretty good so far. Thanks for putting this together!

Collaborator

@TheBlueMatt left a comment


LGTM. A few super-trivial comments that might as well be addressed when squashing, but I'd let @jkczyz follow up first.

@tnull force-pushed the limit_max_cltv_delta branch 3 times, most recently from 266ddd6 to 6599a32 on January 19, 2022 at 10:46
@tnull
Contributor Author

tnull commented Jan 19, 2022

Rebased and squashed now, should be ready to merge.

@tnull force-pushed the limit_max_cltv_delta branch from 6599a32 to b6b3296 on January 19, 2022 at 12:03
Collaborator

@TheBlueMatt left a comment


One API comment, otherwise ACK

Contributor

@jkczyz left a comment


Yeah, looks good modulo some nits.

@tnull force-pushed the limit_max_cltv_delta branch from b6b3296 to ebecbdd on January 19, 2022 at 17:18
jkczyz previously approved these changes on Jan 19, 2022
@tnull force-pushed the limit_max_cltv_delta branch 2 times, most recently from ced9bfc to f4e57f9 on January 20, 2022 at 08:51
TheBlueMatt previously approved these changes on Jan 20, 2022
@tnull force-pushed the limit_max_cltv_delta branch from 60b4bb7 to 367ed31 on January 20, 2022 at 15:00
Contributor

@jkczyz left a comment


ACK 367ed31

@jkczyz merged commit d741fb1 into lightningdevkit:main on Jan 20, 2022
@tnull deleted the limit_max_cltv_delta branch on March 9, 2022 at 22:43