Weak LRU Cache Map #1519
Conversation
@mhofman, I implemented your suggested "composite-key-like" solution. I tested locally that deep stacks work and that enabling/disabling them does not change CapTP's GC behaviour. PTAL when you have time. This is not urgent.
There is still a leak of the associated value if the cell falls off the cache while the key is kept around.
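For illustration only (a hypothetical naive shape, not the PR's code): this kind of leak arises whenever the cell keeps a strong reference to the value, because an application that retains the key also retains the cell through the outer WeakMap, and therefore the value, even after the cell has fallen off the recency list.

```js
// Hypothetical naive cell shape that exhibits the leak being described:
// key -> cell is weak, but cell -> value is strong, so "falling off the
// cache" (unlinking the cell from the LRU list) does not release the value
// as long as the key is still reachable somewhere.
const keyToCell = new WeakMap();
const naiveSet = (key, value) => {
  keyToCell.set(key, { value, prev: undefined, next: undefined });
};
```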
Some typing nits, but I think we need to handle the `set` operation where the cell exists but `cell.data` was nulled out. We probably need unit tests for all this eviction logic.
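A minimal sketch of the check this comment is asking for (names are assumptions, not the PR's code): a cell left behind by eviction must not be treated as a live entry, because its inner `data` WeakMap has already been dropped.

```js
// A cell is "live" only if it still owns an inner data WeakMap; a cell whose
// data was nulled out by eviction is present in the outer WeakMap but dead.
const isLiveCell = cell => cell !== undefined && cell.data !== undefined;
```

`set` would then branch on liveness rather than mere presence: a present-but-evicted cell gets a fresh inner WeakMap and is charged against the budget again, exactly like a brand new entry.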
Force-pushed from fd64167 to e85d397
Sorry I found one last issue on deletion: a budget underflow.
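One plausible shape of the guard, sketched with assumed names (`keyToCell`, `unlink`, `size`), not the actual diff: the budget should only be refunded when a live entry is actually removed, otherwise deleting a key whose cell was already evicted would decrement the count a second time and underflow it.

```js
// Deletion guard: refund budget exactly once, and only for live entries.
const lruDelete = key => {
  const cell = keyToCell.get(key);
  if (cell === undefined || cell.data === undefined) {
    return false; // absent or already evicted: nothing to refund
  }
  cell.data = undefined; // drop the inner WeakMap, releasing the value
  unlink(cell); // detach the cell from the recency list
  keyToCell.delete(key);
  size -= 1;
  return true;
};
```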
Force-pushed from 948a503 to 99073fc
Force-pushed from 99073fc to 1d7d87c
Conditional approval on unit tests being added for the LRU cache logic.
Would love to see a review from @erights to know if we got it right.
> Would love to see a review from @erights to know if we got it right.
If you two come to be confident that this solution is right, don't wait long for me to chime in. Feel free to proceed anyway. But do become confident first ;)
My only hurry is that, if this PR is right, I'd like to see it in the next endo release rather than my unfixed change.
@@ -1,3 +1,4 @@
+// @ts-check
Isn't this the default now?
It's opt-in, and `packages/ses/tsconfig.json` doesn't opt in.
Hard to develop habits if we're not consistent. Any reason we don't opt in everywhere?
The only reason is that we'd have to fix all the type errors before we can land a PR to opt in everywhere. I don't know the scope of that, nor how long that would take.
Or we could start by opting out everywhere that got broken by implicitly opting in.
One other caveat to making it the default is that the option to do that is `checkJs: true`, and it doesn't distinguish between JS in the repo and JS in node_modules. E.g. with the option on, packages/cli reports:
Errors Files
32 ../../node_modules/url/node_modules/punycode/punycode.js:8
17 ../../node_modules/url/url.js:156
5 src/endo.js:71
7 ../compartment-mapper/src/bundle-cjs.js:18
Adding `node_modules` to `"exclude"` doesn't ignore them, because they're transitively imported by modules in src. Adding `skipLibCheck: true` doesn't solve it either. It's a long-standing issue: https://github.com/microsoft/TypeScript/issues/40426
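For reference, a hedged sketch of the per-package opt-in being discussed (illustrative values only, not the repo's actual tsconfig): `checkJs` is what turns on checking of `.js` files, and as noted above, `exclude` does not suppress errors from node_modules files that are reached transitively.

```json
{
  "compilerOptions": {
    "allowJs": true,
    "checkJs": true,
    "noEmit": true
  },
  "exclude": ["node_modules"]
}
```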
I am pretty confident it is sound. I will be 100% confident once we have unit tests for the edge cases that I raised during review: cache eviction followed by re-insertion or deletion, and making sure these do not unexpectedly affect the cache size. Cache size and eviction behavior are hard to test, as it can only be done by inference: check whether inserting a new entry causes the tail entry to be evicted (when at capacity), or whether a further insertion causes an eviction.
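As a sketch of what such a by-inference test might look like, in the AVA style used elsewhere in the repo (the `makeWeakLRUCacheMap` factory, its signature, and its `has` method are assumptions for illustration; a matching cache sketch appears under the PR description below):

```js
import test from 'ava';

// `makeWeakLRUCacheMap` is assumed to be in scope (see the sketch under the
// PR description); it is a hypothetical stand-in for the cache under test.
test('evicts the least recently used entry at capacity', t => {
  const cache = makeWeakLRUCacheMap(2); // budget of 2 entries
  const k1 = {};
  const k2 = {};
  const k3 = {};
  cache.set(k1, 'a');
  cache.set(k2, 'b');
  // Inserting a third entry at capacity should push the oldest key (k1) out.
  cache.set(k3, 'c');
  t.false(cache.has(k1));
  t.true(cache.has(k2));
  t.true(cache.has(k3));
  // Re-inserting the evicted key must behave like a fresh insertion: it evicts
  // the current tail (k2), i.e. the budget has not drifted after the eviction.
  cache.set(k1, 'a again');
  t.false(cache.has(k2));
  t.true(cache.has(k3));
  t.true(cache.has(k1));
});
```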
Force-pushed from 1d7d87c to 6e013e9
Force-pushed from 6e013e9 to f95480f
Force-pushed from be4a5d9 to 411491b
Force-pushed from e021b62 to c12ec30
Refs #1513
I tracked down the object leak to where cause stacks are being attached to a given eventual-send. This pointed to a strongly-held chain of errors and their details arguments.
This weakifying of the LRUCacheMap uses an outer WeakMap instead of the strong map, and then a unique WeakMap instance per non-condemned entry in the cache. That is, given N entries in the cache, 1+N WeakMaps in total (!). This is because WeakMaps don't support composite keys, and we are avoiding using WeakRefs in the Endo implementation.
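For concreteness, a minimal sketch of that shape (assumed names and details, not the actual ses implementation): the outer WeakMap maps each key to a cell on a doubly linked recency list, each live cell owns one inner WeakMap keyed by that same key, and eviction simply drops the inner WeakMap, so a value stays reachable only while both the key is alive and the entry has not been evicted.

```js
// A sketch of a weak LRU cache map without WeakRefs or composite keys.
const makeWeakLRUCacheMap = capacity => {
  const keyToCell = new WeakMap(); // weak outer index: key -> cell
  let size = 0; // cells that still hold data, i.e. consume budget
  let head; // most recently used cell
  let tail; // least recently used cell

  const unlink = cell => {
    if (cell.prev) cell.prev.next = cell.next;
    if (cell.next) cell.next.prev = cell.prev;
    if (head === cell) head = cell.next;
    if (tail === cell) tail = cell.prev;
    cell.prev = cell.next = undefined;
  };

  const linkAtHead = cell => {
    cell.next = head;
    if (head) head.prev = cell;
    head = cell;
    if (!tail) tail = cell;
  };

  const evict = cell => {
    // The "composite-key-like" trick: dropping the per-cell WeakMap releases
    // the value even if the key (and thus the cell) stays reachable.
    cell.data = undefined;
    unlink(cell);
    size -= 1;
  };

  const get = key => {
    const cell = keyToCell.get(key);
    if (cell === undefined || cell.data === undefined) return undefined;
    unlink(cell);
    linkAtHead(cell); // touch: move to the most-recently-used position
    return cell.data.get(key);
  };

  const set = (key, value) => {
    let cell = keyToCell.get(key);
    if (cell === undefined || cell.data === undefined) {
      // New entry, or a previously evicted cell being revived.
      if (size >= capacity && tail !== undefined) evict(tail);
      if (cell === undefined) {
        cell = { prev: undefined, next: undefined, data: undefined };
        keyToCell.set(key, cell);
      }
      cell.data = new WeakMap(); // the one-per-entry inner WeakMap
      size += 1;
    } else {
      unlink(cell); // live entry: just refresh recency and value
    }
    cell.data.set(key, value);
    linkAtHead(cell);
  };

  return { get, set, has: key => get(key) !== undefined };
};
```

If the key is garbage collected, the inner WeakMap entry lets the value go even though the cell lingers on the list until it reaches the tail; if the cell is evicted first, nulling `cell.data` lets the value go even though the key survives.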
The benefit is that we no longer have a leak. Now the CapTP GC drop counts are the same as before #1513 (3 drop messages instead of 2 for `packages/captp/test/test-gc.js`).