
fix: lock lru-cache to v6.0.0 #170

Merged 1 commit into master on Mar 12, 2022
Conversation

wschurman (Member)

Why

lru-cache v7.x is not optimized for TTL-based caching (see isaacs/node-lru-cache#208). Instead, it is optimized to minimize GC pressure, which it achieves by pre-allocating all of its internal data structures at construction time: https://github.com/isaacs/node-lru-cache/blob/main/index.js#L119

For our use case, we know we'll be using a short TTL so we're not too worried about memory or GC being an issue. In fact, pre-allocating all the data structures ahead of time drastically slows down our server (especially our test suite).

The old v6.0.0 implementation is better aligned with our use case since it doesn't do any allocation ahead of time: https://github.com/isaacs/node-lru-cache/blame/dcd2384fd3321d86dc6ebb3cee3e9601727d1555/index.js
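To make the allocation difference concrete, here is a minimal sketch of the lazy-allocation strategy the v6-style implementation uses: nothing is allocated until an entry is actually inserted, and expired entries are dropped on read. This is an illustrative stand-in, not the lru-cache API; the class name, TTL value, and keys below are invented for the example.

```javascript
// Minimal sketch of a lazily-allocated TTL cache (illustrative only --
// not the lru-cache API). Like lru-cache v6, it allocates nothing up
// front: the backing Map grows only as entries are set().
class LazyTtlCache {
  constructor(maxAgeMs) {
    this.maxAgeMs = maxAgeMs;
    this.entries = new Map(); // grows on demand; no pre-allocated arrays
  }

  set(key, value) {
    this.entries.set(key, { value, expiresAt: Date.now() + this.maxAgeMs });
  }

  get(key) {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(key); // expired: evict lazily on read
      return undefined;
    }
    return entry.value;
  }
}

const cache = new LazyTtlCache(10_000); // short TTL, as in the PR's use case
cache.set('user:1', { name: 'ada' });
console.log(cache.get('user:1')); // { name: 'ada' }
console.log(cache.get('missing')); // undefined
```

By contrast, v7's pre-allocation pays the full `max`-sized allocation cost in the constructor, which is exactly the startup overhead the PR description calls out.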

Other (current) libraries were investigated as alternatives but seemed a bit heavy.

How

Reverse course: lock the dependency at v6.0.0 and install the matching typedefs.
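A sketch of what the resulting `package.json` dependency pin might look like. The exact typedefs package and version are assumptions for illustration (lru-cache v6 does not bundle its own types, so a `@types/lru-cache` entry from DefinitelyTyped is the usual pairing); the PR itself specifies only "6.0.0" and "correct typedefs".

```json
{
  "dependencies": {
    "lru-cache": "6.0.0"
  },
  "devDependencies": {
    "@types/lru-cache": "^5.1.1"
  }
}
```

Note the exact `6.0.0` (no `^` range), so a routine install cannot float the package up to the v7 line.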

Test Plan

Wait for CI.

codecov bot commented Mar 12, 2022

Codecov Report

Merging #170 (e15752d) into master (02d2f24) will decrease coverage by 0.00%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #170      +/-   ##
==========================================
- Coverage   96.15%   96.14%   -0.01%     
==========================================
  Files          80       80              
  Lines        1975     1972       -3     
  Branches      211      209       -2     
==========================================
- Hits         1899     1896       -3     
  Misses         75       75              
  Partials        1        1              
Flag          Coverage Δ
integration   96.14% <100.00%> (-0.01%) ⬇️
unittest      96.14% <100.00%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown. Click here to find out more.

Impacted Files Coverage Δ
...apter-local-memory/src/GenericLocalMemoryCacher.ts 100.00% <100.00%> (ø)

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 02d2f24...e15752d. Read the comment docs.

@wschurman wschurman merged commit 293868b into master Mar 12, 2022
@wschurman wschurman deleted the @wschurman/lru-cache-version branch March 12, 2022 21:29