Autocomplete LRU Cache Design Choice #8315
Ayyanaruto started this conversation in Feature Requests
Replies: 1 comment 1 reply
Hey Continue team 👋
I was going through the Autocomplete implementation and noticed that the LRU cache is backed by SQLite queries. I was wondering: why not use an in-memory cache instead? For example, a simple Map in Node.js could make get/put operations faster by skipping the DB round trip entirely.
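Something like this minimal sketch (hypothetical code, not Continue's actual implementation; it relies on the fact that a JavaScript Map iterates in insertion order):

```ts
// A tiny LRU cache built on Map's insertion-order guarantee.
class LruCache<K, V> {
  private map = new Map<K, V>();

  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Re-insert so this entry becomes the most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  put(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    // Evict the least recently used entry: the oldest insertion.
    if (this.map.size > this.maxSize) {
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }
}

// Usage: cache a completion prefix -> suggestion, all in memory.
const cache = new LruCache<string, string>(1000);
cache.put("function fib(", "n: number): number { ... }");
console.log(cache.get("function fib(")); // "n: number): number { ... }"
```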
Or maybe a hybrid approach could work: keep recent entries in memory and periodically sync them to SQLite in batches, if persistence across restarts is needed.
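Roughly a write-behind scheme, sketched below (also hypothetical; I'm assuming the better-sqlite3 package purely for illustration, and the file/table names are made up):

```ts
import Database from "better-sqlite3";

// Reuse the LruCache sketch above as the hot, in-memory tier.
const memory = new LruCache<string, string>(1000);
const dirty = new Map<string, string>(); // writes not yet persisted

const db = new Database("autocompleteCache.db");
db.exec("CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, value TEXT)");

const upsert = db.prepare(
  "INSERT INTO cache (key, value) VALUES (?, ?) " +
    "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
);
// better-sqlite3 runs the whole loop in a single transaction.
const flush = db.transaction((entries: [string, string][]) => {
  for (const [key, value] of entries) upsert.run(key, value);
});

function put(key: string, value: string): void {
  memory.put(key, value); // fast path: no DB query per write
  dirty.set(key, value); // persisted later in one batch
}

function get(key: string): string | undefined {
  const hit = memory.get(key);
  if (hit !== undefined) return hit;
  // Memory miss: fall back to SQLite and repopulate the LRU.
  const row = db
    .prepare("SELECT value FROM cache WHERE key = ?")
    .get(key) as { value: string } | undefined;
  if (row) memory.put(key, row.value);
  return row?.value;
}

// Flush dirty entries every few seconds instead of on every write.
setInterval(() => {
  if (dirty.size === 0) return;
  flush([...dirty.entries()]);
  dirty.clear();
}, 5000);
```

The obvious trade-off is durability: anything still in the dirty buffer is lost if the process dies before a flush.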
Just curious about the thought process behind this design choice 👀
-
I think the original concern was memory bloat in larger repos, but that seems like a non-issue with good design. It seems like an interesting potential improvement to me.