Hi @binjie09. My PR #38 fixes this by requiring the cache to be passed in explicitly. That way you can control it however you like. You could even implement a custom Map that removes old entries once some limit is reached.
If you can't wait for the PR to get merged, I've published my fork as gpt-tokenizer.
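In case it helps anyone hitting this before the PR lands, here is a minimal sketch of the kind of eviction-aware cache described above. It assumes the encoder accepts any Map-compatible object (get/set/has) passed in explicitly; `BoundedMap` and `maxSize` are illustrative names, not part of the library's API.

```js
// Minimal sketch: a Map that evicts its oldest entry once a size limit
// is reached, so the BPE cache cannot grow without bound.
// Assumption: the encoder only uses the cache as a plain Map (get/set/has).
class BoundedMap extends Map {
  constructor(maxSize = 50000) {
    super();
    this.maxSize = maxSize;
  }

  set(key, value) {
    // Maps preserve insertion order, so the first key is the oldest.
    // Drop it before inserting a new key once the limit is reached.
    if (!this.has(key) && this.size >= this.maxSize) {
      const oldestKey = this.keys().next().value;
      this.delete(oldestKey);
    }
    return super.set(key, value);
  }
}

// Hypothetical usage once a cache can be passed in explicitly:
// const cache = new BoundedMap(50000);
// const tokens = encode(text, cache);
```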
I use a large amount of Chinese text in my GPT service, and the Chinese phrases cached here occupy a significant amount of memory.
After running for one day, the process occupies more than 1 GB of memory, which at first made me think there was a memory leak in my own code.
GPT-3-Encoder/Encoder.js, lines 87 to 153 at 9df47fc