I'm using blocky with ~19 blacklists, and every refreshPeriod I see a high memory usage spike (~50 MB -> ~150 MB -> ~50 MB).
I try to run all my pods with memory limits to avoid OOM on the HV, and these memory spikes sometimes lead to blocky being killed by K8s.
Can we introduce some way to limit how many lists are downloaded at the same time?
Afaik currently (
blocky/lists/list_cache.go
Line 145 in f8b6e59
) we just go through all lists and try to fetch them at the same time. With a big number of lists, or a big number of entries per list, we can hit this memory spike.
If we were able to limit how many lists are downloaded at the same time, it should help.
EDIT: I see that there are already some download options (e.g. downloadTimeout); can we just add downloadConcurrency? :)

Yes, you're right. Blocky downloads all lists per group simultaneously. Each newly downloaded list is processed in-memory, and the cache content is replaced at the end. That means we temporarily need double the amount of memory (to hold the old version and to process the new one).
It would make sense to introduce a new parameter "downloadConcurrency" or something similar.
You can also try to create multiple groups with fewer lists each; this should also help.
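For illustration, the proposed option could sit next to the existing downloadTimeout in the blocking section of the config. This is only a sketch of the proposal: downloadConcurrency is hypothetical at this point, and the surrounding keys are assumptions about the config layout, not confirmed here.

```yaml
blocking:
  downloadTimeout: 4m      # existing download option mentioned above
  downloadConcurrency: 4   # proposed: fetch at most 4 lists at once
```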
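The bounded-concurrency idea discussed above can be sketched with a plain counting semaphore (a buffered channel). This is a minimal, self-contained illustration, not blocky's actual code: `downloadAll`, `fetch`, and `maxConcurrency` are made-up names standing in for the real list refresh logic.

```go
package main

import (
	"fmt"
	"sync"
)

// downloadAll fetches every URL, but allows at most maxConcurrency
// fetches in flight at any moment. fetch stands in for the real
// HTTP download of a single list.
func downloadAll(urls []string, maxConcurrency int, fetch func(string) string) []string {
	sem := make(chan struct{}, maxConcurrency) // counting semaphore
	results := make([]string, len(urls))
	var wg sync.WaitGroup
	for i, u := range urls {
		wg.Add(1)
		go func(i int, u string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a slot (blocks when full)
			defer func() { <-sem }() // release the slot
			results[i] = fetch(u)
		}(i, u)
	}
	wg.Wait()
	return results
}

func main() {
	urls := []string{"list1", "list2", "list3", "list4"}
	got := downloadAll(urls, 2, func(u string) string { return "data:" + u })
	fmt.Println(got)
}
```

With a limit of 2, only two lists are held and processed in memory at once, which caps the peak instead of letting all ~19 downloads run simultaneously.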