Hi,

I am running Sylvan on machines with at least 32GB RAM. I set the memory cap in the function sylvan_set_limits to 28GB (table ratio 3, init ratio 5). However, Sylvan exits with the "BDD Unique table full" error. I checked the maximum resident set size (as reported by the time command) and it was always at most ~15GB. In other words, I suspect Sylvan is not using the full available RAM. Is there a workaround?
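For reference, my initialisation looks roughly like this (a minimal sketch; the exact Lace start-up calls depend on the Sylvan version):

```c
#include <sylvan.h>

int main()
{
    // Start Lace (Sylvan's work-stealing framework); 0 workers = auto-detect.
    // Older versions use lace_init()/lace_startup() instead of lace_start().
    lace_start(0, 0);

    sylvan_set_limits(28ULL << 30, // memory cap: 28 GiB for nodes table + cache
                      3,           // table ratio: unique table 2^3 = 8x the cache
                      5);          // init ratio: start 2^5 = 32x smaller, grow on demand
    sylvan_init_package();
    sylvan_init_bdd();

    // ... BDD operations ...

    sylvan_quit();
    lace_stop();
    return 0;
}
```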
Thanks,
Aditya
Sylvan always picks, for the table and the cache, the largest power-of-two sizes that fit into the given amount of memory (and that satisfy the given cache-to-nodes ratio). So yes, it is indeed true that Sylvan does not make use of all the space you have available.
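To illustrate with your numbers: assuming the per-entry sizes used in the current sources (24 bytes per node, 36 bytes per cache entry; check sylvan_common.c for your version), the sizing rule behaves roughly like this sketch:

```c
#include <stdio.h>
#include <stddef.h>

// Mimics the sizing loop in sylvan_set_limits: both sizes are doubled
// as long as doubling still fits under the memory cap.
int main()
{
    size_t memorycap = 28ULL << 30; // 28 GiB cap
    size_t max_t = 1 << 3;          // table ratio 3: table 8x larger than cache
    size_t max_c = 1;
    size_t cur = max_t * 24 + max_c * 36;
    while (2 * cur < memorycap) {
        max_t *= 2;
        max_c *= 2;
        cur *= 2;
    }
    printf("table: %zu nodes, cache: %zu entries, total: %.2f GiB\n",
           max_t, max_c, cur / (double)(1ULL << 30));
    return 0;
}
```

With a 28 GiB cap this ends at roughly 14.25 GiB (2^29 nodes plus 2^26 cache entries), which matches the ~15 GB resident set size you observed.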
Whether there is a workaround is a good question. You could try to change the sylvan_set_limits function so that it is much tighter, but I am unsure whether that breaks any of the invariants in the rest of the code. Maybe it only breaks table resizing, in which case you may want to try it with an init ratio of 0 and hope that the added cache misses and memory initialisation do not suddenly introduce a major slowdown.
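As a purely hypothetical, untested sketch of what "tighter" could mean: after the doubling loop, the unique table alone could be grown a further step whenever that still fits, keeping both sizes powers of two but shifting the effective table ratio:

```c
// Untested idea, not current Sylvan code: after the doubling loop in
// sylvan_set_limits, grow only the unique table while it still fits.
// Both sizes remain powers of two; the effective table ratio shifts.
while (max_t * 2 * 24 + max_c * 36 <= memorycap) {
    max_t *= 2;
}
```

For a 28 GiB cap with table ratio 3 this would raise the allocation from ~14.25 GiB to ~26.25 GiB, but again, I do not know whether the rest of the code relies on the requested ratio.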
Thank you. For now I just increased the memory cap to ~32GB, which puts it slightly above the nearest power-of-two threshold, and now most of the RAM is being used. However, it would be good to know whether sylvan_set_limits can be made tighter without breaking anything else.
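(Assuming the sizing sketch above, that makes sense: the next doubling step needs the cap to exceed about 28.5 GiB, so a ~32GB cap lets the table and cache grow from ~14.25 GiB to ~28.5 GiB.)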