Subdivide tasks to prevent Luajit OOM #122
Comments
Or use a non-LuaJIT version instead :/
+1 If we know LuaJIT can only handle, say, 50,000 nodes, then a selection of 1,000,000 could be saved into 20 temporary files and then concatenated into a single file. This feature would be so nice. Using a non-LuaJIT version would only give you 4x the area, whereas @Fixer-007's original solution would give you an essentially unlimited save size; perhaps the only limit is that the concatenated file must be less than 1 GB.
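A minimal Lua sketch of that idea, assuming WorldEdit exposes `worldedit.serialize(pos1, pos2)` for a region (an assumption, not checked against the mod) and that simply appending the serialized pieces is acceptable for the target file format; `save_in_slabs`, the slab height, and the output handling are all illustrative:

```lua
-- Sketch: serialize a large region in horizontal slabs so no single serialize
-- call has to hold the whole selection in memory, then append each slab to one
-- output file.
-- Assumptions (not verified against the mod): worldedit.serialize(pos1, pos2)
-- returns the serialized data for that region, and appending the pieces
-- back-to-back is acceptable for whatever format the caller needs.
local function save_in_slabs(pos1, pos2, out_path, slab_height)
	slab_height = slab_height or 16  -- illustrative value, not a measured LuaJIT limit
	local out = assert(io.open(out_path, "wb"))
	for y = pos1.y, pos2.y, slab_height do
		local slab1 = {x = pos1.x, y = y, z = pos1.z}
		local slab2 = {x = pos2.x, y = math.min(y + slab_height - 1, pos2.y), z = pos2.z}
		local data = worldedit.serialize(slab1, slab2)
		out:write(data)
		out:write("\n")
	end
	out:close()
end
```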
Just seen this while browsing the issues here. I've implemented a `//subdivide` command in WorldEditAdditions. The documentation for the command can be found here: https://github.com/sbrl/Minetest-WorldEditAdditions/blob/master/Chat-Command-Reference.md#subdivide-size_x-size_y-size_z-cmd_name-args

I'm still tweaking and refining it, but it seems pretty stable. I'm currently chasing down an issue whereby it gets stuck a random percentage of the way through, but it seems that this only occurs in low-resource environments (i.e. on a Raspberry Pi 3B+). So long as you have plenty of RAM, it should work fine. In short, use the command like so:
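(A hypothetical invocation following the syntax in the linked reference, with `fixlight` standing in for whatever command you want to run over each chunk and `10 10 10` as an arbitrary chunk size:)

```
//subdivide 10 10 10 fixlight
```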
This would run the wrapped command over each chunk of the subdivided region, one chunk at a time.
Sometimes, if you are doing something big with WorldEdit, you may end up with a LuaJIT OOM. I wonder if it is possible for WorldEdit to divide big tasks into smaller ones to avoid OOMs... (like dividing a big volume into smaller chunks and processing them one by one).
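A rough Lua sketch of what this is asking for; `process_chunk` is a hypothetical stand-in for whatever WorldEdit operation is being run on each sub-volume, and the chunk edge length is arbitrary:

```lua
-- Sketch: walk a large selection in fixed-size sub-volumes and hand each one to
-- the underlying operation, so LuaJIT only ever allocates for one small chunk
-- at a time. process_chunk(c1, c2) is a hypothetical callback, not a real
-- WorldEdit function.
local CHUNK = 16  -- arbitrary edge length per sub-volume

local function for_each_chunk(pos1, pos2, process_chunk)
	for x = pos1.x, pos2.x, CHUNK do
		for y = pos1.y, pos2.y, CHUNK do
			for z = pos1.z, pos2.z, CHUNK do
				process_chunk(
					{x = x, y = y, z = z},
					{
						x = math.min(x + CHUNK - 1, pos2.x),
						y = math.min(y + CHUNK - 1, pos2.y),
						z = math.min(z + CHUNK - 1, pos2.z),
					}
				)
			end
		end
	end
end
```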