Dealing with Dask memory issues in the run script, #2 #87
Following up on PR #85, I noticed that despite those changes I was again seeing memory warnings, and no (or slow) progress, when running `run_og_usa.py` with OG-Core 0.11.0 (which increases the size of several objects in the `Specifications` class object that contains the model parameters). After some profiling, I noticed that the "bytes stored" in the Dask processes became quite large during tax function estimation and remained quite high after that task completed. This PR helps deal with that issue by closing and deleting the Dask client after tax function estimation and then instantiating a new client before the model solution algorithm is called. This seems to avoid the memory issues.
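For reference, the client-recycling pattern described above can be sketched as a small helper. This is a minimal illustration, not the actual `run_og_usa.py` code: the `make_client`, `estimate`, and `solve` callables are placeholders, and in practice `make_client` would be something like `lambda: distributed.Client(n_workers=N)` so that `client.close()` releases the workers (and their stored bytes) before the solution phase gets a fresh client.

```python
from typing import Any, Callable


def run_with_fresh_clients(
    make_client: Callable[[], Any],
    estimate: Callable[[Any], Any],
    solve: Callable[[Any, Any], Any],
) -> Any:
    """Run estimation and solution with separate clients.

    Closing and deleting the first client between the two phases lets
    the scheduler/workers release memory accumulated during estimation,
    instead of carrying it into the model solution step.
    """
    # Phase 1: tax function estimation on its own client.
    client = make_client()
    try:
        tax_params = estimate(client)
    finally:
        client.close()  # release workers and their stored bytes
        del client      # drop the Python reference as well

    # Phase 2: model solution on a brand-new client.
    client = make_client()
    try:
        return solve(tax_params, client)
    finally:
        client.close()
```

With Dask this would be called as, e.g., `run_with_fresh_clients(lambda: Client(n_workers=4), estimate_fn, solve_fn)`; the same shape works with any client object exposing a `close()` method.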