ProtectedArray type incompatible with newer numpy #119
Comments
A reconstruction with this line removed is chugging along now. I'll update when it finishes.
This is also failing with a memory limit error, though I'm certain there's ample RAM left on the machine. The issue might be rooted more in MPI.
Another slight update: I get the same error with OpenMPI as I do with MPICH:
This is a single machine with 64 GB of RAM and 8 physical cores. The same error happens regardless of the number of cores I give MPI. I'll keep updating as I make progress.
I've confirmed the …
Regarding the memory limit issues, could you clarify what the problem is? My understanding is that your machine has 8 GB of RAM per core, and I agree this should be plenty. But the script you cited above tells pyGSTi to (roughly) limit the per-core memory usage to 2.1 GB in this line:
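The line itself was stripped from this export. A hypothetical reconstruction, assuming the older pyGSTi API's `memLimit` keyword (a per-core limit in bytes), of how a ~2.1 GB cap would be expressed:

```python
# Hypothetical reconstruction of the stripped line; the original script
# isn't shown in this export. Assuming pyGSTi's older `memLimit` keyword,
# which takes a per-core limit in bytes, ~2.1 GB would be written as:
memLimit = 2.1 * (1024)**3   # roughly 2.25e9 bytes

# It would then be passed to the GST driver, e.g. (names assumed, not verified):
# results = pygsti.do_long_sequence_gst(..., memLimit=memLimit, comm=comm)
```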
The memory limits in pyGSTi aren't super precise, so I'd set …
Oh, and I forgot to mention: the hack-fix of deleting the offending line …
I've been looking into this issue for longer than I expected. There seems to be an issue with Numpy arrays preserving their flags when getting pickled/unpickled. For example, when I run this code (using Python 3.7 and Numpy 1.18.1):
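The snippet itself was stripped from this export. A hypothetical reconstruction of the kind of check being described, round-tripping a read-only array through pickle:

```python
import pickle

import numpy as np

a = np.arange(4.0)
a.flags.writeable = False          # mark the array read-only

b = pickle.loads(pickle.dumps(a))  # round-trip through pickle

print(a.flags.writeable)           # False
print(b.flags.writeable)           # True: the WRITEABLE flag is lost
```

The unpickled array owns a fresh copy of the data and comes back writeable, so any code relying on the flag surviving serialization (e.g. across MPI workers) will misbehave.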
I find that when the … I've added some additional commits to deal with this weirdness; they should make pyGSTi less reliant on Numpy array flags and thus more robust to these sorts of issues. The latest commit to the beta branch (209cee0) should, I think, have everything up and running again using Numpy 1.18.1.
Update: Increasing the memory limit …
So the reconstruction seems to have worked. @enielse, are there any rules of thumb for how much memory one should allocate?
Glad to hear it worked with the increased memory limit! The reasons that …
To summarize: the …
Hi Erik! Sounds good; I'll stick with this rule of thumb going forward. Thanks for summarizing things for future readers. I'm going to mark this issue as solved by 209cee0.
Trying to run a GST reconstruction with MPI and numpy > 1.15 seems to cause issues with the ProtectedArray class.
https://github.com/pyGSTio/pyGSTi/blob/3ea46e3837d8a6908367c04312ec1dddbef3e2bd/packages/pygsti/baseobjs/protectedarray.py#L48
Based on similarity with:
pandas-dev/pandas#24839
spyder-ide/spyder#8582
https://html.developreference.com/article/10573243/Anaconda+Pandas+breaks+on+reading+hdf+file+on+Python+3.6.x
It seems like downgrading to numpy 1.15.0 should fix this issue. I'm still testing that.
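For context, the linked issues appear to share a root cause: starting with NumPy 1.16, setting the WRITEABLE flag to True on an array backed by a read-only buffer raises a ValueError, where older releases were more permissive. A minimal sketch of that behavior (an illustration, not pyGSTi's actual code path):

```python
import numpy as np

buf = bytes(8)                          # an immutable buffer
a = np.frombuffer(buf, dtype=np.int64)  # array backed by read-only memory

try:
    a.flags.writeable = True            # NumPy >= 1.16 refuses this
except ValueError as err:
    print("refused:", err)
```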
To Reproduce
mpiexec -n 3 python3 "example_files/mpi_example_script.py"
Expected behavior
A GST reconstruction using mpi
Environment (please complete the following information):