MPI barriers in IPE? #10
Comments
Is this a problem with IPE running coupled to WAM? If it's just a performance issue then this would seem low-priority, unless the performance is terrible for some reason?
It is a low-priority issue, indeed. However, it should be addressed before code delivery, since MPI barriers do affect runtime performance and may mask code issues when run in parallel.
Rafaelle - looking at the code, MPI_Barrier is only used (during actual runtime) for writing out the IPE HDF5 'state' files. I think the barrier is in there because Joe had problems with the HDF5 files (i.e., they were bogus/corrupted) when there was no barrier. I seem to recall Joe found a few bugs in HDF5 itself... but yeah, if it's a real performance issue then we can see what can be done.
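For context, the pattern being described is roughly the following (a hedged sketch in C, not the actual IPE source; the file name, dataset name, and routine name are hypothetical): one rank writes the HDF5 state file, and a barrier keeps the other ranks from moving on before the file is complete.

```c
/* Hedged sketch, not the actual IPE code: illustrates the pattern under
 * discussion, where an MPI_Barrier fences an HDF5 state-file write so no
 * rank proceeds before the write has finished. Names are made up. */
#include <mpi.h>
#include <hdf5.h>

void write_state_file(MPI_Comm comm, const double *field, hsize_t n)
{
    int rank;
    MPI_Comm_rank(comm, &rank);

    if (rank == 0) {
        /* Serial write from rank 0 only (hypothetical layout). */
        hid_t file  = H5Fcreate("ipe_state.h5", H5F_ACC_TRUNC,
                                H5P_DEFAULT, H5P_DEFAULT);
        hid_t space = H5Screate_simple(1, &n, NULL);
        hid_t dset  = H5Dcreate2(file, "plasma_density", H5T_NATIVE_DOUBLE,
                                 space, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, field);
        H5Dclose(dset);
        H5Sclose(space);
        H5Fclose(file);
    }

    /* The barrier in question: every rank waits until the state file is
     * safely closed before continuing. This shows up as idle time in
     * profiles when ranks arrive here at different times. */
    MPI_Barrier(comm);
}
```

Seen this way, the barrier is a correctness guard rather than a performance feature; its cost is the idle time ranks spend waiting for the slowest one.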
Is it solved? Should we close it?
Active MPI barriers have been eliminated after introducing the updated HDF5 I/O layer in IPE. One barrier still remains in an unused legacy subroutine.
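The thread does not show the updated I/O layer itself, but one common way explicit barriers become unnecessary is to let parallel HDF5 do the synchronization: open the file with the MPI-IO driver and write collectively, so the collective H5Dwrite and H5Fclose already coordinate the ranks. The sketch below assumes an HDF5 build with parallel support; all names are hypothetical.

```c
/* Hedged sketch only (not the actual IPE I/O layer): a parallel HDF5 write
 * using the MPI-IO driver with collective transfers, which removes the need
 * for a hand-rolled MPI_Barrier around the state-file write. */
#include <mpi.h>
#include <hdf5.h>

void write_state_parallel(MPI_Comm comm, const double *local, hsize_t nlocal,
                          hsize_t offset, hsize_t nglobal)
{
    /* File access property list: route HDF5 through MPI-IO. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, comm, MPI_INFO_NULL);
    hid_t file = H5Fcreate("ipe_state.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
    H5Pclose(fapl);

    /* One global dataset; each rank selects its own hyperslab. */
    hid_t fspace = H5Screate_simple(1, &nglobal, NULL);
    hid_t dset   = H5Dcreate2(file, "plasma_density", H5T_NATIVE_DOUBLE,
                              fspace, H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &offset, NULL, &nlocal, NULL);
    hid_t mspace = H5Screate_simple(1, &nlocal, NULL);

    /* Collective transfer: all ranks participate; no manual barrier needed. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, mspace, fspace, dxpl, local);

    H5Pclose(dxpl);
    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Dclose(dset);
    H5Fclose(file);   /* collective close: ranks leave this call together */
}
```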
A few MPI barriers are used in the refactored IPE code.
We should discuss whether such barriers are required or could be eliminated, since they affect runtime performance.
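To make the performance claim concrete: a barrier's cost is the time the earliest-arriving ranks spend idle waiting for the slowest one. A minimal, hypothetical timing harness (not part of IPE) that measures this looks like:

```c
/* Hedged sketch (not from the IPE repository): measure how long ranks
 * actually wait at a barrier, to judge whether it matters for runtime. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double t0 = MPI_Wtime();
    MPI_Barrier(MPI_COMM_WORLD);   /* the barrier being timed */
    double wait = MPI_Wtime() - t0;

    /* Report the worst-case wait across all ranks. */
    double max_wait;
    MPI_Reduce(&wait, &max_wait, 1, MPI_DOUBLE, MPI_MAX, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("max barrier wait: %.6f s\n", max_wait);

    MPI_Finalize();
    return 0;
}
```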