
Non-uniform processor allocation in domain-decomposed simulations #246

Draft
wants to merge 7 commits into dev
Conversation

@alexandermote (Contributor)
Opening a draft PR to get fresh eyes on my new DD code. We can now handle decomposed mesh tallies of varying sizes, and the dd_slab_reed test should pass when run with 4 processors. I'm currently working on getting dd_slab_reed and dd_cooper to pass when run with multiple processors per subdomain; from there, we should be able to add non-uniform work ratios fairly easily.
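
For context, here is a minimal sketch of what non-uniform rank-to-subdomain assignment could look like with plain mpi4py. The helper name, the largest-remainder apportionment scheme, and the example work ratios are all illustrative assumptions, not the actual MC/DC implementation:

```python
# Hypothetical sketch: assign MPI ranks to subdomains with non-uniform
# work ratios (illustrative only, not MC/DC's actual scheme).
from mpi4py import MPI

def assign_ranks_to_subdomains(n_ranks, work_ratios):
    """Split n_ranks across subdomains proportionally to work_ratios.

    Returns a list mapping each rank index to a subdomain index.
    """
    total = sum(work_ratios)
    quotas = [n_ranks * w / total for w in work_ratios]
    counts = [int(q) for q in quotas]
    # Largest-remainder rounding so the counts sum exactly to n_ranks
    by_remainder = sorted(range(len(quotas)),
                          key=lambda i: quotas[i] - counts[i],
                          reverse=True)
    for i in by_remainder[: n_ranks - sum(counts)]:
        counts[i] += 1
    mapping = []
    for domain, c in enumerate(counts):
        mapping.extend([domain] * c)
    return mapping

comm = MPI.COMM_WORLD
# e.g. 4 subdomains, the second carrying twice the work of the others
mapping = assign_ranks_to_subdomains(comm.Get_size(), [1, 2, 1, 1])
my_subdomain = mapping[comm.Get_rank()]
# Each rank can then join a subdomain-local communicator:
sub_comm = comm.Split(color=my_subdomain, key=comm.Get_rank())
```

With uniform ratios this reduces to the equal-split case (e.g. 4 processors over 4 subdomains), so the same mapping covers both the current tests and the planned non-uniform work ratios.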

@alexandermote (Contributor, Author)

Apparently I wasn't running on the most up-to-date version of dev; as a result, it looks like the DD code isn't working. I'll look into it and see if I can figure out where the issue is.

@alexandermote (Contributor, Author)

The tally values I get from this version are identical to the ones I was getting on my branch, which made me wonder whether the answer.h5 file had somehow changed. Running the dd_slab_reed problem without domain decomposition and using that output as the answer.h5 for the regression test causes the test to pass again. It's possible that my changes to the dd_slab_reed input changed its output from the existing version, but since the new output matches a non-DD simulation, I believe it is the correct answer.
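
As a sketch of that regeneration step, assuming h5py and an MC/DC-style output file (the file names and the tally dataset path here are assumptions, not the repository's actual test layout):

```python
# Hypothetical sketch: promote a non-DD run's output to the regression
# answer, then check a DD run against it. Paths/dataset names assumed.
import h5py
import numpy as np

# Copy the tally group from the non-DD reference run into answer.h5
with h5py.File("output.h5", "r") as src, h5py.File("answer.h5", "w") as dst:
    src.copy("tally", dst)

def check_against_answer(result_file, answer_file, rtol=1e-10):
    """Compare a run's tally mean against the stored regression answer."""
    with h5py.File(result_file, "r") as res, h5py.File(answer_file, "r") as ans:
        score = res["tally/flux/mean"][()]
        ref = ans["tally/flux/mean"][()]
        np.testing.assert_allclose(score, ref, rtol=rtol)
```

The point of anchoring the answer to a non-DD run is that the decomposed tallies should reproduce the undecomposed result bit-for-bit (or within tight tolerance), regardless of how the input file itself was reorganized.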

@alexandermote (Contributor, Author)

The dd_slab_reed test succeeds in Python and Numba modes on my Dane build; the tests fail on GitHub because CI ends up running with only 1 processor. I'm not sure why that's happening; my only guess is that it's a difference between launching with --mpiexec and --srun?
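
For reference, a minimal sketch of how a test harness might build the two launch commands; the `launch` helper is hypothetical, but a mismatch like this is the kind of thing that can silently drop the processor count to 1:

```python
# Hypothetical sketch: build an MPI launch command for either launcher.
# Outside a Slurm allocation, srun may fail or ignore -n depending on
# site configuration, while mpiexec -n works on a plain CI runner.
import subprocess

def launch(n_procs, script, use_srun=False):
    if use_srun:
        cmd = ["srun", "-n", str(n_procs), "python", script]
    else:
        cmd = ["mpiexec", "-n", str(n_procs), "python", script]
    return subprocess.run(cmd, check=True)
```

On a cluster like Dane the srun path runs inside an allocation, so the requested 4 processors actually materialize; a GitHub runner has no Slurm, which would explain the 1-processor behavior if the test script defaults to the wrong launcher.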
