
feat: hypre improvements #3339

Merged 49 commits into develop on Nov 14, 2024
Conversation

@victorapm (Contributor) commented Sep 9, 2024

  • Replace internal GEOS SDC implementation with hypre's
  • Remove tracking of certain SDC times
  • Add hypre log level support
  • Update hypre umpire pools naming
  • Add missing coarse solver options in AMG
  • Remove unused SDC variables in GEOS data structures
  • Add separateComponents option to mechanics solver setup in MGR
  • Move addCommaSeparators to StringUtilities
  • Add unknowns and nonzeros info to the linear solver message, e.g.:
    `Linear Solver | Success | Unknowns: 645,162 | Nonzeros: 42,898,950 | Iterations: 39 | Final Rel Res: 9.4042e-05 | Setup Time: 0.538 s | Solve Time: 0.933 s`
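The unknown and nonzero counts in the message above are formatted with comma separators. A minimal sketch of what a helper like `addCommaSeparators` might look like (illustrative only, not the actual GEOS `StringUtilities` implementation):

```cpp
#include <cstddef>
#include <string>

// Hypothetical sketch: format an integer with comma thousands separators,
// e.g. 42898950 -> "42,898,950". The real GEOS signature may differ.
std::string addCommaSeparators( long long n )
{
  std::string s = std::to_string( n );
  // Skip a leading minus sign when counting digit groups.
  std::size_t const start = ( !s.empty() && s[0] == '-' ) ? 1 : 0;
  // Insert a comma every three digits, walking from the right.
  for( std::size_t i = s.size(); i > start + 3; )
  {
    i -= 3;
    s.insert( i, "," );
  }
  return s;
}
```

With this, the solver message line can be assembled with ordinary string concatenation rather than manual digit grouping at each call site.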

Requires GEOS-DEV/thirdPartyLibs#286

@victorapm victorapm self-assigned this Sep 9, 2024
@ryar9534 (Contributor) commented Sep 10, 2024

Here is a quick test I ran on a version of SPE10 with burdens and compositional poromechanics, though with homogeneous properties (so it's just a high-perm slab between two low-perm boxes). It should be about 40 million DoF, and I ran on Dane.

Here were the results with develop from a few weeks ago:

| MPI ranks | Total run time (s) | Linear solver setup (s) | Linear solver solve (s) | Total GMRES iterations (always 43 total Newton iters) |
|---|---|---|---|---|
| 64 | 2087 | 523 | 1966 | 1850 |
| 128 | 1451 | 343 | 1381 | 1833 |
| 256 | 963 | 234 | 662 | 1890 |
| 512 | 624 | 182 | 404 | 1974 |

Here are the results with the new hypre:

| MPI ranks | Total run time (s) | Linear solver setup (s) | Linear solver solve (s) | Total GMRES iterations (always 43 total Newton iters) |
|---|---|---|---|---|
| 64 | 1568 | 233 | 1162 | 2281 |
| 128 | 1115 | 139 | 865 | 2229 |
| 256 | 723 | 85 | 572 | 2293 |
| 512 | 470 | 85 | 347 | 2366 |

Note that this does not double the number of nodes for each test; I will do that now.

@ryar9534 (Contributor) commented Sep 11, 2024

Old results for heterogeneous SPE10 compositional poromechanics on CPU (Dane):

| Dane cores (MPI ranks) | Total run time (s) | Linear solver setup (s) | Linear solver solve (s) | Time steps / nonlin iters / lin iters |
|---|---|---|---|---|
| 2 (224) | 2409 | 488 | 1764 | 24/76/3464 |
| 4 (448) | 1440 | 380 | 977 | 24/76/3862 |
| 8 (896) | 1075 | 447 | 583 | 24/76/4286 |

New results for heterogeneous SPE10 compositional poromechanics on CPU (Dane):

| Dane cores (MPI ranks) | Total run time (s) | Linear solver setup (s) | Linear solver solve (s) | Time steps / nonlin iters / lin iters |
|---|---|---|---|---|
| 2 (224) | 1770 | 185 | 1425 | 24/76/3882 |
| 4 (448) | 952 | 152 | 719 | 24/76/3960 |
| 8 (896) | 686 | 225 | 417 | 24/76/4228 |

@victorapm victorapm marked this pull request as ready for review October 4, 2024 20:12
```cpp
HYPRE_SetUmpirePinnedPoolName( "HYPRE_PINNED" );
#endif

HYPRE_SetLogLevel( getenv( "HYPRE_LOG_LEVEL" ) ? atoi( getenv( "HYPRE_LOG_LEVEL" ) ) : 0 );
```
Contributor:
can it be set by linear solver log level?

@victorapm (Contributor Author) commented Oct 4, 2024:
Yes, but I did it this way because the information provided here is mainly for developers. Other libraries such as umpire and rocsparse work this way as well (using an env variable). I am open to changing the strategy here if needed.

Contributor:
sounds reasonable, thanks, no need to change

Collaborator:
let's not use environment variables please.

Contributor Author:
I can remove this. What is the issue with env vars again?

Collaborator:
I personally don't like them. It's hard to keep track of who sets them. Also we don't really do it for anything else in GEOS.

@victorapm (Contributor Author) commented Oct 28, 2024:
Here's a place where it's being used:

```cpp
int columns = getenv( "COLUMNS" ) ? atoi( getenv( "COLUMNS" ) ) : 120;
```

HYPRE_LOG_LEVEL is an option intended for developers only. It's very convenient to toggle on/off as an env var compared to an XML input. I don't see any harm in using an env var here specifically, but I'm happy to remove it if that's the best solution.
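The getenv-with-default pattern discussed above can be wrapped in a small helper. A minimal sketch (the name `getEnvInt` is hypothetical, not a GEOS API), using `strtol` instead of `atoi` so that a malformed value falls back to the default rather than silently becoming 0:

```cpp
#include <cstdlib>

// Hypothetical helper: read an integer environment variable, returning
// defaultValue when the variable is unset or does not parse as an integer.
int getEnvInt( const char * name, int defaultValue )
{
  const char * value = std::getenv( name );
  if( value == nullptr )
  {
    return defaultValue;
  }
  char * end = nullptr;
  long const parsed = std::strtol( value, &end, 10 );
  // Accept only a fully consumed, nonempty numeric string.
  return ( end != value && *end == '\0' ) ? static_cast< int >( parsed ) : defaultValue;
}
```

With such a helper, the log-level call would read `HYPRE_SetLogLevel( getEnvInt( "HYPRE_LOG_LEVEL", 0 ) );`, keeping the env-var lookup in one audited place if the team later decides to allow it.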

@victorapm victorapm added the flag: requires rebaseline Requires rebaseline branch in integratedTests label Nov 8, 2024
@victorapm (Contributor Author) commented:
@castelletto1 @CusiniM @rrsettgast Can you guys take it from here? It seems there's nothing else I can do. Thanks!

@CusiniM (Collaborator) commented Nov 12, 2024:

> @castelletto1 @CusiniM @rrsettgast Can you guys take it from here? It seems there's nothing else I can do. Thanks!

Is this basically ready? I mean, it clearly needs an LvArray PR and a rebaseline, but is it ready otherwise?

@victorapm (Contributor Author) commented:
Right, it's ready. It needs an LvArray PR and an LC build, which I can't do. The rebaseline is due to a field value change (amgNumFunctions).

@paveltomin paveltomin merged commit e3904f2 into develop Nov 14, 2024
23 of 25 checks passed
@paveltomin paveltomin deleted the feature/paludettomag1/hypre-sdc branch November 14, 2024 21:26
Labels
- ci: run code coverage (enables running of the code coverage CI jobs)
- ci: run CUDA builds (allows triggering costly CUDA jobs)
- ci: run integrated tests (allows running the integrated tests in GEOS CI)
- flag: requires rebaseline (requires rebaseline branch in integratedTests)
6 participants