Add regionally-refined grid over TWP #1693

Merged: AaronDonahue merged 4 commits into master from brhillman/atm/add-twp-rrm on Aug 24, 2017

Conversation

@brhillman (Contributor) commented Aug 2, 2017

Add configuration for a new atmosphere and land grid with regional refinement from ne30 to ne120 over the Tropical Western Pacific region. All input files needed to run the new configuration have been uploaded to the input data repository. The only supported compset is FC5AV1C-04P2 (72 levels).

Fixes #1357

[BFB]

Add CIME configuration for running the new TWP RRM grid.
RRM grids using the se dycore all start with ne0, but CAM_DYCORE was
previously set explicitly only when the atmosphere grid name started with
ne[1-9]. Matching ne[0-9] instead sets CAM_DYCORE properly for all RRM
grids (see the sketch below). Fixes #1357.
Add CAM configuration and defaults for running the new TWP RRM grid.
Add default inputdata paths for running CLM on the new TWP RRM grid.
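As a rough illustration of the pattern change, here is a minimal sketch that uses Python regexes to stand in for CIME's grid-match expressions; the grid names below are illustrative, not the actual grid aliases:

```python
import re

# Illustrative grid names: two uniform SE grids plus an RRM grid, whose
# name starts with "ne0" (the TWP label here is hypothetical).
grids = ["ne30np4", "ne120np4", "ne0np4_twp_x4v1"]

old_pattern = re.compile(r"ne[1-9]")  # previous match: misses RRM grids
new_pattern = re.compile(r"ne[0-9]")  # fixed match: catches ne0* as well

for grid in grids:
    print(grid, bool(old_pattern.match(grid)), bool(new_pattern.match(grid)))
# ne30np4 True True
# ne120np4 True True
# ne0np4_twp_x4v1 False True
```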
@brhillman requested review from mt5555 and oksanaguba August 2, 2017 16:18
@rljacob assigned mt5555 and AaronDonahue and unassigned mt5555 Aug 5, 2017
@golaz added this to the v1.0beta2 milestone Aug 7, 2017
brhillman added a commit that referenced this pull request Aug 8, 2017
Set RTM grid to null in grid longname for RRM configurations to turn off
the river runoff model. Required input files do not exist for running
RTM with the RRM grids, so previously this required manually setting
RTM_MODE=NULL for RRM grids. Once PRs #1629 and #1693 are both merged, the
supported RRM grids (CONUS, TWP, and ENA) should run without further
user intervention. For now, the user still needs to set CAM_DYCORE=se,
due to issue #1357.
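To make "set the RTM grid to null in the grid longname" concrete: CIME grid longnames tag each component grid with a prefix (a% atmosphere, l% land, r% river runoff, and so on). A minimal sketch, with a hypothetical longname:

```python
# Hypothetical grid longname; only the r% (runoff) entry matters here.
longname = "a%ne0np4_twp_l%ne0np4_twp_oi%null_r%r05_m%gx1v6_g%null_w%null"

# Turning RTM off in the grid definition amounts to swapping the runoff
# grid for "null", rather than setting RTM_MODE=NULL case by case.
no_rtm = longname.replace("r%r05", "r%null")
print(no_rtm)
```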
brhillman added a commit that referenced this pull request Aug 10, 2017
@AaronDonahue (Contributor)

@singhbalwinder, @rljacob, I've finished nearly everything needed to push this to next, but I have a few fails in the acme_developer suite:

```
FAIL SMS_Ln9.ne4_ne4.FC5AV1C-L.anvil_intel.cam-outfrq9s time=59
FAIL SMS_D_Ln5.ne4_ne4.FC5.anvil_intel RUN time=47
FAIL SMS_D_Ln5.ne4_ne4.FC5AV1C-L.anvil_intel RUN time=58
FAIL SMS.f09_g16_a.IGCLM45_MLI.anvil_intel RUN time=35
FAIL ERS_Ln9.ne4_ne4.FC5AV1C-L.anvil_intel.cam-rtm_null RUN time=45
```

Are these fails we would expect, so that I can just go ahead and push to next?

@rljacob (Member) commented Aug 11, 2017

This PR should not cause any existing tests to fail. Did those tests pass with master for the platform you're using?

@singhbalwinder (Contributor)

@AaronDonahue : Like @rljacob mentioned, this PR should not break existing tests. I would first look at one of the failing tests to see why it failed. If I can't see an obvious reason, I would run the same test using current master to see if it fails there. I would choose SMS_D_Ln5.ne4_ne4.FC5 for this, as it is the simplest test case.

It goes without saying, but if this test also fails on current master, then this PR obviously didn't break it.

@AaronDonahue (Contributor)

@singhbalwinder, thank you, I will try master. I did check the errors and I think I understand them. All of the ne4 mesh tests failed because the land model can't handle 720 cores: that is more cores than there are gridpoints in the land mesh. So that has nothing to do with the PR, just with the submission using too many cores.
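For scale, a back-of-the-envelope count of ne4 columns (a sketch; it assumes the standard GLL-point formula for cubed-sphere spectral-element grids):

```python
# Unique GLL points on a cubed-sphere SE grid: 6 * ne^2 * (np - 1)^2 + 2.
ne, np_order = 4, 4
columns = 6 * ne**2 * (np_order - 1) ** 2 + 2
print(columns)  # 866 total columns; the land mesh only uses the subset
                # over land, so a 720-task decomposition can easily exceed
                # the number of active land gridpoints
```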

The final FAIL, SMS.f09_g16_a.IGCLM45_MLI.anvil_intel, had something to do with elevation classes, which again I don't think is associated with this PR.

@singhbalwinder (Contributor)

> @singhbalwinder, thank you, I will try master. I did check the errors and I think I understand them. All of the ne4 mesh tests failed because the land model can't handle 720 cores: that is more cores than there are gridpoints in the land mesh. So that has nothing to do with the PR, just with the submission using too many cores.

Thanks. Yes, that would break the simulation and it is not related to this PR, so it should be fine. Just to be sure, you can run SMS_D_P32x1_Ln5.ne4_ne4.FC5 (or a variant of this). This test ensures that each component gets only 32 tasks (_P32x1).
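As a reference for the naming scheme, a rough sketch of how a test name of this shape decomposes (illustrative parsing only, not CIME's actual implementation):

```python
import re

name = "SMS_D_P32x1_Ln5.ne4_ne4.FC5"
testcase, grid, compset = name.split(".")
print(f"grid={grid}, compset={compset}")
for mod in testcase.split("_")[1:]:
    if mod == "D":
        print("debug build")
    elif m := re.fullmatch(r"P(\d+)x(\d+)", mod):
        print(f"{m.group(1)} MPI tasks x {m.group(2)} thread(s) per component")
    elif m := re.fullmatch(r"Ln(\d+)", mod):
        print(f"run length: {m.group(1)} timesteps")
```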

> The final FAIL, SMS.f09_g16_a.IGCLM45_MLI.anvil_intel, had something to do with elevation classes, which again I don't think is associated with this PR.

I do not know about this failure. @bishtgautam : Do you expect this test to fail due to the elevation classes code?

@AaronDonahue : It would be beneficial if you could copy the error message here. Thanks!

@AaronDonahue (Contributor)

I could be way off on the elevation classes; I'm basing that on the following warning message:

```
get_glc_elevation_classes: WARNING, for glc_pt, topo = 33
-570.381878807675
Topographic height below the lower bound of the lowest elevation class

get_glc_elevation_classes: WARNING, for glc_pt, topo = 34
-758.880539182521
Topographic height below the lower bound of the lowest elevation class

get_glc_elevation_classes: WARNING, for glc_pt, topo = 35
-708.216219656158
Topographic height below the lower bound of the lowest elevation class

get_glc_elevation_classes: WARNING, for glc_pt, topo = 36
-653.435608094268
Topographic height below the lower bound of the lowest elevation class

get_glc_elevation_classes: WARNING, for glc_pt, topo = 37
-566.451650397649
Topographic height below the lower bound of the lowest elevation class
```

repeated many times in the acme.log file, followed by:

```
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 133393 RUNNING AT b565
= EXIT CODE: 233
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
```

@rljacob (Member) commented Aug 11, 2017

@jayeshkrishna does acme_developer pass on anvil with intel?

@jayeshkrishna (Contributor)

I haven't tried it (anvil+intel+acme_developer) recently.

@rljacob (Member) commented Aug 11, 2017

@brhillman did you run acme_developer somewhere for this PR?

@AaronDonahue (Contributor)

I reran "SMS.f09_g16_a.IGCLM45_MLI" on Blues and it passed. So it could be something to do with running on Anvil with Intel, or with running that particular test on 720 processors.

@rljacob (Member) commented Aug 11, 2017

So do your testing on Blues, or skip it if Bill says he already ran them.

@AaronDonahue (Contributor)

Sounds good. I reran all the failed tests on Blues and have just one more pending, but the others passed, so I think we are good. Sorry to cause a stir with my original fails.

@jayeshkrishna (Contributor)

Looks like we need some machine-specific PE layouts for anvil + ne4_ne4.

@brhillman (Contributor, Author)

@rljacob yes, I ran acme_developer. All tests that passed on master on skybridge passed on this branch before and after making the changes; I also did baseline comparisons before and after, and all were bit-for-bit consistent. There are some tests failing on skybridge on master right now, but those are unrelated to this PR, were failing before I made these changes, and are not the same fails reported here.

@rljacob (Member) commented Aug 13, 2017

Good. In that case, Aaron, you can proceed with integrating this. Anvil PE layouts can be done in a separate PR.

@AaronDonahue merged commit 3c82243 into master Aug 24, 2017
AaronDonahue pushed a commit that referenced this pull request Aug 24, 2017
Add configuration for a new atmosphere and land grid with regional refinement from ne30 to ne120 over the Tropical Western Pacific region. All input files needed to run the new configuration have been uploaded to the input data repository. The only supported compset is FC5AV1C-04P2 (72 levels).

Fixes #1357

[BFB]

Conflicts:
	cime/src/drivers/mct/cime_config/config_component_acme.xml
	components/cam/bld/config_files/horiz_grid.xml
	components/cam/bld/namelist_files/namelist_defaults_cam.xml
	components/clm/bld/namelist_files/namelist_defaults_clm4_5.xml
	components/clm/bld/namelist_files/namelist_definition_clm4_5.xml
@brhillman deleted the brhillman/atm/add-twp-rrm branch August 28, 2017 15:48
jgfouca pushed a commit that referenced this pull request Oct 25, 2017
jgfouca pushed a commit that referenced this pull request Oct 25, 2017
jgfouca added a commit that referenced this pull request Feb 6, 2018
User compset fix

Rearrange code in case.py so that compset alias and compset longname get the same result.

Test suite: scripts_regression_tests.py, test described in #1693
Test status: bit for bit
Fixes #1693
jgfouca pushed a commit that referenced this pull request Feb 27, 2018
jgfouca pushed a commit that referenced this pull request Feb 27, 2018
jgfouca pushed a commit that referenced this pull request Mar 14, 2018
jgfouca pushed a commit that referenced this pull request Mar 14, 2018
rljacob pushed a commit that referenced this pull request Apr 12, 2021