Runs with MPAS-A dycore and CAM7 physics fail - missing variables in inic files #995
Can you confirm whether this occurs with F2000climo, a.k.a. CAM6 physics? Are these runs with `./xmlchange DEBUG=TRUE`? Thanks.
Hi @adamrher, I can confirm that F2000climo works. I was testing the RRTMGP changes in CAM with MPAS-A, and I was able to run with F2000climo. I have not tried with DEBUG=TRUE.
Here's one thread's content in cesm.log from a run with DEBUG=TRUE, on Derecho within "/glade/derecho/scratch/gdicker/F2000dev_mpasa120_intel_dbg_1710436541":
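For anyone digging through the same log, here is a sketch for isolating a single rank's lines from the interleaved cesm.log. It assumes the MPI launcher prefixes each line with its rank number (true for the PE-labeled logs I've seen on Derecho, but not guaranteed everywhere), and that the log sits in the case's run directory:

```bash
# Keep only rank 0's lines from the interleaved log and show the tail,
# where the traceback usually appears.
cd /glade/derecho/scratch/gdicker/F2000dev_mpasa120_intel_dbg_1710436541/run
grep '^0:' cesm.log.* | tail -n 50
```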
Is this just a problem with the IC file? I've run this with my own analytic IC files and cam_dev physics before. I think it just needs those two missing fields (`cell_gradient_coef_x` and `cell_gradient_coef_y`).
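For reference, a quick way to check whether an initial-conditions file carries those two fields, using the standard netCDF command-line tools (the filename below is a placeholder):

```bash
# Print only the file header and search for the gradient-coefficient
# variables; empty output means the inic file is missing them.
ncdump -h your_inic_file.nc | grep -E 'cell_gradient_coef_[xy]'
```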
As a temporary workaround, if testing without the frontogenesis gravity wave drag (?) scheme is acceptable, setting the namelist flag that disables it may let these runs proceed.
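Assuming that flag is CAM's `use_gw_front` switch (an assumption on my part, not stated explicitly in the thread), the workaround in user_nl_cam would look something like:

```bash
# Hypothetical sketch: turn off the frontally generated gravity wave
# source via user_nl_cam. Verify use_gw_front is the intended flag
# before relying on this.
cat >> user_nl_cam <<'EOF'
 use_gw_front = .false.
EOF
```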
Thanks @briandobbins and @mgduda for the tips.
It might be. I think only "atm/cam/inic/mpas/mpasa60_L32_notopo_coords_c230707.nc" has those two fields.
I just tried a couple of these F2000dev MPAS-A runs with the suggested workaround.
This was off in CAM6, so it's not terrible to omit this process in the near term. But this should get fixed for production runs as our midlatitude jets and polar vortex are too strong, and so the additional drag caused by turning the frontal scheme on does move the solution in the right direction. This is less important at higher resolutions where these waves start to become resolved.
@gdicker1 if this issue is just due to missing variables in the inic file when running the frontal scheme, should we close (or rename) this issue?
If the issue isn't fixed, I'm not sure why it should be closed. Unless someone has regenerated the files already? @adamrher I think the issue title was fine but I changed it to "Runs with MPAS-A dycore and CAM7 physics fail - missing variables in inic files." If that still isn't what you imagined, I don't mind if the title changes again.
@gdicker1 understood. You're right, the original name still conveyed this issue. I was just confused since folks have been running cam_dev with MPAS for a while now, but the issue is that our namelist_defaults list a large number of inic files without the variables required to run cam_dev.
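To see how widespread that is, something like the sketch below could scan the MPAS inic directory. It assumes `DIN_LOC_ROOT` is set to your machine's inputdata root, and that the directory layout matches the paths quoted earlier in this thread:

```bash
# List every MPAS inic file that lacks the field cam_dev needs.
for f in "${DIN_LOC_ROOT}"/atm/cam/inic/mpas/*.nc; do
  ncdump -h "$f" | grep -q 'cell_gradient_coef_x' || echo "missing: $f"
done
```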
Hi @gdicker1. I was looking through the issues and we don't have a general issue for bringing in L58/L93 support for MPAS. This issue is related, but doesn't encompass the entire effort, which now includes #1102. I was going to open the issue but wanted to check with you first. Only mpasa120 and mpasa480 are supported in cam_development. So I was thinking the issue could just cover support for those two grids -- hi-res and var-res can be a separate issue that we can address after supporting the coarser grids. Thoughts?
Hi @adamrher, thanks for checking. I think this sounds reasonable, especially to add other resolutions later. Just to add some other thoughts: Other times this has come up there wasn't agreement on what the level heights should be for L58 and L93 (but I think this has been resolved). There have also been concerns about the amount of space the (high-resolution) files could take up on CESM data servers, especially since we could end up with 3 versions of a similar grid (notopo, topo, and real-data).
Short term, let's get all the 120km cases done - space isn't much of a concern there, and since it's the workhorse resolution, and the one likely to be 'tested' the most, the value of having things work out of the box is big.

Longer term, for high-resolution cases, I've got some discussions going on with CISL about moving our input storage (and merging the EarthWorks & CESM datasets) on to new infrastructure that's got more, and scalable, space.

Cheers,
- Brian
What happened?
Runs of the F2000dev compset on MPAS-A grids fail. This seems to be due to the combination of the MPAS-A dycore and CAM7 (a.k.a. cam_dev) physics.
The last output from a case's atm.log:
Last output from cesm.log (reorganized for 1 thread):
What are the steps to reproduce the bug?
The easiest is to create a case with `--compset F2000dev` to get cam_dev physics and `--res mpasa120_mpasa120` to get the MPAS-A dycore. After setting up, building, and submitting the case, the run will fail. E.g. on Derecho:
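A sketch of those steps (the case name and project ID are placeholders; depending on the tag, `--run-unsupported` may also be required):

```bash
# Create, build, and submit the failing configuration from cime/scripts.
./create_newcase --case F2000dev_mpasa120 --compset F2000dev \
  --res mpasa120_mpasa120 --machine derecho --project <PROJECT>
cd F2000dev_mpasa120
./case.setup
./case.build
./case.submit
```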
What CAM tag were you using?
cam6_3_148
What machine were you running CAM on?
CISL machine (e.g. cheyenne)
What compiler were you using?
Intel
Path to a case directory, if applicable
/glade/derecho/scratch/gdicker/F2000dev_mpasa120_intel_1710435350
Will you be addressing this bug yourself?
Any CAM SE can do this
Extra info
No response