Ice shelf wetland fix in mksurfdata_map can lead to glacier+lake > 100% on surface datasets #673
Some findings from a similar failure from @fischer-ncar. Note that @fischer-ncar generated the surface dataset from release-clm5.0.18 and ran with ctsm1.0.30; he found that the error went away if he generated the surface dataset from ctsm1.0.30 (which does not have the Antarctica wetlands fix). This death occurs before the finidat file is read, so it doesn't seem we can blame the finidat file. I suggested running with changes like the following (adjusting the gridcell index appropriately):

```diff
diff --git a/src/main/subgridWeightsMod.F90 b/src/main/subgridWeightsMod.F90
index b2c882a4..0c38afed 100644
--- a/src/main/subgridWeightsMod.F90
+++ b/src/main/subgridWeightsMod.F90
@@ -657,6 +657,10 @@ subroutine check_weights (bounds, active_only)
     do l = bounds%begl,bounds%endl
        g = lun%gridcell(l)
+       if (g == 3838) then
+          write(iulog,*) 'WJS: g=3838: ', grc%latdeg(g), grc%londeg(g), grc%area(g)
+          write(iulog,*) 'WJS: ', l, lun%itype(l), lun%wtgcell(l), lun%active(l)
+       end if
        if ((active_only .and. lun%active(l)) .or. .not. active_only) then
           sumwtgcell(g) = sumwtgcell(g) + lun%wtgcell(l)
        end if
```

@fischer-ncar did this and got the following output:
So this is a grid cell in Antarctica, and the surface dataset says that it has 100% glacier and 45% lake in that grid cell. I have confirmed that the issue exists on the surface dataset – so this is a problem in the creation of the surface dataset, not in the CTSM code:

```python
In [1]: surf = Dataset('/gpfs/fs1/work/fischer/code/release-clm5.0.18_fsurfdata_map/tools/mksurfdata_map/surfdata_ne0CONUSne30x8_hist_16pfts_Irrig_CMIP6_simyr2000_c190328.nc')

In [3]: np.where(np.abs(surf.variables['LONGXY'][:] - 313.420729004390) < 0.0001)
Out[3]: (array([43665]),)

In [4]: np.where(np.abs(surf.variables['LATIXY'][:] + 81.3524976604657) < 0.0001)
Out[4]: (array([43665]),)

In [5]: g = 43665

In [6]: surf.variables['PCT_LAKE'][g]
Out[6]: 45.0

In [7]: surf.variables['PCT_GLACIER'][g]
Out[7]: 100.0
```

From eyeballing the raw data files, I think they have lake in the given area (though only for a small number of 3' grid cells), while having 100% or close to 100% glacier. But I can't understand why:

1. mksurfdata_map failed to renormalize these % coverages to equal 100%
2. mksurfdata_map failed to catch this problem of pct_glacier + pct_lake > 100%
3. CTSM's surfrdMod failed to catch this problem of pct_glacier + pct_lake > 100% (which should have triggered an error prior to the more cryptic error you got) |
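The consistency check described above can be sketched in Python, with plain NumPy arrays standing in for the netCDF variables read in the session. The function name `find_overfull_cells` and the synthetic values are illustrative, not from mksurfdata_map or CTSM; gridcell 2 here mimics the bad Antarctic cell (100% glacier, 45% lake):

```python
import numpy as np

def find_overfull_cells(pct_glacier, pct_lake, tol=1e-10):
    """Return indices of gridcells where glacier + lake coverage exceeds 100%."""
    total = np.asarray(pct_glacier) + np.asarray(pct_lake)
    return np.where(total > 100.0 + tol)[0]

# Synthetic stand-ins for PCT_GLACIER and PCT_LAKE on four gridcells.
pct_glacier = np.array([80.0, 0.0, 100.0, 25.0])
pct_lake    = np.array([10.0, 5.0,  45.0, 75.0])

print(find_overfull_cells(pct_glacier, pct_lake))  # [2]
```

A check like this, run either in mksurfdata_map or in CTSM's surface-data read, would have flagged gridcell 43665 before the cryptic weights error appeared.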
I've generated a new fsurdat file using release-clm5.0.20 for ne120np4. I'm getting a similar error. When I switch to an older fsurdat, I don't get the error.

```
371: check_weights ERROR: at g = 492658 total PFT weight is
```

My new fsurdat file is. |
I'm not sure if this is the source of the problem, but it seems like there could be a problem with this line: if pctgla + pctlak > 100, we'd end up with pctwet negative, which probably wouldn't be handled properly in the following code. |
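To make the failure mode concrete: plugging the bad cell's values from the dataset above into the wetland assignment being discussed (a minimal Python sketch of the arithmetic, not the actual Fortran) gives a negative wetland area:

```python
# Values from the bad Antarctic gridcell on the surface dataset.
pctgla, pctlak = 100.0, 45.0

# The assignment under discussion: pctwet(n) = 100 - pctgla(n) - pctlak(n)
pctwet = 100.0 - pctgla - pctlak
print(pctwet)  # -45.0
```

A negative PCT_WETLAND then slips through the later renormalizations, since the landunit percentages still sum to 100.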
Ah, yes, I'm pretty sure that's exactly the source of the problem, based on looking back at my earlier comments and checking PCT_WETLAND (which I should have checked before):
This would explain why the other normalizations and error checks didn't catch the problem. |
I'm not sure what the correct fix is here. My first inclination was to simply change:

```fortran
pctwet(n) = 100._r8 - pctgla(n) - pctlak(n)
```

to:

```fortran
pctwet(n) = max(100._r8 - pctgla(n) - pctlak(n), 0._r8)
```

However, I think that (along with the original code) wouldn't be quite right in the case where pctwet(n) was greater than 0 to begin with. For example, imagine that, going into this block of code, we had pctgla = 50, pctlak = 50 and pctwet = 50, for a point outside the pftdata mask. Then, upon execution of this line, we'd end up with pctwet = 0, which seems wrong. So maybe the rule should be that we shouldn't ever decrease pctwet as a result of this logic. So we'd have:

```fortran
! Anything that isn't already lake or glacier will be considered wetland.
! But don't allow a decrease in any already-existing wetland area due to
! this adjustment.
pctwet(n) = max(100._r8 - pctgla(n) - pctlak(n), pctwet(n))
```

(It should be okay for pctwet+pctgla+pctlak to be something other than 100 at this point: the renormalizations later should sort that out, I think.) |
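The three candidate rules can be compared side by side on the worked example above (a Python sketch; the function names are mine, and the real code operates on Fortran arrays in mksurfdata_map):

```python
def wetland_original(pctgla, pctlak, pctwet):
    # Original logic: can go negative when pctgla + pctlak > 100.
    return 100.0 - pctgla - pctlak

def wetland_clamped(pctgla, pctlak, pctwet):
    # First inclination: clamp at zero.
    return max(100.0 - pctgla - pctlak, 0.0)

def wetland_no_decrease(pctgla, pctlak, pctwet):
    # Proposed rule: never decrease already-existing wetland.
    return max(100.0 - pctgla - pctlak, pctwet)

# The example from the comment: pctgla = 50, pctlak = 50, pctwet = 50.
print(wetland_original(50, 50, 50))     # 0.0
print(wetland_clamped(50, 50, 50))      # 0.0  -- wipes out existing wetland
print(wetland_no_decrease(50, 50, 50))  # 50   -- preserves it

# The bad Antarctic cell: pctgla = 100, pctlak = 45.
print(wetland_original(100, 45, 0))     # -45.0
print(wetland_clamped(100, 45, 0))      # 0.0
```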
A general thing we should do is to check for PCT of types being negative, and if so handle appropriately. |
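A minimal sketch of such a check, assuming it would scan each PCT field over all gridcells and abort on any negative value (the function name and the error format here are hypothetical, not from the CTSM tools):

```python
def check_no_negative_pct(name, pct):
    """Raise if any gridcell has a negative percent cover for this type."""
    bad = [i for i, v in enumerate(pct) if v < 0.0]
    if bad:
        raise ValueError(f"{name} negative at gridcells {bad}")

check_no_negative_pct("PCT_GLACIER", [0.0, 100.0])      # passes silently
try:
    check_no_negative_pct("PCT_WETLAND", [10.0, -45.0])  # gridcell 1 is negative
except ValueError as err:
    print(err)
```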
Actually, another possible fix would be to say: the real intent here is to make sure that there is some landunit present everywhere. If there is already some glacier or lake, we shouldn't need to do anything with wetland (just like, I believe: if we have anything more than 1e-6 pct vegetation, we don't do anything with wetland). Then we would replace this: with something like:

```fortran
if (pctgla(n) < 1.e-6_r8 .and. pctlak(n) < 1.e-6_r8) then
   pctgla(n) = 0._r8
   pctlak(n) = 0._r8
   pctwet(n) = 100._r8
end if
```

(There is no |
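The behavior of this alternative rule can be sketched as follows (a Python stand-in for the Fortran; `fill_landunit` is a hypothetical name for illustration):

```python
def fill_landunit(pctgla, pctlak, pctwet, eps=1e-6):
    """Only force 100% wetland when there is essentially no glacier or
    lake; otherwise leave all three fields untouched."""
    if pctgla < eps and pctlak < eps:
        return 0.0, 0.0, 100.0
    return pctgla, pctlak, pctwet

print(fill_landunit(0.0, 0.0, 0.0))     # (0.0, 0.0, 100.0)
print(fill_landunit(100.0, 45.0, 0.0))  # unchanged: (100.0, 45.0, 0.0)
```

Under this rule the bad Antarctic cell is left alone, since it already has glacier and lake, so pctwet never goes negative.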
Feeling from today's discussion: on release, use:

```fortran
pctwet(n) = max(100._r8 - pctgla(n) - pctlak(n), 0._r8)
```

and check to make sure it doesn't change any of our existing surface datasets. On master, go with something like:

```fortran
if (pctgla(n) < 1.e-6_r8 .and. pctlak(n) < 1.e-6_r8) then
   pctgla(n) = 0._r8
   pctlak(n) = 0._r8
   pctwet(n) = 100._r8
end if
```

(So wetland should only be 0 or 100 if you're using no_inlandwet.) |
I added a check for negative percent types and found that most of the high-resolution cases have this problem, while mid- or low-resolution cases do not. So ne60np4, 360x720cru, 0.125x0.125, f05, and f02 fail, as well as conus30x8, ne120, and ne240, as we already know. But f09, f19, f10, and f45 are all fine, for example. |
… bad with negative pctwet (issue ESCOMP#673)
Fix a couple of small issues. Correct the end year of ndep for SSPs so runs can go to the end of 2100. Some fixes to mksurfdata_map for high-resolution surface datasets. Have 2-degree WACCM-CMIP6DECK match a user-mod directory without carbon isotopes on. Remove the ne120np4 and conus_30_x8 surface dataset files, as they can't be used (see ESCOMP#673). Remove the 8x16 and 32x64 resolutions, as they are no longer needed and there are problems with them. Add the mapping files needed for 94x192. Check that special landunit percent areas are not less than 0.0, and don't let PCT_WET be less than zero for areas with ocean (see ESCOMP#673). Change some of the longer single-point tests to use Qian forcing (faster, less memory, fewer problems), and add compsets for this. This change was also done on master.
None of these resolutions are on the release branch, so the release branch is OK. |
@ekluzek are you still planning to put in place the one-line fix noted above for the release branch so that it will work to create high-res surface datasets from the release branch in the future? |
Actually, this is already in place; it just looks like the issue wasn't referenced. The commit is 027faaf, in release-clm5.0.24. |
We talked about having a different fix for this on master than on the release branch. The problem is that the suggested fix for master doesn't work. So I end up doing something that's functionally equivalent to what's on the release branch. So I'll bring the release branch version to master and we should decide if we really do want different behavior on master than on the release branch. |
…proposed in ESCOMP#673 for master didn't work
@ekluzek when you say that the suggested fix for master doesn't work: can you please describe in what way it doesn't work? Looking at the code that was in place before you reverted things in 162c35a, it doesn't look like the previous code correctly captured the above plan. In particular, as noted in #673 (comment), there should not have been an |
Also, I wonder if your problems were related to this comment: it doesn't look like that was addressed in the changes in your commit. I thought it best to open a new issue to capture the remaining changes needed, since these were spread across both this issue and #553. If you agree, let's move further discussion to the new issue, #1001. |
I'll move more discussion over to #1001. But, to complete the discussion here: the problem I ran into was that my first try at what I understood was asked for resulted in mksurfdata_map dying. Some refinements I tried ran, but gave identical answers to the release code version. So I gave up and left it identical to the release branch version. You are right that I didn't try moving the truncation of the percent fields. That would likely make a difference, and I suppose it might be the right thing to do here. |
Brief summary of bug
Creating a conus_30_x8 surface dataset and then trying to run with it doesn't work.
General bug information
**CTSM version you are using:** release-clm5.0.20
Does this bug cause significantly incorrect results in the model's science? No
Configurations affected: conus_30_x8
Details of bug
Important details of your setup / configuration so we can reproduce the bug
Set up a case for ne30:

```
SMS_Ld1_D.ne30_g17.I1850Clm50BgcCrop.cheyenne_intel.clm-default
```

That case will fail because it needs a new ne30np4 surface dataset; pointing to a new one, it then works.
Here are some changes to get it set up for the new grid. Add this to user_nl_clm:

```
fsurdat = '/gpfs/fs1/scratch/erik/release-clm5.0.20/tools/mksurfdata_map/surfdata_conus_30_x8_hist_78pfts_CMIP6_simyr1850_c190402.nc'
```

Since it dies early on, there probably aren't problems with the above mapping files causing the problem.
Important output or errors that show the problem
Here's the error from the cesm.log file: