
Reinstated sea ice option in RUC LSM. #57

Closed
wants to merge 13 commits

Conversation

tanyasmirnova
Collaborator

tanyasmirnova commented Oct 16, 2020

An additional CALL in LSMRUC has been added for sea ice. The RUC LSM now solves the energy and moisture budgets for both land and sea ice. The vertical dimension in sea ice is 9 levels, the same as for land.
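For orientation, a minimal toy sketch of the structure this describes; all names here (solve_column, flag_land, flag_ice, t_lnd, t_ice) are hypothetical placeholders and this is not the actual LSMRUC interface, whose argument list is much longer:

  ! Toy sketch only: the same column solver is applied to land points and,
  ! with this change, also to sea-ice points, using 9 levels in both columns.
  program ruc_seaice_sketch
    implicit none
    integer, parameter :: im = 4, nlev = 9            ! horizontal points; levels in soil and in ice
    logical :: flag_land(im), flag_ice(im)
    real    :: t_lnd(nlev,im), t_ice(nlev,im)
    integer :: i
    flag_land = (/ .true., .true., .false., .false. /)
    flag_ice  = .not. flag_land
    t_lnd = 280.0 ; t_ice = 271.2
    do i = 1, im
      if (flag_land(i)) call solve_column(t_lnd(:,i), nlev)   ! existing land call
      if (flag_ice(i))  call solve_column(t_ice(:,i), nlev)   ! reinstated sea-ice call
    end do
  contains
    subroutine solve_column(t, n)
      ! stand-in for the energy/moisture budget solve of one column
      integer, intent(in)    :: n
      real,    intent(inout) :: t(n)
      t(1) = t(1)                                             ! placeholder: no-op
    end subroutine solve_column
  end program ruc_seaice_sketch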

Associated PRs

#57 (contained in #63)
#63
NOAA-GSL/fv3atm#54 (contained in NOAA-GSL/fv3atm#56)
NOAA-GSL/fv3atm#56
NOAA-GSL/ufs-weather-model#47

For regression testing information, see NOAA-GSL/ufs-weather-model#47.

physics/sfc_drv_ruc.F90 (review comment, resolved; outdated)
@DomHeinzeller

@tanyasmirnova I am seeing a lot of regression tests failing with these changes, for example fv3_ccpp_gsd_repro:

  0:  in fcst run phase 2, na=           0
131: forrtl: severe (174): SIGSEGV, segmentation fault occurred
131: Image              PC                Routine            Line        Source
131: fv3.exe            0000000003806A0C  Unknown               Unknown  Unknown
131: libpthread-2.17.s  00002ABCBD33A630  Unknown               Unknown  Unknown
131: fv3.exe            00000000029F47D7  module_mp_thompso        3759  module_mp_thompson.F90
131: fv3.exe            00000000029DF506  module_mp_thompso        1266  module_mp_thompson.F90
131: fv3.exe            0000000002A0C1DF  mp_thompson_mp_mp         585  mp_thompson.F90
131: fv3.exe            00000000028758B2  ccpp_fv3_gsd_v0_p        2239  ccpp_FV3_GSD_v0_physics_cap.F90
131: fv3.exe            00000000024FD744  ccpp_static_api_m         327  ccpp_static_api.F90
131: fv3.exe            0000000001B0B7E4  ccpp_driver_mp_cc         152  CCPP_driver.F90
131: libiomp5.so        00002ABCBB743A43  __kmp_invoke_micr     Unknown  Unknown

I think this has to do with the fact that you are using arrays like sfalb_lnd and sfalb_ice instead of the existing sfalb (and similarly for many other arrays), while the existing physics rely on the existing arrays being filled with reasonable data. For the particular example of sfalb, I see that you are resetting these arrays completely every time RUC LSM is called, so there is no need to add them to GFS_typedefs.* and the RUC LSM metadata - just make them local variables. But they need to be combined properly, either in RUC LSM or in GFS_surface_composites.*, depending on what happens in the other surface physics. I guess we have more work to do ...
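To make the suggestion concrete, a minimal sketch of the kind of recombination meant here (illustrative only, not the actual GFS_surface_composites code; the argument names are placeholders and the surface fractions are assumed to sum to one):

  ! Sketch: merge per-surface albedos back into the composite array that
  ! downstream physics read, weighted by land, sea-ice, and open-water fractions.
  subroutine composite_albedo(im, frac_lnd, frac_ice, frac_wat, &
                              sfalb_lnd, sfalb_ice, sfalb_wat, sfalb)
    implicit none
    integer, intent(in)  :: im
    real,    intent(in)  :: frac_lnd(im), frac_ice(im), frac_wat(im)   ! fractions, sum to 1
    real,    intent(in)  :: sfalb_lnd(im), sfalb_ice(im), sfalb_wat(im)
    real,    intent(out) :: sfalb(im)
    integer :: i
    do i = 1, im
      ! fraction-weighted composite seen by radiation and other schemes
      sfalb(i) = frac_lnd(i)*sfalb_lnd(i) + frac_ice(i)*sfalb_ice(i) &
               + frac_wat(i)*sfalb_wat(i)
    end do
  end subroutine composite_albedo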

@DomHeinzeller


@tanyasmirnova I will be working on this today.

@tanyasmirnova
Collaborator Author

@DomHeinzeller Thank you very much, Dom. Hera is down today, but I will also review the code on Jet. Let's keep in touch.

@tanyasmirnova
Collaborator Author

@DomHeinzeller Dom, I think you are right that the problem could be related to albedo. Because we recompute the land and ice albedos every time step from (alvwf(i) + alnwf(i)) and the snow cover fraction, they could be declared as local arrays. Do you agree?
The reason I am not using sfalb is that it is a composite of land, water and snow albedos computed in setalb in radiation_surface.f:
sfcalb(i,1) = ab1bm*flnd + asenb*fsea + asnnb*fsno
sfcalb(i,2) = alnwf(i)*flnd + asend*fsea + asnnd*fsno
sfcalb(i,3) = ab2bm*flnd + asevb*fsea + asnvb*fsno
sfcalb(i,4) = alvwf(i)*flnd + asevd*fsea + asnvd*fsno
Also, setalb perturbs them using the old method, and with Clara's method alvwf(i) and alnwf(i) will be perturbed in stochastic physics.
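A minimal sketch of what a local, recomputed-every-step land albedo could look like (illustrative only, not the actual RUC LSM driver code; the blending formula, the alb_snow value, and the open-water placeholder are assumptions):

  ! Sketch: rebuild the land albedo each time step from the snow-free albedos
  ! alvwf/alnwf and the snow cover fraction, so sfalb_lnd can stay a local
  ! array instead of being added to GFS_typedefs.
  subroutine land_albedo_local(im, land, alvwf, alnwf, sncovr, sfalb_lnd)
    implicit none
    integer, intent(in)  :: im
    logical, intent(in)  :: land(im)
    real,    intent(in)  :: alvwf(im), alnwf(im)   ! snow-free VIS/NIR white-sky albedos
    real,    intent(in)  :: sncovr(im)             ! snow cover fraction [0-1]
    real,    intent(out) :: sfalb_lnd(im)          ! could be a local array in the driver
    real,    parameter   :: alb_snow = 0.75        ! illustrative snow albedo
    real    :: alb_snowfree
    integer :: i
    do i = 1, im
      if (land(i)) then
        alb_snowfree = 0.5*(alvwf(i) + alnwf(i))
        sfalb_lnd(i) = alb_snowfree + sncovr(i)*(alb_snow - alb_snowfree)
      else
        sfalb_lnd(i) = 0.06                        ! placeholder open-water value
      end if
    end do
  end subroutine land_albedo_local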

@tanyasmirnova
Collaborator Author

@DomHeinzeller I wonder also about the suite definition. Should we have lsm_ruc_sfc_sice in there?

@climbfuji

@tanyasmirnova thanks for the updates, but unfortunately I am already working on this code and having overlapping changes makes it difficult and costs extra time. It's fine if you want to keep working on this, but then I'll do something else and wait until you are finished.

@tanyasmirnova
Collaborator Author

tanyasmirnova commented Nov 4, 2020 via email

Dom, again we have this miscommunication. I presumed you were still not working on it, as I had not heard from you. I now know how to revert the change and will do it. Sorry, Tanya

@DomHeinzeller

Keep it, please. I am merging mine with yours.

@tanyasmirnova
Collaborator Author

tanyasmirnova commented Nov 4, 2020 via email

DomHeinzeller added the bug (Something isn't working) and do not merge (Something is wrong, do not merge) labels on Nov 24, 2020
@DomHeinzeller

This PR is superseded by #63 and will be merged automatically as part of it.

@DomHeinzeller

Replaced by #65

DomHeinzeller added a commit that referenced this pull request Dec 1, 2020
RUC ice for gsl/develop (replaces #57 and #63)
zhanglikate pushed a commit to zhanglikate/ccpp-physics that referenced this pull request Mar 1, 2024
Correction to prog closure (convection) and ugwp stoch phys fix