Output cubed-sphere grid in a single file instead of individual tiles. #827
Comments
Here is a header dump showing how the file needs to be structured to be read by JEDI & Panoply. @aerorahul mentioned Panoply; it's a phenomenal utility that can be used to make plots directly from the cubed-sphere fields. Here's what that interface looks like: (screenshot of the Panoply interface). I've put an example file here: https://drive.google.com/file/d/1JPN_Far-vGNeM8Z1nheH1KTz1WzCtR1E/view?usp=sharing (requires a NOAA account to access) in case you want to look more or play around with Panoply.
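A minimal CDL-style sketch of that structure (the dimension names, sizes, and sample variable below are assumptions modeled on typical UFS history output with an added tile dimension, not the actual header dump):

```
netcdf atmf000 {
dimensions:
        grid_xt = 96 ;
        grid_yt = 96 ;
        tile = 6 ;
        pfull = 127 ;
        phalf = 128 ;
        time = UNLIMITED ; // (1 currently)
variables:
        double grid_xt(grid_xt) ;
        double grid_yt(grid_yt) ;
        double lon(tile, grid_yt, grid_xt) ;  // per-tile geolocation
        double lat(tile, grid_yt, grid_xt) ;
        double time(time) ;
                time:units = "hours since 2021-01-01 00:00:00" ;
        float tmp(time, tile, pfull, grid_yt, grid_xt) ;
                tmp:units = "K" ;
}
```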
@aerorahul @danholdaway Dusan added a capability for the ufs-weather-model to output one netCDF file containing all 6 tiles. Please let us know if you want to try some sample files.
Adding @CoryMartin-NOAA as he will likely try using these.
As the variable and dimension names, etc. are different between UFS and GEOS, this will require code changes in FV3-JEDI before being able to properly test. But thank you, @DusanJovic-NOAA, for adding these; they will be very useful going forward and likely what we will want to use in future GDAS implementations.
@CoryMartin-NOAA If the naming of variables and dimensions is the only difference, we should be able to refactor the existing code to handle that distinction rather than writing new, largely similar code. Something we should work on with @danholdaway.
@aerorahul agreed, I have already talked to @danholdaway briefly about this, and we think we can refactor the GEOS I/O in FV3-JEDI to be generic enough to handle both UFS and GEOS netCDF history files. There can be options for 'history' and 'FMS RESTART' instead of 'GEOS' and 'GFS' as it currently is. I will take the lead on this refactoring after the AMS annual meeting.
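A purely hypothetical sketch of what such a generic option could look like in an FV3-JEDI YAML configuration; none of these keys are the real FV3-JEDI schema, they only illustrate selecting by file layout rather than by model name:

```yaml
io:
  # illustrative keys only: pick the reader by file layout, not by model
  filetype: history            # 'history' or 'fms restart'
  datapath: /path/to/backgrounds
  filename: atmf000.nc
  dimension names:             # configurable instead of hardcoded GEOS names
    x: grid_xt
    y: grid_yt
    z: pfull
    tile: tile
```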
@aerorahul @CoryMartin-NOAA The output variable names are controlled by diag_table; you can change the output name (the third column) to match the name GEOS uses. We will need to check the impact of the dimension names, though.
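For reference, a diag_table field line looks roughly like the following, where the third column is the output name; the GEOS-style rename on the second line is only an illustrative assumption:

```
# module,  field,   output name, file,          time ops ...
"gfs_dyn", "ucomp", "ugrd",      "fv3_history", "all", .false., "none", 2
# e.g. renaming the output (third column) to a GEOS-style name:
"gfs_dyn", "ucomp", "ua",        "fv3_history", "all", .false., "none", 2
```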
Are we talking about the 'cubed_sphere_grid' option for 'output_grid' in model_configure? If that's going to be used by JEDI, we should probably add a 'netcdf_parallel' option and compression. We may also need to add the option for the model to read native-grid increments (instead of just Gaussian-grid ones). EDIT: regarding parallel write, I see from the comments in module_wrt_grid_comp.F90 that this is already supported.
@jswhit2 Re. reading native-grid increments (instead of just Gaussian grid): yes! We need that capability for initializing the model with IAU. FV3-JEDI (and JEDI in general) needs an update to be able to write out increments (instead of the analysis) after the solve.
Yes. For example, the right 'model_configure' options will create a sequence of single atmf/sfcf files, each containing all tiles (see the sketch below).
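As a rough sketch of what such settings can look like: 'output_grid' and 'output_file' are standard write-component options, while the single-file switch shown here is a hypothetical name, not the actual option:

```
output_grid:   'cubed_sphere_grid'
output_file:   'netcdf_parallel'    # parallel write, per module_wrt_grid_comp.F90
ideflate:      1                    # zlib compression level
# hypothetical name for the new single-file-with-tile-dimension switch:
one_file:      .true.
```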
@aerorahul The variable names are not an issue, as they are controlled by the run-time output configuration file diag_table. @DusanJovic-NOAA Do you have any sample files from the model_configure above?
@junwang-noaa presumably offline post/bufr soundings, etc. expect these current variable names, so I think we still would want JEDI to be consistent with the rest of UFS and not change variable names in the diag_table to match GEOS. There are other things, like the time variable/units, that differ between UFS and GEOS and will require FV3-JEDI changes.
See the sample files; these correspond with the 'model_configure' options shown above.
The FV3-JEDI IO code makes no reference to specific field names, so there wouldn't be any benefit to picking something different from what is already used for GFS. We can make the dimension names configurable instead of hardcoded to what GEOS uses.

@CoryMartin-NOAA when you're ready to start refactoring the fv3-jedi code, let's have a quick chat; there are some things about that code that are a little too complex, and we can simplify somewhat.

FYI @jswhit2 the increments would normally be at a lower cubed-sphere resolution, so some thought might be needed about how best to ingest them. You could either interpolate/remap to the native resolution in an extra workflow step and then have just a simple read/IAU, or you would need to create another low-res cubed-sphere grid within the increment read routine of UFS and interpolate/remap there before IAU.
Cory made a good point. We have products used by forecasters and the community. Changing variable/product names is a laborious process.
Looping @pjpegion into this discussion, since he wrote the increment reading/interpolation code for FV3.
Maybe I misunderstood the question. I am suggesting name changes in diag_table only as a quick test to confirm the new file works in FV3-JEDI; it's just for this PR.
@junwang-noaa got it, yeah that won't really work because, as @danholdaway said, the variables themselves aren't so much the problem; it's the dimension names and the time variable/attributes (I believe).
Expecting that UFS and GEOS create history output files with identical metadata is probably unrealistic, hence JEDI will need to be able to handle them differently, at least at some level.
@CoryMartin-NOAA If more work needs to be done on the JEDI side, we will plan to get this PR committed. We can make further changes on the model side if they are required while JEDI is testing those files.
@junwang-noaa that works for me. I'll be sure to let you all know if I run into any issues and if anything needs to be modified. A quick spot check suggests this file format looks the same as what we now use for GSI, just with an extra dimension for tiles (and a different grid, of course).
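For reference, a minimal sketch of such a spot check with the netCDF4 Python package; the filename and the variable/dimension names being checked are assumptions:

```python
from netCDF4 import Dataset

# Open the (assumed) single-file cubed-sphere history output.
with Dataset("atmf000.nc") as ds:
    # List dimensions; compared with the Gaussian-grid files used by GSI,
    # we expect the same structure plus an extra 'tile' dimension.
    for name, dim in ds.dimensions.items():
        print(f"{name}: {len(dim)}")
    # Inspect one variable's dimension ordering,
    # e.g. (time, tile, pfull, grid_yt, grid_xt).
    if "tmp" in ds.variables:
        print(ds.variables["tmp"].dimensions)
```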
Description
Currently, when `output_grid = 'cubed_sphere_grid'` in `model_configure`, the forecast output is produced in individual tile files such as `atmf000.tile1.nc`, `atmf000.tile2.nc`, and so on. The JEDI system would like to read this output for backgrounds instead of the restarts because:

- a `tile` dimension will make storage, handling, and IO in JEDI a lot simpler (see the sketch after this list);
- the history output carries vertical-coordinate metadata (`ak`, `bk`, etc.).
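A minimal sketch of that simplification using xarray (the filenames and the 'tile' dimension name are assumptions consistent with this issue):

```python
import xarray as xr

# Today: six separate tile files that the reader must stitch together.
tiles = [xr.open_dataset(f"atmf000.tile{n}.nc") for n in range(1, 7)]
ds_tiled = xr.concat(tiles, dim="tile")

# Proposed: one file that already carries a 'tile' dimension.
ds_single = xr.open_dataset("atmf000.nc")
```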
Solution
Provide an option to produce cubed-sphere forecast output in a single file, with a `tile` dimension, instead of individual tile files. The output will then be a single `atmf000.nc` and `sfcf000.nc` file containing all tiles.
Alternatives
Anything offline is a possibility, but that is going to add a workflow step in the cycled DA application.
Tagging @danholdaway to provide a sample output of a single file containing all tiles from GEOS with Panoply attributes.