
patch fusion fix to prevent patch heterogeneity collapse #553

Merged: 1 commit, Jul 25, 2019

Conversation

@ckoven (Contributor) commented on Jul 16, 2019

This PR slightly changes the patch fusion code to prevent an edge case that has been occurring fairly frequently, in which all patches would get fused together, losing all patch heterogeneity.

Description:

Basically, this just resets the patch fusion tolerance after every fusion event. Previously, if the code encountered a situation where it needed to fuse two patches that were dissimilar enough that, in the worst biomass bin comparison between the two patches, one patch had some biomass in a given bin and the other had none, the looping structure would then proceed to fuse all of the patches. With this change, that should no longer happen.
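The mechanism described above can be sketched as follows. This is a hypothetical Python sketch, not the actual FATES Fortran implementation; all names, constants, the relative-difference similarity metric, and the equal-area fusion rule are illustrative assumptions.

```python
# Illustrative sketch of patch fusion with a tolerance reset after each
# fusion event. Constants and the similarity metric are assumptions, not
# values from the FATES source.

INITIAL_TOLERANCE = 0.08   # assumed starting profile-similarity tolerance
MAX_TOLERANCE = 0.5        # assumed upper bound before the loop gives up

def similar(p1, p2, tol):
    """Two patches are similar if every biomass bin differs by less than
    tol, relative to the larger of the two bin values."""
    for b1, b2 in zip(p1, p2):
        big = max(b1, b2)
        if big > 0.0 and abs(b1 - b2) / big > tol:
            return False
    return True

def fuse(p1, p2):
    """Merge two biomass profiles (equal patch areas assumed for brevity;
    the real model would area-weight this average)."""
    return [(b1 + b2) / 2.0 for b1, b2 in zip(p1, p2)]

def fuse_patches(patches, target_n):
    """Fuse patches until at most target_n remain or the tolerance cap
    is exceeded."""
    tol = INITIAL_TOLERANCE
    while len(patches) > target_n and tol <= MAX_TOLERANCE:
        fused_any = False
        i = 0
        while i < len(patches):
            j = i + 1
            while j < len(patches):
                if similar(patches[i], patches[j], tol):
                    patches[i] = fuse(patches[i], patches[j])
                    del patches[j]
                    fused_any = True
                    # The fix: reset the tolerance after every fusion
                    # event, so a single fusion forced at a relaxed
                    # tolerance cannot cascade into fusing every
                    # remaining patch at that inflated tolerance.
                    tol = INITIAL_TOLERANCE
                else:
                    j += 1
            i += 1
        if not fused_any:
            tol *= 1.2  # relax the tolerance only when nothing fused
    return patches
```

Resetting the tolerance inside the loop means the relaxed tolerance applies to at most one fusion before the comparison becomes strict again, which is what preserves a strongly dissimilar patch (e.g. one with biomass in a bin where others have none) instead of collapsing everything into a single patch.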

Note that this also changes the outcome of inventory initialization. The earlier code tended to collapse all of the quadrats at BCI into a single patch; this code leaves several patches after the initial round of patch fusion during inventory initialization with the BCI census data.

Fixes #323.

Collaborators:

Some email discussion with @rgknox and @rosiealice.

Expectation of Answer Changes:

This should change answers, but possibly only for long runs.

Checklist:

  • My change requires a change to the documentation.
  • I have updated the in-code documentation .AND. (the technical note .OR. the wiki) accordingly.
  • I have read the CONTRIBUTING document.
  • FATES PASS/FAIL regression tests were run
  • [SORT OF] If answers were expected to change, evaluation was performed and provided. I write "sort of" because I actually did the evaluation on e27deac, not db47637. The current PR (db47637) is the same logic but applied to master, with some added documentation and some cleanup of existing documentation earlier in the code block.

Test Results:

CTSM (or) E3SM (specify which) test hash-tag:

CTSM (or) E3SM (specify which) baseline hash-tag:

FATES baseline hash-tag:

Test Output:

@rosiealice (Contributor) commented:

Looks good to me...

@glemieux self-assigned this on Jul 17, 2019
@rgknox (Contributor) left a comment:

These changes look great.

@glemieux (Contributor) commented:

@ckoven, per our conversation, I ran the regression tests. All tests pass as expected, with the exception of some DIFFs:

    FAIL ERS_D_Ld5.1x1_brazil.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesHydro RUN time=24
    FAIL ERS_Ld60.f45_f45_mg37.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesNoFire MEMCOMP Error: Memory usage increase > 10% from baseline
    FAIL SMS_Lm13.1x1_brazil.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesColdDef BASELINE ckoven-patch_fusion_fix.fates.cheyenne.intel.C08fc9481-F2bd9cb46: DIFF
    FAIL SMS_Lm3_D_Mmpi-serial.1x1_brazil.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesHydro RUN time=21
    FAIL SMS_Lm6.f45_f45_mg37.I2000Clm50FatesCruGs.cheyenne_intel.clm-Fates BASELINE ckoven-patch_fusion_fix.fates.cheyenne.intel.C08fc9481-F2bd9cb46: DIFF

Comparison files can be found here: /glade/u/home/glemieux/scratch/clmed-tests/ckoven-patch_fusion_fix.fates.cheyenne.intel.C08fc9481-Fdb476373

@rgknox (Contributor) commented on Jul 25, 2019

I think those results make sense, @glemieux. The changes to the code should generate answer changes. Likewise, I'm not surprised that it only generates differences in the longer runs, since they exercise the fusion algorithms more and are more likely to have larger differences among the members being fused.

@ckoven (Contributor, Author) commented on Jul 25, 2019

Yeah, I agree. I'm slightly surprised to see changes in a 3-month single-point run, but I guess that just means the problem may have occurred more often than I thought.

Development

Successfully merging this pull request may close these issues.

episodic patch heterogeneity collapses during long runs
4 participants