
always save the latest lognl in the demargin hdf #4900

Open · wants to merge 5 commits into master

Conversation

WuShichao (Contributor):

Standard information about the request

This is a: bug fix

This change affects: inference

This change changes: result presentation / plotting, scientific output

This change will: nothing

Motivation

There is a precision issue in the lognl calculation: during the reconstruction process, the model will get slightly different lognl and loglr values (and hence a slightly different loglikelihood). This becomes a problem when loglr is very small compared to lognl, because the numerical precision is then dominated by lognl. If the original lognl from before the reconstruction process is still used, pycbc_inference_plot_posterior will compute the wrong loglr/snr for plotting.
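A toy illustration of this precision issue (with hypothetical numbers, not taken from an actual PE run): loglr is recovered as loglikelihood minus lognl, so when lognl is many orders of magnitude larger than loglr, even a tiny relative change in the recomputed lognl swamps loglr.

```python
lognl_orig = -1.0e8            # lognl from the original run (hypothetical value)
loglr_true = 0.5               # small log likelihood ratio
loglikelihood = lognl_orig + loglr_true

# During reconstruction, lognl is recomputed and differs at a tiny
# relative level (here 1e-9, chosen for illustration)
lognl_recon = lognl_orig * (1.0 + 1.0e-9)

loglr_bad = loglikelihood - lognl_recon   # mixes the two lognl values
loglr_good = loglikelihood - lognl_orig   # uses a consistent lognl

print(loglr_good)  # ~0.5, as expected
print(loglr_bad)   # ~0.6: a 1e-9 relative shift in lognl gives a 20% error in loglr
```

Saving the recomputed lognl alongside the demarginalized samples keeps the subtraction consistent, which is what this PR does.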

Contents

Save the latest lognl into the demargin hdf file.

  • The author of this pull request confirms they will adhere to the code of conduct

ahnitz (Member) left a comment:

@WuShichao See comments. I think the right place for these changes is probably actually on the model side.

@cdcapano Thoughts?

def callmodel(arg):
    iteration, paramvals = arg
    # calculate the logposterior to get all stats populated
    model.update(**{p: paramvals[p] for p in model.variable_params})
    _ = model.logposterior
    stats = model.get_current_stats()
    if hasattr(model, 'submodels'):
ahnitz (Member):

@WuShichao Wouldn't it make more sense for hierarchical models to have get_current_stats include any submodels, rather than hardcoding knowledge about submodels into this script? It's best to keep information compartmentalized and use clearer interfaces, to avoid technical debt in the future and preserve flexibility.

Similarly, why do we need to hardcode anything about lognl here? Maybe it should just be included in the stats when it is a fixed number like this, no?

WuShichao (Contributor, Author):

The lognl issue is not a hierarchical-model-only problem; whenever users run this script and lognl is much larger than loglr, the same problem occurs. So instead of modifying all likelihood models, I only change it in one place. The reason I hardcode lognl is that I want to match the keys in the original PE file.

ahnitz (Member):

@WuShichao (1) You are explicitly checking through submodels here. That really should be handled by the hierarchical model itself.

(2) lognl should probably just be a model stat if it is a constant. No modifications are then needed here.
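A minimal sketch of the shape of ahnitz's suggestion, using a hypothetical ToyModel class (this is not actual PyCBC code; the class, its attributes, and the likelihood stand-in are all made up for illustration): if lognl is reported through get_current_stats like any other stat, downstream scripts never need to special-case it.

```python
class ToyModel:
    """Hypothetical model that exposes lognl as an ordinary stat."""

    def __init__(self, lognl):
        self._lognl = lognl   # fixed per-run constant
        self._loglr = None

    def update(self, **params):
        # stand-in for a real likelihood evaluation
        self._loglr = sum(params.values())

    def get_current_stats(self):
        # lognl rides along with the other stats, so callers read it
        # from here instead of hardcoding knowledge about the model
        return {'loglr': self._loglr,
                'lognl': self._lognl,
                'loglikelihood': self._loglr + self._lognl}

m = ToyModel(lognl=-1.0e8)
m.update(a=0.25, b=0.25)
stats = m.get_current_stats()
```

With this design, the demargin script only ever iterates over the returned stats dictionary, and a hierarchical model could merge its submodels' stats into that same dictionary.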

Status: In Progress