Marginalapprox fix #6076
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main    #6076      +/-   ##
==========================================
+ Coverage   89.52%   89.53%   +0.01%
==========================================
  Files          72       72
  Lines       12950    12948       -2
==========================================
  Hits        11593    11593
+ Misses       1357     1355       -2
thanks @canyon289!
does this address #5024 as well?
Looks like it does 💯
pymc/gp/gp.py
Outdated
    **kwargs,
)
approx_logp = self._build_marginal_likelihood_logp(y, X, Xu, noise, JITTER_DEFAULT)
pm.Potential(f"marginalapprox_logp_{name}", approx_logp)
This is the `f"{name}_loglikelihood"`, right? Just trying to find something more instructive than "logp".
Good catch, thanks! Yeah, the current name isn't even correct, since this is the log likelihood and not the logp... fixed.
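For context on why the name matters: `pm.Potential` simply adds an arbitrary term to the model's joint log-probability under the given name, so a descriptive name like `f"{name}_loglikelihood"` reads more clearly in the model graph. A minimal, self-contained sketch (the variables below are illustrative, not code from this PR):

```python
import numpy as np
import pymc as pm

# Hypothetical standalone example: pm.Potential attaches an arbitrary
# log-probability term to the model under a chosen name.
y_obs = np.random.normal(0.5, 0.1, size=20)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    # Add the data log-likelihood by hand instead of through an observed RV.
    loglik = pm.logp(pm.Normal.dist(mu, sigma), y_obs).sum()
    pm.Potential("y_loglikelihood", loglik)
```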
pymc/tests/test_gp.py
Outdated
self.gp = gp
self.sigma = 0.1
self.x = np.linspace(-5, 5, 30)
self.y = 0.25 * self.x + self.sigma * np.random.randn(len(self.x))
Suggested change:
- self.y = 0.25 * self.x + self.sigma * np.random.randn(len(self.x))
+ self.y = np.random.normal(0.25 * self.x, self.sigma)
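Both forms draw y from the same Normal(0.25 * x, sigma) distribution; the suggestion just states the mean and scale directly instead of rescaling standard normals. A quick sketch of the equivalence (the rng setup is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 30)
sigma = 0.1

# Equivalent ways of sampling y_i ~ Normal(0.25 * x_i, sigma)
y_manual = 0.25 * x + sigma * rng.standard_normal(len(x))
y_direct = rng.normal(0.25 * x, sigma)
```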
It does address #5024, but I left it off since "ideally" you wouldn't use a potential here. If I could get something like DensityDist working, you would then include a …
This PR is meant to fix #5922 and possibly also #6069, but I'm not 100% sure about the last one.

Ended up having to use `pm.Potential` instead of `pm.DensityDist` to get `pm.MarginalApprox.marginal_likelihood` working properly. Using `pm.DensityDist`, it seems, would involve introspecting the lazily evaluated covariance and mean functions for any random variables and their dims.

I also tried to clean up the tests for `MarginalApprox` a bit, because the reason #5922 took a while to be caught was that `find_MAP` and the other methods weren't tested. #6069 does look to just be due to tolerances -- the results still look correct, so I increased those a bit.

Hopefully this PR can close #6069 and close #5922.
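For anyone landing here from #5922, here is a minimal usage sketch of the code path this PR touches, assuming the PyMC v4 API at the time of the PR (argument and class names may differ in other releases, and the data and priors are illustrative only):

```python
import numpy as np
import pymc as pm

# Toy 1D data, similar in spirit to the test setup discussed above.
X = np.linspace(-5, 5, 30)[:, None]
Xu = np.linspace(-5, 5, 10)[:, None]  # inducing point locations
y = np.random.normal(0.25 * X.flatten(), 0.1)

with pm.Model():
    ell = pm.Gamma("ell", alpha=2, beta=1)
    cov = pm.gp.cov.ExpQuad(1, ls=ell)
    gp = pm.gp.MarginalApprox(cov_func=cov, approx="VFE")
    sigma = pm.HalfNormal("sigma", 1.0)
    # Internally this attaches the approximate log marginal likelihood
    # to the model via pm.Potential, as discussed in this PR.
    gp.marginal_likelihood("y", X=X, Xu=Xu, y=y, noise=sigma)
    mp = pm.find_MAP()  # the code path #5922 reported as broken
```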