Bring back sampler argument target_accept #5622
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #5622      +/-   ##
===========================================
- Coverage   87.67%   69.40%   -18.27%
===========================================
  Files          76       76
  Lines       13717    13724        +7
===========================================
- Hits        12026     9525     -2501
- Misses       1691     4199     +2508
```
Can we add a regression test? Perhaps a short sampling run, then checking the `target_accept` in the sample stats.
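A self-contained stand-in for such a regression test could look like the following; `fake_sample` and the stats layout are illustrative assumptions standing in for `pm.sample`, not PyMC's actual API:

```python
# Hedged sketch of the suggested regression test. `fake_sample` mimics a
# sampler forwarding target_accept into the sampler statistics; the names
# and dict layout are assumptions, not PyMC's real return type.
def fake_sample(draws=10, target_accept=0.8, **kwargs):
    # A real sampler would run NUTS here; we only record the setting.
    sample_stats = {"target_accept": target_accept, "draws": draws}
    return {"sample_stats": sample_stats}

def test_target_accept_is_forwarded():
    idata = fake_sample(draws=5, target_accept=0.9)
    # The regression check: the argument must survive into sample_stats.
    assert idata["sample_stats"]["target_accept"] == 0.9

test_target_accept_is_forwarded()
```

The point is only the shape of the check: pass `target_accept`, sample briefly, and assert that the value shows up in the recorded statistics.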
We have a test with the argument: pymc/pymc/tests/test_quadpotential.py, line 153 in c3ffec8, and lines 2262 to 2263 in 849a64d.
I had opened an issue about that a long time ago: #4291. I would say we raise if it's not a string.
I would recommend bringing back this entire section: 2dd4c8c#diff-7b3470589153c7c90a2c05fc86402f9acf65555200d90f01b51a47ccef592628L484-L506

The reason is mostly that, before attempting an `at.grad`, there's no way to know whether the model is differentiable. The old try/catch logic took care of that.
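A minimal sketch of that try/catch dispatch, with `compute_grad` and the returned sampler names as illustrative stand-ins rather than PyMC internals:

```python
# Hedged sketch of the old dispatch: attempt the gradient, and fall back
# to a gradient-free step method if it is not implemented. compute_grad
# and the sampler names are assumptions, not PyMC's API.
def compute_grad(differentiable: bool):
    if not differentiable:
        # at.grad raises for non-differentiable graphs; we mimic that here.
        raise NotImplementedError("model has no gradient")
    return "gradient graph"

def choose_step(differentiable: bool) -> str:
    try:
        compute_grad(differentiable)
    except NotImplementedError:
        return "Metropolis"  # gradient-free fallback
    return "NUTS"
```

The try/except makes the gradient check itself the test of differentiability, so no separate model introspection is needed.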
a7e66c4 to 1b5a666
We should merge this soon, but is there a test that ensures we are doing the automatic init?
We should release immediately after this gets in. @aloctavodia Did you confirm we don't do any useful NUTS init when we get a default CompoundStep? If so, we should open an issue for that.

Agreed, and yes, we should open an issue.

Do you mind doing that, as you have more context atm?

Yes, sure!
```python
elif isinstance(step, NUTS) and auto_nuts_init:
    if "nuts" in kwargs:
        nuts_kwargs = kwargs.pop("nuts")
        [kwargs.setdefault(k, v) for k, v in nuts_kwargs.items()]
    _log.info("Auto-assigning NUTS sampler...")
    initial_points, step = init_nuts(
        init=init,
        chains=chains,
        n_init=n_init,
        model=model,
        seeds=random_seed,
        progressbar=progressbar,
        jitter_max_retries=jitter_max_retries,
        tune=tune,
        initvals=initvals,
        **kwargs,
    )
```
I'm not confident about this. As I commented before, the try/catch block caught not-implemented gradients, which we're probably not testing. Also, I'm sceptical about the pattern of having `isinstance(step, NUTS)` and `if "nuts" in kwargs`. It reads like a hack to account for some specific cases, and it's not as general as the previous implementation.

You're the ones who've been looking into this much more than me, so feel free to overrule this.
`assign_step_methods` checks for gradients as well. If that does not do a good enough job of testing whether NUTS is appropriate, then it should be treated as a bug and fixed there.
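In that spirit, a competence-style dispatch can fold the gradient check into each step method's own scoring; the enum values and classes below are illustrative assumptions modeled on that pattern, not PyMC's actual classes:

```python
# Hedged sketch of competence-based step assignment: each step method
# scores itself against the model, and the highest score wins. The enum
# values and classes are assumptions, not PyMC internals.
from enum import IntEnum

class Competence(IntEnum):
    INCOMPATIBLE = 0
    COMPATIBLE = 1
    IDEAL = 3

class Metropolis:
    @staticmethod
    def competence(has_gradient: bool) -> Competence:
        return Competence.COMPATIBLE  # works with or without gradients

class NUTS:
    @staticmethod
    def competence(has_gradient: bool) -> Competence:
        # NUTS is ideal when gradients exist, unusable otherwise.
        return Competence.IDEAL if has_gradient else Competence.INCOMPATIBLE

def assign_step(has_gradient: bool):
    return max((NUTS, Metropolis), key=lambda s: s.competence(has_gradient))
```

Centralizing the check this way means callers never need an ad-hoc `isinstance(step, NUTS)` test: an undifferentiable model simply scores NUTS as incompatible.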
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Okay, you're right. Also we're testing it: https://app.codecov.io/gh/pymc-devs/pymc/blob/18346acc797c32ccfa2d61ebc8f368ef03bc216c/pymc/sampling.py
Fixes #5620

Edit: This also fixes #4291 by enforcing `init` to be a string.
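A minimal sketch of that string enforcement; `check_init` is a simplified stand-in for the real validation, and the accepted initializer names are assumptions:

```python
# Hedged stand-in for the init validation fixing #4291: reject non-string
# init values up front instead of failing obscurely later. The accepted
# set of names is illustrative, not PyMC's full list.
VALID_INITS = {"auto", "adapt_diag", "jitter+adapt_diag", "advi"}

def check_init(init="auto"):
    if not isinstance(init, str):
        raise TypeError(f"init must be a string, got {type(init).__name__}")
    init = init.lower()
    if init not in VALID_INITS:
        raise ValueError(f"Unknown initializer: {init!r}")
    return init
```

Raising a `TypeError` immediately turns the silent misuse from #4291 (e.g. passing a float where an initializer name belongs) into a clear error at call time.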