Use unit normal as default init_dist in GaussianRandomWalk and AR #5779

Merged (2 commits) on May 19, 2022
15 changes: 9 additions & 6 deletions `RELEASE-NOTES.md`

```diff
@@ -25,11 +25,14 @@ Also check out the [milestones](https://github.com/pymc-devs/pymc/milestones) fo

 All of the above apply to:

-Signature and default parameters changed for several distributions (see [#5628](https://github.com/pymc-devs/pymc/pull/5628)):
-- `pm.StudentT` now requires either `sigma` or `lam` as kwarg
-- `pm.StudentT` now requires `nu` to be specified (no longer defaults to 1)
-- `pm.AsymmetricLaplace` positional arguments re-ordered
-- `pm.AsymmetricLaplace` now requires `mu` to be specified (no longer defaults to 0)
+Signature and default parameters changed for several distributions:
+- `pm.StudentT` now requires either `sigma` or `lam` as kwarg (see [#5628](https://github.com/pymc-devs/pymc/pull/5628))
+- `pm.StudentT` now requires `nu` to be specified (no longer defaults to 1) (see [#5628](https://github.com/pymc-devs/pymc/pull/5628))
+- `pm.AsymmetricLaplace` positional arguments re-ordered (see [#5628](https://github.com/pymc-devs/pymc/pull/5628))
+- `pm.AsymmetricLaplace` now requires `mu` to be specified (no longer defaults to 0) (see [#5628](https://github.com/pymc-devs/pymc/pull/5628))
+- `ZeroInflatedPoisson` `theta` parameter was renamed to `mu` (see [#5584](https://github.com/pymc-devs/pymc/pull/5584)).
+- `pm.GaussianRandomWalk` initial distribution defaults to unit normal instead of flat (see [#5779](https://github.com/pymc-devs/pymc/pull/5779))
+- `pm.AR` initial distribution defaults to unit normal instead of flat (see [#5779](https://github.com/pymc-devs/pymc/pull/5779))
 - BART was removed [#5566](https://github.com/pymc-devs/pymc/pull/5566). It is now available from [pymc-experimental](https://github.com/pymc-devs/pymc-experimental)
 - The `pm.EllipticalSlice` sampler was removed (see [#5756](https://github.com/pymc-devs/pymc/issues/5756)).
 - `BaseStochasticGradient` was removed (see [#5630](https://github.com/pymc-devs/pymc/pull/5630))
@@ -80,7 +83,7 @@ Signature and default parameters changed for several distributions (see [#5628](
 - The function `replace_with_values` function has been added to `gp.utils`.
 - `MarginalSparse` has been renamed `MarginalApprox`.
 - Removed `MixtureSameFamily`. `Mixture` is now capable of handling batched multivariate components (see [#5438](https://github.com/pymc-devs/pymc/pull/5438)).
-- `ZeroInflatedPoisson` `theta` parameter was renamed to `mu` (see [#5584](https://github.com/pymc-devs/pymc/pull/5584)).
+- Removed `AR1`, `AR` of order 1 should be used instead. (see [5734](https://github.com/pymc-devs/pymc/pull/5734)).
 - ...

 ### Expected breaks
```
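The practical effect of the new default is that the first point of the series now contributes a standard-normal density term to the log-probability instead of a constant. A minimal pure-Python sketch of that difference (no PyMC; `init_logp` is an illustrative helper, not library API):

```python
import math

def init_logp(x0, init="unit_normal"):
    """Log-density contribution of a series' first point.

    "flat" mimics the old improper flat default (constant, taken as 0);
    "unit_normal" mimics the new Normal(0, 1) default.
    """
    if init == "flat":
        return 0.0
    # log N(x0 | 0, 1) = -0.5 * x0**2 - 0.5 * log(2*pi)
    return -0.5 * (x0 * x0 + math.log(2 * math.pi))
```

Under the new default, initial values far from zero are penalized, so models whose series start far from 0 should pass an explicit `init` / `init_dist` rather than rely on the default.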
9 changes: 4 additions & 5 deletions `pymc/distributions/timeseries.py`

```diff
@@ -225,7 +225,7 @@ class GaussianRandomWalk(distribution.Continuous):
     sigma > 0, innovation standard deviation, defaults to 1.0
     init : unnamed distribution
         Univariate distribution of the initial value, created with the `.dist()` API.
-        Defaults to Normal with same `mu` and `sigma` as the GaussianRandomWalk
+        Defaults to a unit Normal.

         .. warning:: init will be cloned, rendering them independent of the ones passed as input.
@@ -265,7 +265,7 @@ def dist(

        # If no scalar distribution is passed then initialize with a Normal of same mu and sigma
        if init is None:
-            init = Normal.dist(mu, sigma)
+            init = Normal.dist(0, 1)
        else:
            if not (
                isinstance(init, at.TensorVariable)
@@ -361,7 +361,7 @@ class AR(SymbolicDistribution):
        Whether the first element of rho should be used as a constant term in the AR
        process. Defaults to False
    init_dist: unnamed distribution, optional
-        Scalar or vector distribution for initial values. Defaults to Normal(0, sigma).
+        Scalar or vector distribution for initial values. Defaults to a unit Normal.
        Distribution should be created via the `.dist()` API, and have dimension
        (*size, ar_order). If not, it will be automatically resized.
@@ -452,8 +452,7 @@ def dist(
                f"got ndim_supp={init_dist.owner.op.ndim_supp}.",
            )
        else:
-            # Sigma must broadcast with ar_order
-            init_dist = Normal.dist(sigma=at.shape_padright(sigma), size=(*sigma.shape, ar_order))
+            init_dist = Normal.dist(0, 1, size=(*sigma.shape, ar_order))

        # Tell Aeppl to ignore init_dist, as it will be accounted for in the logp term
        init_dist = ignore_logprob(init_dist)
```
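The sizing in the new `else` branch can be sketched without Aesara: the default init distribution holds one initial value per AR lag, batched over `sigma`'s shape. A hedged illustration (`default_init_shape` is a hypothetical helper mirroring `size=(*sigma.shape, ar_order)`, not PyMC API):

```python
def default_init_shape(sigma_shape, ar_order):
    # Mirrors Normal.dist(0, 1, size=(*sigma.shape, ar_order)): the batch
    # dimensions of sigma, plus a trailing dimension with one initial
    # value per AR lag.
    return (*sigma_shape, ar_order)

# Scalar sigma with AR(2): a vector of 2 initial values.
assert default_init_shape((), 2) == (2,)
# Sigma batched over a (4, 3) grid with AR(2): 2 initial values per batch member.
assert default_init_shape((4, 3), 2) == (4, 3, 2)
```

Because the trailing dimension matches `ar_order` and the leading ones match `sigma`, the unit-normal init broadcasts cleanly against a batched `sigma` without the `shape_padright` trick the old code needed.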
2 changes: 1 addition & 1 deletion `pymc/tests/test_distributions.py`

```diff
@@ -2610,7 +2610,7 @@ def test_gaussianrandomwalk(self):
        def ref_logp(value, mu, sigma, steps):
            # Relying on fact that init will be normal by default
            return (
-                scipy.stats.norm.logpdf(value[0], mu, sigma)
+                scipy.stats.norm.logpdf(value[0])
                + scipy.stats.norm.logpdf(np.diff(value), mu, sigma).sum()
            )
```
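The reference log-probability used in this test can be reproduced without SciPy. A minimal sketch, assuming the new unit-normal init (`norm_logpdf` and `grw_logp` are illustrative helpers, not PyMC API):

```python
import math

def norm_logpdf(x, mu=0.0, sigma=1.0):
    # Log-density of Normal(mu, sigma) evaluated at x.
    z = (x - mu) / sigma
    return -0.5 * z * z - math.log(sigma) - 0.5 * math.log(2 * math.pi)

def grw_logp(value, mu, sigma):
    # The first point follows the default unit Normal; each subsequent
    # increment follows Normal(mu, sigma), matching ref_logp above.
    increments = [b - a for a, b in zip(value, value[1:])]
    return norm_logpdf(value[0]) + sum(norm_logpdf(d, mu, sigma) for d in increments)
```

For example, `grw_logp([0.0, 1.0], 1.0, 1.0)` adds a standard-normal term at 0 and one innovation term at its mean, each contributing `-0.5 * log(2 * pi)`.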
4 changes: 3 additions & 1 deletion `pymc/tests/test_distributions_timeseries.py`

```diff
@@ -333,6 +333,7 @@ def test_batched_sigma(self):
                "y",
                beta_tp,
                sigma=sigma,
+                init_dist=Normal.dist(0, sigma[..., None]),
                size=batch_size,
                steps=steps,
                initval=y_tp,
@@ -346,6 +347,7 @@ def test_batched_sigma(self):
                    f"y_{i}{j}",
                    beta_tp,
                    sigma=sigma[i][j],
+                    init_dist=Normal.dist(0, sigma[i][j]),
                    shape=steps,
                    initval=y_tp[i, j],
                    ar_order=ar_order,
@@ -371,7 +373,7 @@ def test_batched_init_dist(self):
        beta_tp = aesara.shared(np.random.randn(ar_order), shape=(3,))
        y_tp = np.random.randn(batch_size, steps)
        with Model() as t0:
-            init_dist = Normal.dist(0.0, 0.01, size=(batch_size, ar_order))
+            init_dist = Normal.dist(0.0, 1.0, size=(batch_size, ar_order))
            AR("y", beta_tp, sigma=0.01, init_dist=init_dist, steps=steps, initval=y_tp)
        with Model() as t1:
            for i in range(batch_size):
```