Releases: pymc-devs/pymc
PyMC 4.0.0
If you want a description of the highlights of this release, check out the release announcement on our new website.
Feel free to read it, print it out, and give it to people on the street -- because everybody has to know PyMC 4.0 is officially out 🍾
Do not miss 🚨
- ⚠️ The project was renamed to "PyMC". The library is now installed with `pip install pymc` and imported like `import pymc as pm`. See this migration guide for more details.
- ⚠️ Theano-PyMC has been replaced with Aesara, so all external references to `theano` and `tt` need to be replaced with `aesara` and `at`, respectively (see #4471).
- ⚠️ Support for JAX and JAX samplers, which also allows sampling on GPUs. This benchmark shows speed-ups of up to 11x.
- ⚠️ The GLM submodule was removed; please use Bambi instead.
- ⚠️ PyMC now requires SciPy version `>= 1.4.1` (see #4857).
v3 features not yet working in v4 ⏳
- MvNormalRandomWalk, MvStudentTRandomWalk, GARCH11 and EulerMaruyama distributions (see #4642)
- Nested Mixture distributions (see #5533)
- `pm.sample_posterior_predictive_w` (see #4807)
- Partially observed Multivariate distributions (see #5260)
New features 🥳
- Distributions:
  - Univariate censored distributions are now available via `pm.Censored` (#5169).
  - The `CAR` distribution has been added to allow for use of conditional autoregressions, which are often used in spatial and network models.
  - Added a `logcdf` implementation for the Kumaraswamy distribution (see #4706).
  - The `OrderedMultinomial` distribution has been added for use on ordinal data which are aggregated by trial, like multinomial observations, whereas `OrderedLogistic` only accepts ordinal data in a disaggregated format, like categorical observations (see #4773).
  - The `Polya-Gamma` distribution has been added (see #4531). To make use of this distribution, the `polyagamma>=1.3.1` library must be installed and available in the user's environment.
  - `pm.DensityDist` can now accept an optional `logcdf` keyword argument to pass in a function to compute the cumulative density function of the distribution (see #5026).
  - `pm.DensityDist` can now accept an optional `moment` keyword argument to pass in a function to compute the moment of the distribution (see #5026).
  - Added an alternative parametrization, `logit_p`, to the `pm.Binomial` and `pm.Categorical` distributions (see #5637).
- Model dimensions:
  - The dimensionality of model variables can now be parametrized through either of `shape` or `dims` (see #4696):
    - With `shape` the length of dimensions must be given numerically or as scalar Aesara `Variables`. Numeric entries in `shape` restrict the model variable to the exact length, and re-sizing is no longer possible.
    - `dims` keeps model variables re-sizeable (for example through `pm.Data`) and leads to well-defined coordinates in `InferenceData` objects.
    - An `Ellipsis` (`...`) in the last position of `shape` or `dims` can be used as short-hand notation for implied dimensions.
  - New features for `pm.Data` containers:
    - With `pm.Data(..., mutable=False)`, or by using `pm.ConstantData()`, one can now create `TensorConstant` data variables. These can be more performant and compatible in situations where a variable doesn't need to be changed via `pm.set_data()`. See #5295. If you do need to change the variable, use `pm.Data(..., mutable=True)` or `pm.MutableData()`.
    - New named dimensions can be introduced to the model via `pm.Data(..., dims=...)`. For mutable data variables (see above) the lengths of these dimensions are symbolic, so they can be re-sized via `pm.set_data()`.
    - `pm.Data` now passes additional kwargs to `aesara.shared` / `at.as_tensor` (see #5098).
  - The length of `dims` in the model is now tracked symbolically through `Model.dim_lengths` (see #4625).
- Sampling:
  - ⚠️ Random seeding behavior changed (see #5787)!
    - Sampling results will differ from those of v3 when passing the same `random_seed` as before. They will be consistent across subsequent v4 releases unless mentioned otherwise.
    - Sampling functions no longer respect user-specified global seeding! Always pass `random_seed` to ensure reproducible behavior.
    - `random_seed` now accepts RandomState and Generators besides integers.
  - A small change to the mass matrix tuning methods jitter+adapt_diag (the default) and adapt_diag improves performance early on during tuning for some models (see #5004).
  - New experimental mass matrix tuning method jitter+adapt_diag_grad (see #5004).
  - Support for samplers written in JAX:
    - Adding support for numpyro's NUTS sampler via `pymc.sampling_jax.sample_numpyro_nuts()`.
    - Adding support for blackjax's NUTS sampler via `pymc.sampling_jax.sample_blackjax_nuts()` (see #5477).
    - `pymc.sampling_jax` samplers support `log_likelihood`, `observed_data`, and `sample_stats` in the returned `InferenceData` object (see #5189).
    - Adding support for `pm.Deterministic` in `pymc.sampling_jax` (see #5182).
- Miscellaneous:
  - The new `pm.find_constrained_prior` function can be used to find optimized prior parameters of a distribution under some constraints (e.g. lower and upper bound). See #5231.
  - Nested models now inherit the parent model's coordinates (see #5344).
  - `softmax` and `log_softmax` functions added to the `math` module (see #5279).
  - Added the low-level `compile_forward_sampling_function` method to compile the aesara function responsible for generating forward samples (see #5759).
Expected breaking changes 💔
- `pm.sample(return_inferencedata=True)` is now the default (see #4744).
- ArviZ `plots` and `stats` wrappers were removed. The functions are now just available by their original names (see #4549 and `3.11.2` release notes).
- The `pm.sample_posterior_predictive(vars=...)` kwarg was removed in favor of `var_names` (see #4343).
- The `ElemwiseCategorical` step method was removed (see #4701).
- `LKJCholeskyCov`'s `compute_corr` keyword argument is now set to `True` by default (see #5382).
- The alternative `sd` keyword argument has been removed from all distributions; `sigma` should be used instead (see #5583).
Read on if you're a developer. Or curious. Or both.
Unexpected breaking changes (action needed) 😲
Very important ⚠️
- The `pm.Bound` interface no longer accepts a callable class as argument; instead it requires an instantiated distribution (created via the `.dist()` API) to be passed as an argument. In addition, Bound no longer returns a class instance but works as a normal PyMC distribution. Finally, it is no longer possible to do predictive random sampling from Bounded variables. Please consult the new documentation for details on how to use Bounded variables (see #4815).
- BART has received various updates (#5091, #5177, #5229, #4914) but was removed from the main package in #5566. It is now available from pymc-experimental.
- Removed `AR1`. `AR` of order 1 should be used instead (see #5734).
- The `pm.EllipticalSlice` sampler was removed (see #5756).
- `BaseStochasticGradient` was removed (see #5630).
- `pm.Distribution(...).logp(x)` is now `pm.logp(pm.Distribution(...), x)`.
- `pm.Distribution(...).logcdf(x)` is now `pm.logcdf(pm.Distribution(...), x)`.
- `pm.Distribution(...).random(size=x)` is now `pm.draw(pm.Distribution(...), draws=x)`.
- `pm.draw_values(...)` and `pm.genera...
4.0.0 beta 6
What's Changed
- Implemented default transform for Mixtures by @ricardoV94 in #5636
- Scope separator for netcdf by @ferrine in #5663
- Fix default update bug by @ricardoV94 in #5667
- Pandas dependency was removed by @thomasjpfan in #5633
- Recognize cast data in InferenceData by @zaxtax in #5646
- Updated docstrings of multiple distributions by @purna135 in #5595, #5596 and #5600
- Refine Interval docstrings and fix typo by @ricardoV94 in #5640
- Add test for interactions between missing, default and explicit updates in `compile_pymc` by @ricardoV94 in #5645
- Test reshape from observed by @ricardoV94 in #5670
- Upgraded all CI cache actions to v3 by @michaelosthege in #5647
Full Changelog: v4.0.0b5...v4.0.0b6
4.0.0 beta 5
What's Changed
- Generalize multinomial moment to arbitrary dimensions by @markvrma in #5476
- Remove sd optional kwarg from distributions by @purna135 in #5583
- Improve scoped models by @ferrine in #5607
- Add helper wrapper around Interval transform by @ricardoV94 in #5347
- Rename `logp_transform` to `_get_default_transform` by @ricardoV94 in #5612
- Do not set RNG updates inplace in compile_pymc by @ricardoV94 in #5615
- Refine trigger filter for both PRs and pushes by @michaelosthege in #5619
- Update contributing guide with etiquette section by @michaelosthege in #5611
- Combine test workflows into one by @michaelosthege in #5623
- Raise ValueError if random variables are present in the logp graph by @ricardoV94 in #5614
- Run float32 jobs separately by @michaelosthege in #5630
- Bring back sampler argument target_accept by @aloctavodia in #5622
- Parametrize Binomial and Categorical distributions via logit_p by @purna135 in #5637
- Remove SGMCMC and fix flaky mypy results by @michaelosthege in #5631
Full Changelog: v4.0.0b4...v4.0.0b5
v4.0.0 beta 4
This release adds the following major improvements:
- Refactor Mixture distribution for V4 by @ricardoV94 in #5438
- Adding NUTS sampler from blackjax to sampling_jax by @zaxtax in #5477
- Update aesara and aeppl dependencies to fix a memory leak in pymc models by @ricardoV94 in #5582
New Contributors
- @mirko-m made their first contribution in #5414
- @chritter made their first contribution in #5491
- @5hv5hvnk made their first contribution in #5601
Full Changelog: v4.0.0b3...v4.0.0b4
PyMC 3.11.5
PyMC 3.11.5 (15 March 2022)
This is a backport & bugfix release that eases the transition to `pymc >= 4.0.0`.
Backports
- The `pm.logp(rv, x)` syntax is now available and recommended to make your model code `v4`-ready. Note that this backport is just an alias and much less capable than what's available with `pymc >= 4` (see #5083).
- The `pm.Distribution(testval=...)` kwarg was deprecated and will be replaced by `pm.Distribution(initval=...)` in `pymc >= 4` (see #5226).
- The `pm.sample(start=...)` kwarg was deprecated and will be replaced by `pm.sample(initvals=...)` in `pymc >= 4` (see #5226).
- `pm.LogNormal` is now available as an alias for `pm.Lognormal` (see #5389).
Bugfixes
PyMC 4.0.0 beta 3
Here is the full list of changes compared to `4.0.0b2`.
For a current list of changes w.r.t. the upcoming `v3.11.5` see `RELEASE-NOTES.md`.
Notable changes & features
- ADVI has been ported to PyMC 4
- LKJ has been ported to PyMC 4 (#5382)
- Dependencies have been updated
v4.0.0b2
PyMC 4.0.0 beta 2
This beta release includes the removal of warnings, polishing of APIs, more distributions and internal refactorings.
Here is the full list of changes compared to `4.0.0b1`.
For a current list of changes w.r.t. the upcoming `v3.11.5` see `RELEASE-NOTES.md`.
Notable changes & features
- Introduction of `pm.Data(..., mutable=False/True)` and corresponding `pm.ConstantData` / `pm.MutableData` wrappers (see #5295).
- The warning about `theano` or `pymc3` being installed in parallel was removed.
- `dims` can again be specified alongside `shape` or `size` (see #5325).
- `pm.draw` was added to draw prior samples from a variable (see #5340).
- Renames of model properties & methods like `Model.logpt`.
- A function to find a prior based on lower/upper bounds (see #5231).
v4.0.0b1
PyMC 4.0.0 beta 1
⚠ This is the first beta of the next major release for PyMC 4.0.0 (formerly PyMC3). 4.0.0 is a rewrite of large parts of the PyMC code base, which makes it faster, adds many new features, and introduces some breaking changes. For the most part, the API remains stable and we expect that most models will work without any changes.
Not-yet working features
We plan to get these working again, but at this point, their inner workings have not been refactored.
- Timeseries distributions (see #4642)
- Mixture distributions (see #4781)
- Cholesky distributions (see WIP PR #4784)
- Variational inference submodule (see WIP PR #4582)
- Elliptical slice sampling (see #5137)
- `BaseStochasticGradient` (see #5138)
- `pm.sample_posterior_predictive_w` (see #4807)
- Partially observed Multivariate distributions (see #5260)
Also, check out the milestones for a potentially more complete list.
Unexpected breaking changes (action needed)
- New API is not available in `v3.11.5`.
- Old API does not work in `v4.0.0`.
All of the above applies to:
- ⚠ The library is now named, installed, and imported as "pymc". For example: `pip install pymc`. (Use `pip install pymc --pre` while we are in the pre-release phase.)
- ⚠ Theano-PyMC has been replaced with Aesara, so all external references to `theano`, `tt`, and `pymc3.theanof` need to be replaced with `aesara`, `at`, and `pymc.aesaraf` (see #4471).
- `pm.Distribution(...).logp(x)` is now `pm.logp(pm.Distribution(...), x)`.
- `pm.Distribution(...).logcdf(x)` is now `pm.logcdf(pm.Distribution(...), x)`.
- `pm.Distribution(...).random()` is now `pm.Distribution(...).eval()`.
- `pm.draw_values(...)` and `pm.generate_samples(...)` were removed. The tensors can now be evaluated with `.eval()`.
- `pm.fast_sample_posterior_predictive` was removed.
- `pm.sample_prior_predictive`, `pm.sample_posterior_predictive` and `pm.sample_posterior_predictive_w` now return an `InferenceData` object by default, instead of a dictionary (see #5073).
- `pm.sample_prior_predictive` no longer returns transformed variable values by default. Pass them by name in `var_names` if you want to obtain these draws (see #4769).
- `pm.sample(trace=...)` no longer accepts `MultiTrace` or `len(.) > 0` traces (see #5019).
- The GLM submodule was removed; please use Bambi instead.
- The `pm.Bound` interface no longer accepts a callable class as an argument; instead, it requires an instantiated distribution (created via the `.dist()` API) to be passed as an argument. In addition, Bound no longer returns a class instance but works as a normal PyMC distribution. Finally, it is no longer possible to do predictive random sampling from Bounded variables. Please consult the new documentation for details on how to use Bounded variables (see #4815).
- The `pm.logpt(transformed=...)` kwarg was removed (816b5f).
- The `Model(model=...)` kwarg was removed.
- The `Model(theano_config=...)` kwarg was removed.
- The `Model.size` property was removed (use `Model.ndim` instead).
- `dims` and `coords` handling:
- `Model.update_start_values(...)` was removed. Initial values can be set in the `Model.initial_values` dictionary directly.
- Test values can no longer be set through `pm.Distribution(testval=...)` and must be assigned manually.
- `Transform.forward` and `Transform.backward` signatures changed.
- `pm.DensityDist` no longer accepts the `logp` as its first positional argument. It is now an optional keyword argument. If you pass a callable as the first positional argument, a `TypeError` will be raised (see #5026).
- `pm.DensityDist` now accepts distribution parameters as positional arguments. Passing them as a dictionary in the `observed` keyword argument is no longer supported and will raise an error (see #5026).
- The signature of the `logp` and `random` functions that can be passed into a `pm.DensityDist` has been changed (see #5026).
- Changes to the Gaussian process (`gp`) submodule:
  - The `gp.prior(..., shape=...)` kwarg was renamed to `size`.
  - Multiple methods including `gp.prior` now require explicit kwargs.
- Changes to the BART implementation:
- Changes to the Gaussian Process (GP) submodule (see #5055):
  - For all implementations, `gp.Latent`, `gp.Marginal` etc., `cov_func` and `mean_func` are required kwargs.
  - In the Windows test conda environment the `mkl` version is fixed to version 2020.4, and `mkl-service` is fixed to `2.3.0`. This was required for `gp.MarginalKron` to function properly.
  - `gp.MvStudentT` uses rotated samples from `StudentT` directly now, instead of sampling from `pm.Chi2` and then from `pm.Normal`.
  - The "jitter" parameter, or the diagonal noise term added to Gram matrices such that the Cholesky decomposition is numerically stable, is now exposed to the user instead of hard-coded. See the function `gp.util.stabilize`.
  - The `is_observed` argument for `gp.Marginal*` implementations has been deprecated.
  - In the `gp.utils` file, the `kmeans_inducing_points` function now passes through `kmeans_kwargs` to scipy's k-means function.
  - The function `replace_with_values` has been added to `gp.utils`.
  - `MarginalSparse` has been renamed `MarginalApprox`.
Expected breaks
- New API was already available in `v3`.
- Old API had deprecation warnings since at least `3.11.0` (2021-01).
- Old API stops working in `v4` (preferably with informative errors).
All of the above apply to:
- `pm.sample(return_inferencedata=True)` is now the default (see #4744).
- ArviZ `plots` and `stats` wrappers were removed. The functions are now just available by their original names (see #4549 and `3.11.2` release notes).
- The `pm.sample_posterior_predictive(vars=...)` kwarg was removed in favor of `var_names` (see #4343).
- The `ElemwiseCategorical` step method was removed (see #4701).
Ongoing deprecations
- Old API still works in `v4` and has a deprecation warning.
- Preferably the new API should already be available in `v3`.
New features
- The length of `dims` in the model is now tracked symbolically through `Model.dim_lengths` (see #4625).
- The `CAR` distribution has been added to allow for use of conditional autoregressions, which are often used in spatial and network models.
- The dimensionality of model variables can now be parametrized through either of `shape`, `dims` or `size` (see #4696):
  - With `shape` the length of dimensions must be given numerically or as scalar Aesara `Variables`. Numeric entries in `shape` restrict the model variable to the exact length, and re-sizing is no longer possible.
  - `dims` keeps model variables re-sizeable (for example through `pm.Data`) and leads to well-defined coordinates in `InferenceData` objects.
  - The `size` kwarg behaves as it does in Aesara/NumPy. For univariate RVs it is the same as `shape`, but for multivariate RVs it depends on how the RV implements broadcasting to dimensionality greater than `RVOp.ndim_supp`.
  - An `Ellipsis` (`...`) in the last position of `shape` or `dims` can be used as shorthand notation for implied dimensions.
- Added a `logcdf` implementation for the Kumaraswamy distribution (see #4706).
- The `OrderedMultinomial` distribution has been added for use on ordinal data which are aggregated by trial, like multinomial observations, whereas `OrderedLogistic` only accepts ordinal data in a disaggregated format, like categorical observations (see #4773).
- The `Polya-Gamma` distribution has been added (see #4531). To make use of this distribution, the `polyagamma>=1.3.1` library must be installed and available in the user's environment.
- A small change to the mass matrix tuning methods jitter+adapt_diag (the default) and adapt_diag improves performance early on during tuning for some models (see #5004).
- New experimental mass matrix tuning method jitter+adapt_diag_grad. [#5004](https://github.com/pymc-devs/pymc/pu...
PyMC3 3.11.4 (20 August 2021)
Update __init__.py Update RELEASE-NOTES.md Mark 3.11.3 release as broken per discussion
PyMC3 3.11.3 (19 August 2021)
Release PyMC3 v3.11.3 (#4941) * Release PyMC3 v3.11.3 * Update RELEASE-NOTES.md