From d5e2dad56f89cbb4d5065acd806145b68c2ba196 Mon Sep 17 00:00:00 2001
From: Elizabeth Santorella
Date: Tue, 17 Sep 2024 07:57:13 -0700
Subject: [PATCH] CHANGELOG for 0.12.0 release (#2522)

Summary:
Pull Request resolved: https://github.com/pytorch/botorch/pull/2522

changelog

Reviewed By: saitcakmak

Differential Revision: D62405336
---
 CHANGELOG.md | 67 ++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 67 insertions(+)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index c8e63a62ab..2ab4901f1f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,6 +2,73 @@
 The release log for BoTorch.
 
+## [0.12.0] -- Sep 17, 2024
+
+#### Major changes
+* Update default hyperparameter priors for most models to use dimension-scaled log-normal
+  priors, which make performance much more robust to dimensionality. See
+  discussion #2451 for details. The only models that are not changed are fully
+  Bayesian models, `PairwiseGP`, and fidelity kernels for multi-fidelity models;
+  for models that utilize a composite kernel, such as
+  multi-fidelity/task/context, this change only affects the base kernel (#2449,
+  #2450, #2507).
+* Use `Standardize` by default for all models that use the upgraded priors. In
+  addition to reducing the amount of boilerplate needed to initialize a model,
+  this change was motivated by the change to default priors, because the new
+  priors will work less well when data is not standardized. Users who do not
+  want to use transforms should explicitly pass in `None` (#2458, #2532).
+
+#### Compatibility
+* Unpin NumPy (#2459).
+* Require PyTorch>=2.0.1, GPyTorch==1.13, and linear_operator==0.5.3 (#2511).
+
+#### New features
+* Introduce `PathwiseThompsonSampling` acquisition function (#2443).
+* Enable `qBayesianActiveLearningByDisagreement` to accept a posterior
+  transform, and improve its implementation (#2457).
+* Enable `SaasPyroModel` to sample via NUTS when training data is empty (#2465).
+* Add multi-objective `qBayesianActiveLearningByDisagreement` (#2475).
+* Add input constructor for `qNegIntegratedPosteriorVariance` (#2477).
+* Introduce `qLowerConfidenceBound` (#2517).
+* Add input constructor for `qMultiFidelityHypervolumeKnowledgeGradient` (#2524).
+* Add `posterior_transform` to `ApproximateGPyTorchModel.posterior` (#2531).
+
+#### Bug fixes
+* Fix `batch_shape` default in `OrthogonalAdditiveKernel`.
+* Ensure all tensors are on CPU in `HitAndRunPolytopeSampler` (#2502).
+* Fix duplicate logging in `generation/gen.py` (#2504).
+* Raise exception if `X_pending` is set on underlying `AcquisitionFunction` in
+  prior-guided `AcquisitionFunction` (#2505).
+* Make affine input transforms error on data of incorrect dimension, even in
+  eval mode (#2510).
+* Use fidelity-aware `current_value` in input constructor for `qMultiFidelityKnowledgeGradient` (#2519).
+* Apply input transforms when computing MLL in model closures (#2527).
+* Detach `fval` in `torch_minimize` to remove an opportunity for memory leaks
+  (#2529).
+
+#### Documentation
+* Clarify incompatibility of inter-point constraints with `get_polytope_samples`
+  (#2469).
+* Update tutorials to use the log variants of EI-family acquisition functions,
+  stop passing `Standardize` unnecessarily in tutorials, and make other
+  simplifications and cleanups (#2462, #2463, #2490, #2495, #2496, #2498, #2499).
+
+#### Deprecations
+* Remove deprecated `FixedNoiseGP` (#2536).
+* Remove deprecated argument `data_fidelity` to `SingleTaskMultiFidelityGP` and
+  deprecated model `FixedNoiseMultiFidelityGP` (#2532).
+
+#### Other changes
+* More informative warnings about failure to standardize or normalize data
+  (#2489).
+* Suppress irrelevant warnings in `qHypervolumeKnowledgeGradient` helpers
+  (#2486).
+* Cleaner `botorch/acquisition/multi_objective` directory structure (#2485).
+* For `AffineInputTransform`, always require data to have at least two dimensions
+  (#2518).
+* Raise an `OptimizationGradientError` when optimization produces NaN gradients (#2537).
+* Improve numerics by replacing `torch.log(1 + x)` with `torch.log1p(x)`
+  and `torch.exp(x) - 1` with `torch.special.expm1(x)` (#2539, #2540, #2541).
+
+
 ## [0.11.3] -- Jul 22, 2024
 
 #### Compatibility