Remove deprecated FixedNoiseGP (#2536)
Summary:
Pull Request resolved: #2536

This model has been deprecated for a while and replaced by `SingleTaskGP`.
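For downstream code that still constructs a `FixedNoiseGP`, the replacement is a one-line change: pass the observed noise variances to `SingleTaskGP` via `train_Yvar`. A minimal illustrative sketch (toy data, not part of this diff):

```python
import torch
from botorch.models import SingleTaskGP

# Toy training data (stand-in for real observations).
train_X = torch.rand(10, 3, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
train_Yvar = torch.full_like(train_Y, 0.25**2)  # known observation noise variances

# Before: model = FixedNoiseGP(train_X, train_Y, train_Yvar)  # removed in this commit
# After: the same fixed-noise behavior via SingleTaskGP
model = SingleTaskGP(train_X, train_Y, train_Yvar=train_Yvar)
```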

Reviewed By: saitcakmak

Differential Revision: D62680578

fbshipit-source-id: e3a17f97ffb0d22fe929a31c9de8e71823f5b83c
Balandat authored and facebook-github-bot committed Sep 16, 2024
1 parent e5c7451 commit 18eb95a
Showing 8 changed files with 30 additions and 87 deletions.
7 changes: 1 addition & 6 deletions botorch/models/__init__.py
@@ -17,11 +17,7 @@
 from botorch.models.fully_bayesian import SaasFullyBayesianSingleTaskGP
 from botorch.models.fully_bayesian_multitask import SaasFullyBayesianMultiTaskGP

-from botorch.models.gp_regression import (
-    FixedNoiseGP,
-    HeteroskedasticSingleTaskGP,
-    SingleTaskGP,
-)
+from botorch.models.gp_regression import HeteroskedasticSingleTaskGP, SingleTaskGP
 from botorch.models.gp_regression_fidelity import SingleTaskMultiFidelityGP
 from botorch.models.gp_regression_mixed import MixedSingleTaskGP
 from botorch.models.higher_order_gp import HigherOrderGP
@@ -34,7 +30,6 @@
     "AffineDeterministicModel",
     "AffineFidelityCostModel",
     "ApproximateGPyTorchModel",
-    "FixedNoiseGP",
     "SaasFullyBayesianSingleTaskGP",
     "SaasFullyBayesianMultiTaskGP",
     "GenericDeterministicModel",
65 changes: 15 additions & 50 deletions botorch/models/gp_regression.py
@@ -10,22 +10,22 @@
 These models are often a good starting point and are further documented in the
 tutorials.

-`SingleTaskGP`, `FixedNoiseGP`, and `HeteroskedasticSingleTaskGP` are all
-single-task exact GP models, differing in how they treat noise. They use
-relatively strong priors on the Kernel hyperparameters, which work best when
-covariates are normalized to the unit cube and outcomes are standardized (zero
-mean, unit variance).
+`SingleTaskGP` and `HeteroskedasticSingleTaskGP` are single-task exact GP models,
+differing in how they treat noise. They use relatively strong priors on the Kernel
+hyperparameters, which work best when covariates are normalized to the unit cube
+and outcomes are standardized (zero mean, unit variance). By default, these models
+use a `Standardize` outcome transform, which applies this standardization. However,
+they do not (yet) use an input transform by default.

 These models all work in batch mode (each batch having its own hyperparameters).
 When the training observations include multiple outputs, these models use
 batching to model outputs independently.

 These models all support multiple outputs. However, as single-task models,
-`SingleTaskGP`, `FixedNoiseGP`, and `HeteroskedasticSingleTaskGP` should be
-used only when the outputs are independent and all use the same training data.
-If outputs are independent and outputs have different training data, use the
-`ModelListGP`. When modeling correlations between outputs, use a multi-task
-model like `MultiTaskGP`.
+`SingleTaskGP` and `HeteroskedasticSingleTaskGP` should be used only when the
+outputs are independent and all use the same training data. If outputs are
+independent and outputs have different training data, use the `ModelListGP`.
+When modeling correlations between outputs, use a multi-task model like `MultiTaskGP`.
 """

 from __future__ import annotations
@@ -255,49 +255,14 @@ def forward(self, x: Tensor) -> MultivariateNormal:
         return MultivariateNormal(mean_x, covar_x)


-class FixedNoiseGP(SingleTaskGP):
-    r"""A single-task exact GP model using fixed noise levels.
-
-    DEPRECATED: `FixedNoiseGP` has been merged into `SingleTaskGP`. Please use
-    `SingleTaskGP` with `train_Yvar` instead.
-    Will be removed in a future release (~v0.12).
-    """
-
-    def __init__(
-        self,
-        train_X: Tensor,
-        train_Y: Tensor,
-        train_Yvar: Tensor,
-        covar_module: Optional[Module] = None,
-        mean_module: Optional[Mean] = None,
-        outcome_transform: Optional[OutcomeTransform] = None,
-        input_transform: Optional[InputTransform] = None,
-    ) -> None:
-        r"""DEPRECATED. See SingleTaskGP."""
-        warnings.warn(
-            "`FixedNoiseGP` has been merged into `SingleTaskGP`. "
-            "Please use `SingleTaskGP` with `train_Yvar` instead.",
-            DeprecationWarning,
-        )
-        super().__init__(
-            train_X=train_X,
-            train_Y=train_Y,
-            train_Yvar=train_Yvar,
-            covar_module=covar_module,
-            mean_module=mean_module,
-            outcome_transform=outcome_transform,
-            input_transform=input_transform,
-        )
-
-
 class HeteroskedasticSingleTaskGP(BatchedMultiOutputGPyTorchModel, ExactGP):
     r"""A single-task exact GP model using a heteroskedastic noise model.

-    This model differs from `SingleTaskGP` in that noise levels are provided
-    rather than inferred, and differs from `FixedNoiseGP` in that it can
-    predict noise levels out of sample, because it internally wraps another
-    GP (a SingleTaskGP) to model the observation noise.
-    Noise levels must be provided to `HeteroskedasticSingleTaskGP` as `train_Yvar`.
+    This model differs from `SingleTaskGP` with observed observation noise
+    variances (`train_Yvar`) in that it can predict noise levels out of sample.
+    This is achieved by internally wrapping another GP (a `SingleTaskGP`) to model
+    the (log of) the observation noise. Noise levels must be provided to
+    `HeteroskedasticSingleTaskGP` as `train_Yvar`.

     Examples of cases in which noise levels are known include online
     experimentation and simulation optimization.
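The updated docstring above distinguishes fixed observed noise (handled by `SingleTaskGP` via `train_Yvar`) from a fitted heteroskedastic noise model. A hedged usage sketch of the two options (toy data; constructor signatures as of this commit):

```python
import torch
from botorch.models.gp_regression import HeteroskedasticSingleTaskGP, SingleTaskGP

train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = (train_X**2).sum(dim=-1, keepdim=True)
train_Yvar = 0.01 + 0.05 * train_X[:, :1]  # per-point observed noise variances

# Fixed noise: the provided variances are used as-is; noise is not
# predicted at new test points.
fixed_noise_model = SingleTaskGP(train_X, train_Y, train_Yvar=train_Yvar)

# Heteroskedastic: an internal SingleTaskGP is fit to the (log) variances,
# so noise levels can also be predicted out of sample.
hetero_model = HeteroskedasticSingleTaskGP(train_X, train_Y, train_Yvar)
```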
23 changes: 3 additions & 20 deletions test/models/test_gp_regression.py
@@ -11,11 +11,7 @@
 import torch
 from botorch.exceptions.warnings import OptimizationWarning
 from botorch.fit import fit_gpytorch_mll
-from botorch.models.gp_regression import (
-    FixedNoiseGP,
-    HeteroskedasticSingleTaskGP,
-    SingleTaskGP,
-)
+from botorch.models.gp_regression import HeteroskedasticSingleTaskGP, SingleTaskGP
 from botorch.models.transforms import Normalize, Standardize
 from botorch.models.transforms.input import InputStandardize
 from botorch.models.transforms.outcome import Log
@@ -459,8 +455,7 @@ def test_construct_inputs_task_feature_deprecated(self) -> None:
             model.construct_inputs(training_data, task_feature=0)


-class TestFixedNoiseGP(TestSingleTaskGP):
-    model_class = FixedNoiseGP
+class TestSingleTaskGPFixedNoise(TestSingleTaskGP):

     def _get_model_and_data(
         self,
@@ -480,14 +475,7 @@ def _get_model_and_data(
             "input_transform": input_transform,
             "outcome_transform": outcome_transform,
         }
-        if self.model_class is FixedNoiseGP:
-            with self.assertWarnsRegex(
-                DeprecationWarning,
-                "`FixedNoiseGP` has been merged into `SingleTaskGP`. ",
-            ):
-                model = FixedNoiseGP(**model_kwargs, **extra_model_kwargs)
-        else:
-            model = self.model_class(**model_kwargs, **extra_model_kwargs)
+        model = SingleTaskGP(**model_kwargs, **extra_model_kwargs)
         return model, model_kwargs

     def _get_extra_model_kwargs(self):
@@ -598,11 +586,6 @@ def test_fantasized_noise(self):
         )


-class TestFixedNoiseSingleTaskGP(TestFixedNoiseGP):
-    # Repeat the FixedNoiseGP tests using SingleTaskGP.
-    model_class = SingleTaskGP
-
-
 class TestHeteroskedasticSingleTaskGP(TestGPRegressionBase):
     def _get_model_and_data(
         self, batch_shape, m, outcome_transform=None, input_transform=None, **tkwargs
2 changes: 1 addition & 1 deletion tutorials/batch_mode_cross_validation.ipynb
@@ -143,7 +143,7 @@
 "source": [
 "### Perform LOOCV\n",
 "\n",
-"We can use the `batch_cross_validation` function to perform LOOCV using batching (meaning that the `b = 20` sets of training data can be fit as `b = 20` separate GP models with separate hyperparameters in parallel through GPyTorch) and return a CVResult tuple with the batched `GPyTorchPosterior` object over the LOOCV test points and the observed targets. The `batch_cross_validation` requires a model class (`model_cls`) and a marginal log likelihood class (`mll_cls`). Since we have an observed and constant noise level, we will use the FixedNoiseGP as the `model_cls` and an ExactMarginalLogLikelihood as the `mll_cls`."
+"We can use the `batch_cross_validation` function to perform LOOCV using batching (meaning that the `b = 20` sets of training data can be fit as `b = 20` separate GP models with separate hyperparameters in parallel through GPyTorch) and return a CVResult tuple with the batched `GPyTorchPosterior` object over the LOOCV test points and the observed targets. The `batch_cross_validation` requires a model class (`model_cls`) and a marginal log likelihood class (`mll_cls`). We will use the SingleTaskGP as the `model_cls` and an ExactMarginalLogLikelihood as the `mll_cls`."
 ]
 },
 {
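To make the updated tutorial text concrete, here is a rough LOOCV sketch in the spirit of that notebook. The fold-generation helper `gen_loo_cv_folds` and the result fields are assumptions based on the tutorial's description; the notebook itself is the authoritative version.

```python
import torch
from botorch.cross_validation import batch_cross_validation, gen_loo_cv_folds
from botorch.models import SingleTaskGP
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data standing in for the tutorial's 20 training points.
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = torch.sin(2 * train_X.sum(dim=-1, keepdim=True))

cv_folds = gen_loo_cv_folds(train_X=train_X, train_Y=train_Y)  # b = 20 LOOCV folds
cv_results = batch_cross_validation(
    model_cls=SingleTaskGP,
    mll_cls=ExactMarginalLogLikelihood,
    cv_folds=cv_folds,
)
# cv_results.posterior is the batched GPyTorchPosterior over the held-out points;
# cv_results.observed_Y holds the corresponding observed targets.
```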
2 changes: 1 addition & 1 deletion tutorials/bo_with_warped_gp.ipynb
@@ -161,7 +161,7 @@
 "metadata": {},
 "source": [
 "#### Input warping and model initialization\n",
-"We initialize the `Warp` input transformation and pass it a `FixedNoiseGP` to model the noiseless objective. The `Warp` object is a `torch.nn.Module` that contains the concentration parameters and applies the warping function in the `Model`'s `forward` pass."
+"We initialize the `Warp` input transformation and pass it a `SingleTaskGP` to model the noiseless objective. The `Warp` object is a `torch.nn.Module` that contains the concentration parameters and applies the warping function in the `Model`'s `forward` pass."
 ]
 },
 {
8 changes: 4 additions & 4 deletions tutorials/closed_loop_botorch_only.ipynb
@@ -106,7 +106,7 @@
 "\n",
 "We use a `MultiOutputGP` to model the objective (output 0) and the constraint (output 1). We assume known homoskedastic observation noise on both the objective and constraint with standard error $\\sigma = 0.5$. \n",
 "\n",
-"Each component is a `FixedNoiseGP`. The models are initialized with 10 points drawn randomly from $[0,1]^6$."
+"Each component is a `SingleTaskGP`. The models are initialized with 10 points drawn randomly from $[0,1]^6$."
 ]
 },
 {
@@ -123,7 +123,7 @@
 "outputs": [],
 "source": [
 "from botorch.models.transforms.input import Normalize\n",
-"from botorch.models import FixedNoiseGP, ModelListGP\n",
+"from botorch.models import SingleTaskGP, ModelListGP\n",
 "from gpytorch.mlls.sum_marginal_log_likelihood import SumMarginalLogLikelihood\n",
 "\n",
 "NOISE_SE = 0.25\n",
@@ -143,13 +143,13 @@
 "\n",
 "def initialize_model(train_x, train_obj, train_con, state_dict=None):\n",
 "    # define models for objective and constraint\n",
-"    model_obj = FixedNoiseGP(\n",
+"    model_obj = SingleTaskGP(\n",
 "        train_x,\n",
 "        train_obj,\n",
 "        train_yvar.expand_as(train_obj),\n",
 "        input_transform=Normalize(d=train_x.shape[-1]),\n",
 "    ).to(train_x)\n",
-"    model_con = FixedNoiseGP(\n",
+"    model_con = SingleTaskGP(\n",
 "        train_x,\n",
 "        train_con,\n",
 "        train_yvar.expand_as(train_con),\n",
6 changes: 3 additions & 3 deletions tutorials/multi_objective_bo.ipynb
@@ -119,7 +119,7 @@
 "source": [
 "#### Model initialization\n",
 "\n",
-"We use a list of `FixedNoiseGP`s to model the two objectives with known noise variances. Homoskedastic noise levels can be inferred by using `SingleTaskGP`s instead of `FixedNoiseGP`s.\n",
+"We use a list of `SingleTaskGP`s to model the two objectives with known noise variances. If no noise variances were provided, `SingleTaskGP` would infer (homoskedastic) noise levels instead.\n",
 "\n",
 "The models are initialized with $2(d+1)=6$ points drawn randomly from $[0,1]^2$."
 ]
@@ -135,7 +135,7 @@
 "customOutput": null
 },
 "source": [
-"from botorch.models.gp_regression import FixedNoiseGP\n",
+"from botorch.models.gp_regression import SingleTaskGP\n",
 "from botorch.models.model_list_gp_regression import ModelListGP\n",
 "from botorch.models.transforms.outcome import Standardize\n",
 "from gpytorch.mlls.sum_marginal_log_likelihood import SumMarginalLogLikelihood\n",
@@ -161,7 +161,7 @@
 "    train_y = train_obj[..., i : i + 1]\n",
 "    train_yvar = torch.full_like(train_y, NOISE_SE[i] ** 2)\n",
 "    models.append(\n",
-"        FixedNoiseGP(\n",
+"        SingleTaskGP(\n",
 "            train_x, train_y, train_yvar, outcome_transform=Standardize(m=1)\n",
 "        )\n",
 "    )\n",
4 changes: 2 additions & 2 deletions tutorials/robust_multi_objective_bo.ipynb
@@ -317,7 +317,7 @@
 "executionStopTime": 1668650187975
 },
 "source": [
-"from botorch.models.gp_regression import FixedNoiseGP\n",
+"from botorch.models.gp_regression import SingleTaskGP\n",
 "from botorch.models.model_list_gp_regression import ModelListGP\n",
 "from gpytorch.mlls import SumMarginalLogLikelihood\n",
 "from botorch.models.transforms.outcome import Standardize\n",
@@ -336,7 +336,7 @@
 "    models = []\n",
 "    for i in range(train_y.shape[-1]):\n",
 "        models.append(\n",
-"            FixedNoiseGP(\n",
+"            SingleTaskGP(\n",
 "                train_X=train_x,\n",
 "                train_Y=train_y[..., i : i + 1],\n",
 "                train_Yvar=train_Yvar[..., i : i + 1],\n",
