[Bug] AffineInputTransform does not check dimensions after training #2509

Closed
@slishak-PX

Description

🐛 Bug

`AffineInputTransform` does not always check the shape of input tensors when transforming or untransforming, which can lead to confusing behaviour: when a tensor of shape `[..., 1]` is passed where a tensor of shape `[..., d]` is expected, the transform silently broadcasts it up to the expected dimension when (un)transforming.
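
For context, the silent up-broadcasting is just ordinary elementwise broadcasting in the affine arithmetic. A minimal sketch of what effectively happens (the `offset`/`coefficient` names mirror `AffineInputTransform`'s attributes; the values here are illustrative, not learned):

```python
import torch

# A [..., 1] input broadcasts elementwise against [1, d] affine terms,
# so the single column is silently replicated across all d dimensions.
X = torch.randn(16, 1)          # shape [16, 1] instead of [16, 4]
offset = torch.zeros(1, 4)      # stands in for the learned per-dimension offset
coefficient = torch.ones(1, 4)  # stands in for the learned per-dimension scale
print(((X - offset) / coefficient).shape)  # torch.Size([16, 4])
```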

To reproduce

**Code snippet to reproduce**

```python
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize

n_inputs = 4
n_outputs = 1
n_train = 256
n_test = 16
device = torch.device("cpu")

train_x = torch.rand(n_train, n_inputs, dtype=torch.float64, device=device)
train_y = torch.randn(n_train, n_outputs, dtype=torch.float64, device=device)

# Note d=1 instead of n_inputs
test_x_incorrect_dim = torch.randn(n_test, 1, dtype=torch.float64, device=device)

gp_norm = SingleTaskGP(train_x, train_y, input_transform=Normalize(n_inputs))

# This doesn't raise an exception because the input_transform doesn't check the input dimensionality
# and ends up broadcasting up to n_inputs
posterior_incorrect = gp_norm.posterior(test_x_incorrect_dim)

# This shows the issue
print(gp_norm.input_transform.transform(test_x_incorrect_dim).shape)
# torch.Size([16, 4])

# This fails correctly with a sensible error message from GPyTorch
gp = SingleTaskGP(train_x, train_y)
posterior_incorrect_fails = gp.posterior(test_x_incorrect_dim)
```

**Stack trace/error message**
No error, which is the problem!

Expected Behavior

`self._check_shape(X)` should always be called during `_transform` and `_untransform`.
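
Until that lands, a possible workaround is a small subclass that re-runs the existing private shape check on every call. This is a minimal sketch, assuming `_check_shape` raises on a trailing-dimension mismatch as described above; `StrictNormalize` is a hypothetical name, not part of BoTorch:

```python
import torch
from botorch.models.transforms import Normalize


class StrictNormalize(Normalize):
    # Hypothetical workaround subclass: run the private _check_shape
    # validation on every (un)transform, not only during training.
    def _transform(self, X: torch.Tensor) -> torch.Tensor:
        self._check_shape(X)
        return super()._transform(X)

    def _untransform(self, X: torch.Tensor) -> torch.Tensor:
        self._check_shape(X)
        return super()._untransform(X)
```

With `input_transform=StrictNormalize(n_inputs)` in the snippet above, `gp_norm.posterior(test_x_incorrect_dim)` should raise instead of silently broadcasting.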
