(VariationalHierarchicalHyperparameters, DirichletKernelMulticlassClassification) decorator combination test causes unrelated tests to fail #378

Open
rg936672 (Contributor) opened this issue Aug 20, 2024 · 0 comments
Labels: bug (Something isn't working)

What's the problem?

This pair of decorators is normally excluded from the combination tests due to "can't aggregate multitask predictions". However, removing this exclusion does not just introduce the two expected failing tests (one for each ordering of the two decorators); it also causes several unrelated combinations involving DirichletMulticlassClassification to fail. This indicates that the tests are not properly isolated from one another.
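
The root cause hasn't been confirmed here, but the symptom is consistent with shared mutable state leaking between parametrized test cases. As a purely hypothetical illustration (none of these names come from Vanguard), a decorator that mutates a class in place, rather than returning a fresh subclass, will poison every later test that reuses that class:

    import pytest

    class SharedController:
        """Stand-in for a controller class reused across test cases."""
        num_tasks = 1

    def multitask_decorator(cls):
        # Bug: mutates the shared class in place instead of subclassing it.
        cls.num_tasks = 3
        return cls

    @pytest.mark.parametrize("apply_decorator", [True, False])
    def test_combinations(apply_decorator):
        cls = multitask_decorator(SharedController) if apply_decorator else SharedController
        # Once the True case has run, num_tasks is 3 for every subsequent
        # case too, so the False case fails despite being "unrelated".
        assert cls.num_tasks == (3 if apply_decorator else 1)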

How can we reproduce the issue?

Comment out the lines

    (
        VariationalHierarchicalHyperparameters,
        DirichletMulticlassClassification,
    ),  # can't aggregate multitask predictions

in test_decorator_combinations.py (in the definition of EXCLUDED_COMBINATIONS). Run the tests. Observe that unrelated tests (which passed before) now fail.
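
For reference, here is a minimal sketch of how an exclusion list like this is typically consumed when generating the pairwise parametrization. Apart from EXCLUDED_COMBINATIONS and the decorator names, everything below is an assumption rather than Vanguard's actual implementation (which may, for instance, exclude one ordering per entry or both):

    from itertools import permutations

    import pytest

    class VariationalHierarchicalHyperparameters: ...
    class DirichletMulticlassClassification: ...
    class SetWarp: ...

    DECORATORS = [
        VariationalHierarchicalHyperparameters,
        DirichletMulticlassClassification,
        SetWarp,
    ]

    EXCLUDED_COMBINATIONS = {
        (
            VariationalHierarchicalHyperparameters,
            DirichletMulticlassClassification,
        ),  # can't aggregate multitask predictions
    }

    # Every ordered (upper, lower) pair except the excluded ones.
    COMBINATIONS = [
        pair for pair in permutations(DECORATORS, 2)
        if pair not in EXCLUDED_COMBINATIONS
    ]

    @pytest.mark.parametrize("upper,lower", COMBINATIONS)
    def test_combinations(upper, lower):
        ...  # build a controller with both decorators applied, then fit it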

Python version

3.12

Package version

2.1.0

Operating system

No response

Other packages

No response

Relevant log output

For example:

FAILED [ 14%]
tests\units\test_decorator_combinations.py:186 (test_combinations[Upper: DirichletMulticlassClassification - Lower: SetWarp])
test_decorator_combinations.py:256: in test_combinations
    controller.fit(1)
..\..\vanguard\base\gpcontroller.py:148: in fit
    loss = self._sgd_round(n_iters=n_sgd_iters, gradient_every=gradient_every)
..\..\vanguard\warps\decorator.py:187: in _sgd_round
    loss = super()._sgd_round(n_iters=n_iters, gradient_every=gradient_every)
..\..\vanguard\base\basecontroller.py:381: in _sgd_round
    raise err
..\..\vanguard\base\basecontroller.py:364: in _sgd_round
    loss = self._single_optimisation_step(train_x, train_y, retain_graph=iter_num < n_iters - 1)
..\..\vanguard\base\basecontroller.py:405: in _single_optimisation_step
    loss = self._loss(x, y)
..\..\vanguard\classification\dirichlet.py:234: in _loss
    return super()._loss(train_x, train_y).sum()
..\..\vanguard\warps\decorator.py:207: in _loss
    nmll = super()._loss(train_x, warped_train_y)
..\..\vanguard\base\basecontroller.py:426: in _loss
    return -self._mll(output, train_y.squeeze(dim=-1))
..\..\vanguard\base\basecontroller.py:602: in __call__
    result = super().__call__(*args, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\module.py:31: in __call__
    outputs = self.forward(*inputs, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\mlls\exact_marginal_log_likelihood.py:66: in forward
    output = self.likelihood(function_dist, *params, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\likelihoods\gaussian_likelihood.py:469: in __call__
    return super().__call__(input, *args, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\likelihoods\likelihood.py:76: in __call__
    return self.marginal(input, *args, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\likelihoods\gaussian_likelihood.py:461: in marginal
    return super().marginal(function_dist, *args, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\likelihoods\gaussian_likelihood.py:359: in marginal
    return super().marginal(function_dist, *args, **kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\likelihoods\gaussian_likelihood.py:117: in marginal
    full_covar = covar + noise_covar
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\linear_operator\operators\_linear_operator.py:2774: in __add__
    return AddedDiagLinearOperator(self, other)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\lazy\lazy_tensor.py:46: in __init__
    return __orig_init__(self, *args, **new_kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\linear_operator\operators\added_diag_linear_operator.py:43: in __init__
    super(AddedDiagLinearOperator, self).__init__(*linear_ops, preconditioner_override=preconditioner_override)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\gpytorch\lazy\lazy_tensor.py:46: in __init__
    return __orig_init__(self, *args, **new_kwargs)
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\linear_operator\operators\sum_linear_operator.py:23: in __init__
    batch_shape = torch.broadcast_shapes(*[lt.batch_shape for lt in linear_ops])
..\..\..\..\.virtualenvs\Vanguard-ShqWj01e\Lib\site-packages\torch\functional.py:133: in broadcast_shapes
    raise RuntimeError("Shape mismatch: objects cannot be broadcast to a single shape")
E   RuntimeError: Shape mismatch: objects cannot be broadcast to a single shape
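
The final frame is torch.broadcast_shapes, which raises exactly this RuntimeError when the operands' batch shapes are incompatible. A standalone illustration (the shapes below are made up, not those from the failing test):

    import torch

    try:
        # Leading batch dims 3 and 2 are incompatible (neither is 1), so
        # broadcasting fails just as in the traceback above.
        torch.broadcast_shapes((3, 4, 4), (2, 4, 4))
    except RuntimeError as err:
        print(err)  # Shape mismatch: objects cannot be broadcast to a single shape

Given the failing line is full_covar = covar + noise_covar, this suggests that under this decorator combination the two covariances end up with different batch shapes (e.g. a per-class batch dimension on one but not the other), though that interpretation has not been verified.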