
Support multiple data fidelity dimensions in MultiFidelityGP models #1956

Closed · 22 commits

Conversation
Conversation

@AlexanderMouton (Contributor) commented Jul 30, 2023


Motivation

Both the SingleTaskMultiFidelityGP and the FixedNoiseMultiFidelityGP previously allowed only one data-fidelity dimension. This PR generalises them by adding the option of passing in a list of data-fidelity dimensions. When multiple data-fidelity dimensions are given, a kernel is created for each dimension and the per-dimension kernels are combined in a product kernel. This is useful for situations such as the one discussed in #1942 with @Balandat.
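
As a rough illustration of that construction, here is a minimal sketch assuming gpytorch-style kernels; the helper name is hypothetical, and the kernel choices (Matérn over non-fidelity inputs, ExponentialDecayKernel over fidelity inputs) are plausible defaults rather than necessarily what this PR implements:

```python
# Illustrative sketch only, not the exact BoTorch internals: one kernel per
# data-fidelity column, combined via a ProductKernel.
from typing import List

from botorch.models.kernels.exponential_decay import ExponentialDecayKernel
from gpytorch.kernels import MaternKernel, ProductKernel, ScaleKernel


def make_multi_fidelity_kernel(dim: int, data_fidelities: List[int]):
    """Hypothetical helper, not part of the BoTorch API."""
    non_fidelity_dims = [i for i in range(dim) if i not in data_fidelities]
    kernels = [
        # base kernel over the non-fidelity inputs
        MaternKernel(
            nu=2.5,
            active_dims=non_fidelity_dims,
            ard_num_dims=len(non_fidelity_dims),
        )
    ]
    for d in data_fidelities:
        # one fidelity kernel per data-fidelity dimension
        kernels.append(ExponentialDecayKernel(active_dims=[d]))
    return ScaleKernel(ProductKernel(*kernels))
```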

Have you read the Contributing Guidelines on pull requests?

Yes.

Test Plan

I changed code in botorch/models/gp_regression_fidelity.py and the accompanying test code in test/models/test_gp_regression_fidelity.py. The logic remains unchanged in the case of a single data-fidelity dimension. The changes allowing multiple fidelity dimensions were tested by adding appropriate cases to the FIDELITY_TEST_PAIRS list in the TestSingleTaskMultiFidelityGP class.
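
For illustration, constructing a model over two data-fidelity dimensions could look like the sketch below (toy data; the `data_fidelities` argument name follows the review discussion further down):

```python
import torch
from botorch.models import SingleTaskMultiFidelityGP

train_X = torch.rand(20, 4, dtype=torch.double)  # columns 2 and 3 are fidelity parameters
train_Y = torch.rand(20, 1, dtype=torch.double)
model = SingleTaskMultiFidelityGP(train_X, train_Y, data_fidelities=[2, 3])
```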

Furthermore, the changes were tested on a toy minimisation problem with two data-fidelity dimensions and used to create the following plots:

[Two result plots for the toy problem]

where g11 has fidelities [1.0, 1.0], g01 has fidelities [0.75, 1.0], g10 has fidelities [1.0, 0.75] and g00 has fidelities [0.75, 0.75].

Related PRs

N/A

generalises SingleTaskMultiFidelityGP and FixedNoiseMultiFidelityGP to allow for multiple data fidelity dimensions
@facebook-github-bot added the CLA Signed label Jul 30, 2023
@codecov bot commented Aug 5, 2023

Codecov Report

Merging #1956 (da652d7) into main (b757b20) will not change coverage.
Report is 9 commits behind head on main.
The diff coverage is 100.00%.

❗ Current head da652d7 differs from pull request most recent head c56eb14. Consider uploading reports for the commit c56eb14 to get more accurate results.

@@            Coverage Diff            @@
##              main     #1956   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          178       178           
  Lines        15748     15739    -9     
=========================================
- Hits         15748     15739    -9     
Files Changed Coverage Δ
botorch/acquisition/knowledge_gradient.py 100.00% <ø> (ø)
botorch/acquisition/max_value_entropy_search.py 100.00% <ø> (ø)
botorch/acquisition/multi_step_lookahead.py 100.00% <ø> (ø)
botorch/models/gp_regression_mixed.py 100.00% <ø> (ø)
botorch/acquisition/input_constructors.py 100.00% <100.00%> (ø)
botorch/acquisition/joint_entropy_search.py 100.00% <100.00%> (ø)
botorch/acquisition/monte_carlo.py 100.00% <100.00%> (ø)
botorch/acquisition/multi_objective/monte_carlo.py 100.00% <100.00%> (ø)
botorch/acquisition/utils.py 100.00% <100.00%> (ø)
botorch/models/gp_regression_fidelity.py 100.00% <100.00%> (ø)
... and 3 more


@Balandat (Contributor) left a comment

Thanks for this thorough contribution. I have mostly cosmetic comments, as well as a suggestion for how to accept data fidelities going forward (which may make fixing the lint errors unnecessary).

@@ -66,7 +66,7 @@ def __init__(
         train_X: Tensor,
         train_Y: Tensor,
         iteration_fidelity: Optional[int] = None,
-        data_fidelity: Optional[int] = None,
+        data_fidelity: Optional[int | List[int]] = None,

While I like this, the | operator only works in py3.10+, and we still need to be compatible with py3.9

Suggested change
data_fidelity: Optional[int | List[int]] = None,
data_fidelity: Optional[Union[int, List[int]]] = None,


Could also make sense to have a data_fidelities argument that only accepts lists (or tuples) and require people to use that even for the single-fidelity case; that way we wouldn't have to overload the arg types. This could be done nicely by keeping the data_fidelity arg around but marking it as deprecated (and raising a deprecation warning if used in the constructor). Wdyt?

cc @saitcakmak, @esantorella for thoughts.
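
A minimal sketch of that deprecation pattern, with a hypothetical helper name; this shows the general technique, not the code that was ultimately merged:

```python
import warnings
from typing import List, Optional


def _resolve_data_fidelities(
    data_fidelity: Optional[int] = None,
    data_fidelities: Optional[List[int]] = None,
) -> Optional[List[int]]:
    """Hypothetical helper normalizing the deprecated arg onto the new one."""
    if data_fidelity is not None:
        if data_fidelities is not None:
            raise ValueError(
                "Cannot specify both `data_fidelity` and `data_fidelities`."
            )
        warnings.warn(
            "`data_fidelity` is deprecated, use `data_fidelities` instead.",
            DeprecationWarning,
        )
        data_fidelities = [data_fidelity]
    return data_fidelities
```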

(Several inline review threads on botorch/models/gp_regression_fidelity.py, since resolved.)
@Balandat changed the title from "generalises SingleTaskMultiFidelityGP and FixedNoiseMultiFidelityGP to allow for multiple data fidelity dimensions" to "Support multiple data fidelity dimensions in MultiFidelityGP models" Aug 8, 2023
esantorella and others added 6 commits August 8, 2023 18:01
Summary:
Pull Request resolved: pytorch#1972

## Motivation

Made a couple of minor corrections to the BOPE tutorial, bringing the text in line with the code. The code and text probably drifted apart over time.

### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?

Yes

Pull Request resolved: pytorch#1971

Test Plan: Check that it renders okay in GH

Reviewed By: esantorella

Differential Revision: D48175913

Pulled By: saitcakmak

fbshipit-source-id: f312366955ffc69683b06c37ea971e595f1cf738
Summary:
Pull Request resolved: pytorch#1945

This diff deprecates `FixedNoiseDataset` and merges it into `SupervisedDataset` with Yvar becoming an optional field. This also simplifies the class hierarchy a bit, removing `SupervisedDatasetMeta` in favor of an `__init__` method.

I plan to follow up on this by adding optional metric names to datasets and introducing a MultiTaskDataset, which will simplify some of the planned work in Ax MBM.

Reviewed By: esantorella

Differential Revision: D47729430

fbshipit-source-id: 551cd78a02755505573b10ea1f075aa21f838ab7
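
As a rough sketch of the merged dataset API described in the summary above (argument names are assumed from that description, not verified against this exact revision):

```python
import torch
from botorch.utils.datasets import SupervisedDataset

X = torch.rand(10, 3, dtype=torch.double)
Y = torch.rand(10, 1, dtype=torch.double)

# Yvar is now an optional field, so both cases use the same class:
noiseless = SupervisedDataset(X=X, Y=Y)
fixed_noise = SupervisedDataset(X=X, Y=Y, Yvar=torch.full_like(Y, 0.01))
```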
Summary:
X-link: facebook/Ax#1771

Pull Request resolved: pytorch#1973

Currently, constraints are not used in single objective AFs in MBM due to a name mismatch between `outcome_constraints` and `constraints`.

Reviewed By: SebastianAment

Differential Revision: D48176978

fbshipit-source-id: 9495708002c11a874bb6b8c06327f0f4643039df
Summary: see title

Reviewed By: saitcakmak

Differential Revision: D48230494

fbshipit-source-id: ae5a2355a4ffe30357fdc4a333e3235e6106b6a0
Summary: `posterior_transform` is `Optional` in the constructor of the acquisition function, so it should be optional in the private methods as well.

Reviewed By: saitcakmak

Differential Revision: D48244486

fbshipit-source-id: f0850ecde5259d626cacbacd8c36401483a69d15
Summary:
see title. Deploy on release failed due to running on py3.8

https://github.com/pytorch/botorch/actions/runs/5826595266/job/15801310995#step:6:37

Reviewed By: saitcakmak, Balandat

Differential Revision: D48250526

fbshipit-source-id: 11412ad94bd237957f7e61c24f2f7bd890b630b1
@Balandat (Contributor)

Thanks for making these edits. The only outstanding things would be to:

  1. rebase on the most recent changes in main,
  2. add a couple of tests to bring the coverage back up to 100% (via assertWarnsRegex and assertRaisesRegex to test the new deprecation / error setup; see the sketch after this list), and
  3. update existing call sites in the tutorials, namely
    " train_x, train_obj, outcome_transform=Standardize(m=1), data_fidelity=6\n",
    and
    " train_x, train_obj, outcome_transform=Standardize(m=1), data_fidelity=6\n",

esantorella and others added 7 commits August 17, 2023 10:55
…d set of exceptions (pytorch#1872)

Summary:
X-link: facebook/Ax#1772

Pull Request resolved: pytorch#1872

[x] Remove unused arguments from input constructors and related functions. The idea is especially not to let unused keyword arguments disappear into `**kwargs` and be silently ignored
[x] Add arguments to some input constructors so they don't need any `**kwargs`
[x] Add a decorator that ensures that each input constructor can accept a certain set of keyword arguments, even if those are not used by the constructor, while still erroring on unexpected arguments (see the sketch after this commit message)
[ ] Prevent arguments from having different defaults in the input constructors than in the acquisition functions

Reviewed By: lena-kashtelyan

Differential Revision: D46519588

fbshipit-source-id: 8d727cb991f2899cc9e7ee68da7998e981ca8802
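
As referenced in the checklist above, here is a generic sketch of that decorator idea; the names are hypothetical and this illustrates the technique, not the actual BoTorch implementation:

```python
import functools
import inspect


def allow_ignored_kwargs(*allowed):
    """Let a function silently accept (and drop) specific extra keyword
    arguments while still erroring on anything else unexpected."""

    def decorator(fn):
        params = set(inspect.signature(fn).parameters)

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for key in allowed:
                if key not in params:
                    kwargs.pop(key, None)  # accepted but intentionally ignored
            return fn(*args, **kwargs)  # unknown kwargs still raise TypeError

        return wrapper

    return decorator


@allow_ignored_kwargs("objective_thresholds")
def make_acqf_inputs(model, training_data):
    return {"model": model, "training_data": training_data}
```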
Summary:
X-link: facebook/Ax#1782

Pull Request resolved: pytorch#1985

See previous diff

Reviewed By: lena-kashtelyan

Differential Revision: D48338443

fbshipit-source-id: f5f674fe224cf5a1ef06a6de9cdad10dfda09382
generalises SingleTaskMultiFidelityGP and FixedNoiseMultiFidelityGP to allow for multiple data fidelity dimensions
@AlexanderMouton (Contributor, Author)

Hi @Balandat

I've restored the code coverage to 100%, made the changes to the tutorials' documentation, and rebased onto the newest changes. :)

@facebook-github-bot (Contributor)

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@Balandat (Contributor)

Hmm, not sure why the doc build is failing; it seems like there is an issue with the torch install?

Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/sphinx/ext/autodoc/importer.py", line 64, in import_module
    return importlib.import_module(modname)
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/botorch/__init__.py", line 7, in <module>
    import gpytorch.settings as gp_settings
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/gpytorch/__init__.py", line 5, in <module>
    import linear_operator
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/linear_operator/__init__.py", line 2, in <module>
    from . import beta_features, operators, settings, utils
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/linear_operator/beta_features.py", line 5, in <module>
    from .settings import _feature_flag
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/linear_operator/settings.py", line 5, in <module>
    import torch
  File "/opt/hostedtoolcache/Python/3.9.17/x64/lib/python3.9/site-packages/torch/__init__.py", line 1157, in <module>
    from torch._C._VariableFunctions import *  # type: ignore[misc] # noqa: F403
ModuleNotFoundError: No module named 'torch._C._VariableFunctions'; 'torch._C' is not a package

I'll retry; hopefully this is just a temporary issue with the package on the GHA runner.

Summary:
Pull Request resolved: pytorch#1990

Sphinx 7.2.0 or later leads to doc build failures.

Reviewed By: esantorella

Differential Revision: D48487417

fbshipit-source-id: d2ceafa62e94f6a19f82529e868bd342cb691eb9
@AlexanderMouton (Contributor, Author)

Hi @Balandat

Is this an issue that I can fix from my side?

@Balandat (Contributor)

This turned out to be an issue caused by a sphinx version bump and was fixed in #1990. If you rebase on the latest changes in master, the issue should go away.

@AlexanderMouton (Contributor, Author)

I've rebased onto main again, so it should be fixed now, @Balandat.

@facebook-github-bot (Contributor)

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor)

@Balandat merged this pull request in 67fd49b.

Labels: CLA Signed, Merged
7 participants