[Feature Request] Make `gen_batch_initial_conditions` more flexible
#1776
Labels: enhancement (New feature or request)
Comments
Sounds like a great improvement to me! Happy to review a PR for this.
facebook-github-bot pushed a commit that referenced this issue on Apr 12, 2023:
Summary:
## Motivation
This PR adds the feature regarding additional flexibility of `gen_batch_initial_conditions` as discussed in issue #1776.
### Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)?
Yes.
Pull Request resolved: #1779
Test Plan: Unit tests.
Reviewed By: SebastianAment, esantorella
Differential Revision: D44739865
Pulled By: Balandat
fbshipit-source-id: ab805a547415d56bde35650da84bf898c3b97418
🚀 Feature Request
Motivation
When optimizing an acquisition function, it is possible that the default starting-point sampler is not sufficient (for example, when dealing with non-linear constraints or NChooseK constraints). In these cases one can provide an initializer method via the `ic_generator` argument, or provide samples directly via the `batch_initial_conditions` keyword. Looking at the default generator `gen_batch_initial_conditions`, one sees that it includes a lot of logic besides simple sampling, namely Boltzmann weighting of the samples, catching of zero acquisition-function values, etc. I propose to make this logic more broadly available, so that it can easily be used in situations where the default initialization does not work and custom samplers have to be used.
Pitch
For this purpose, I propose to add a new optional argument to `gen_batch_initial_conditions` which accepts a callable of type `Callable[[int, int], Tensor]` that just returns samples, which are then further processed (Boltzmann weighted based on the acquisition-function values, etc.). The two arguments of the function would be `n` and `q`, and the returned Tensor would be of shape `n x q x d`.

What do you think?
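To make the proposed signature concrete, here is a minimal sketch of what such a callable could look like. The generator name, the box bounds, and the uniform sampling strategy are all illustrative assumptions, not part of the proposal; a real use case would instead draw samples from, say, a non-linearly constrained set and hand the callable to `gen_batch_initial_conditions` for the downstream Boltzmann weighting.

```python
import torch

# Hypothetical box bounds, shape 2 x d (here d = 2); purely for illustration.
bounds = torch.tensor([[0.0, 0.0], [1.0, 2.0]])


def uniform_sample_generator(n: int, q: int) -> torch.Tensor:
    """Toy sampler matching the proposed Callable[[int, int], Tensor] type.

    Given n (number of raw candidate batches) and q (batch size), returns
    an `n x q x d` tensor of raw samples that the default initializer
    could then weight and filter.
    """
    lower, upper = bounds[0], bounds[1]
    d = bounds.shape[-1]
    # Rescale unit-cube draws into the bounds.
    return lower + (upper - lower) * torch.rand(n, q, d)


samples = uniform_sample_generator(4, 3)
print(samples.shape)  # torch.Size([4, 3, 2])
```

The key point is only the shape contract: whatever sampling logic the callable uses internally, it returns an `n x q x d` tensor that slots into the existing post-processing.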
Are you willing to open a pull request? Yes.