add base implementation of optimum in OptimizerInterface #76
Conversation
Depends on whether the language will accept it, but I'm generally okay with that.
👍 Only question I have is what the default should be. How often do you need a copy, and what's the potential damage of not always returning a copy by default?
For the record, I believe this is to address issue #71, which one of the students reported.
Looks good. I think this class is the best place for the implementation, though it should never have been named an OptimizerInterface. So if it's not too much work to rename the class to OptimizerBase, I think that would be a nice improvement.
I think the potential danger is the copy overhead. I assume that to be negligible, based on no data at all. If you call "optimum" in a tight loop then the copy might become a bottleneck, but I would call that premature optimization.
Oh, my fear was the other way: by returning a reference, not a copy, you might risk unintended modification of the internals. I think the risk of it being slow would be somewhat more obvious and benign (it just takes longer; it's not that we've accidentally altered internal data, which is harder to track down).
Oh yeah, I totally agree, the overwriting is the much bigger risk. The current implementation basically requires
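The copy-versus-reference trade-off discussed above can be sketched like this (OptimizerBase's internals here are purely illustrative, not the actual MLOS code):

```python
class OptimizerBase:
    """Illustrative sketch: stores (config, objective) observations."""

    def __init__(self):
        self._observations = []  # internal list of (config dict, objective value)

    def register(self, config, objective):
        # Store our own copy so later caller mutations can't reach us.
        self._observations.append((dict(config), float(objective)))

    def optimum(self, copy=True):
        best_config, best_value = min(self._observations, key=lambda obs: obs[1])
        # copy=True (the safe default) shields internal state from the caller;
        # copy=False avoids the overhead but risks unintended modification.
        return (dict(best_config) if copy else best_config), best_value


opt = OptimizerBase()
opt.register({"x": 1.0}, 5.0)
opt.register({"x": 2.0}, 3.0)
cfg, val = opt.optimum()
cfg["x"] = 999.0                 # mutates only the caller's copy
print(opt.optimum()[0]["x"])     # → 2.0, internals untouched
```

With the default copy, the caller's mutation is harmless; with copy=False, the same line would silently corrupt the optimizer's history.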
Co-authored-by: Brian Kroth <bpkroth@users.noreply.github.com>
self.logger.info(f"Optimum: {bayesian_optimizer.optimum()}")
self.logger.info(f"Optimum: {bayesian_optimizer.optimum()[1]}")
The [1] element seems like the more common value to be asking about; maybe it should be returned first?
Also, named values/properties might be better.
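The "named values" idea could be sketched with a NamedTuple return type (the Optimum class and the values below are hypothetical illustrations, not part of the PR):

```python
from typing import Any, Dict, NamedTuple


class Optimum(NamedTuple):
    """Named fields avoid opaque positional access like optimum()[1]."""
    config: Dict[str, Any]
    value: float


def optimum() -> Optimum:
    # Stand-in for the optimizer method; the values are made up for illustration.
    return Optimum(config={"x": 2.0}, value=3.5)


best = optimum()
print(best.value)    # reads more clearly than best[1]
```

Since a NamedTuple is still a tuple, existing positional callers like optimum()[1] would keep working unchanged.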
Having (configuration, optimum) makes more sense to me, and I think the configuration is actually the more interesting one, but it's not used in the tests right now because it might not have been available before?
We can always log both :)
Also, are there any invariants we can assert about the optima in this test and in the tests below? Specifically:
- That it is in fact the optimum
- That the config belongs to the parameter space
- That the optimum belongs to the objective space
- That the optimum is monotonically decreasing like you did in one of the tests below.
- Anything else you can think of
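A sketch of what such invariant checks might look like, assuming observations come as (config, objective) pairs and using simple stand-in membership predicates for the parameter and objective spaces (the real MLOS space API differs):

```python
def check_optimum_invariants(history, in_parameter_space, in_objective_space):
    """history: list of (config, objective) pairs in observation order."""
    objectives = [y for _, y in history]
    best_config, best_value = min(history, key=lambda obs: obs[1])

    # It is in fact the optimum (assuming minimization).
    assert best_value == min(objectives)
    # The config belongs to the parameter space.
    assert in_parameter_space(best_config)
    # The optimum belongs to the objective space.
    assert in_objective_space(best_value)
    # The running optimum is monotonically non-increasing over time.
    running, best_so_far = [], float("inf")
    for y in objectives:
        best_so_far = min(best_so_far, y)
        running.append(best_so_far)
    assert all(a >= b for a, b in zip(running, running[1:]))
    return best_config, best_value


# Hypothetical observation history and spaces, for illustration only.
history = [({"x": 3.0}, 9.0), ({"x": 1.0}, 1.0), ({"x": 2.0}, 4.0)]
check_optimum_invariants(
    history,
    in_parameter_space=lambda cfg: 0.0 <= cfg["x"] <= 10.0,
    in_objective_space=lambda y: y >= 0.0,
)
```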
I'm happy to add some of these, but I feel that's mostly unrelated to this PR, right? More testing is always good, but I'd love to push this to the students.
Had to resolve some merge conflicts, so please check again.
LGTM. Thanks!
This adds an implementation of "optimum" to the BayesianOptimizerProxy based on the public get_all_observations.

I'm not sure how @byte-sculptor feels about adding implementations to the interface class. This implementation could probably be shared by all derived classes, so this felt like a natural place.

Another concern with the current trivial implementation is that right now it adds a copy of the data in BayesianOptimizer. That could easily be addressed by adding an allow_view=False argument to get_all_observations to not make the copy unless it's required.

Obviously still missing tests.