
Custom SOBO objectives #229

Merged: 8 commits merged into main on Jul 3, 2023
Conversation

@simonsung06 (Contributor) commented Jun 29, 2023

Proposal to have custom functions for calculating SOBO objective values (cf. AdditiveSoboStrategy, MultiplicativeSoboStrategy).

Functions can be dumped and loaded similarly to Empirical models.

Still worth discussing: I think weights for individual output features are not well handled at the moment. Happy for suggestions here. Maybe give the user a choice of whether the weights are multipliers or exponents? Or, more radically, the function provided by the user could take the callables and weights as arguments, so it becomes completely customisable.

@jduerholt (Contributor) left a comment

Looks very promising; I left some minor comments.

from typing import List, Union

import cloudpickle

cloudpickle is not a mandatory dependency; we have to guard the import here. Have a look at the empirical surrogate.
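A guard along the lines of the empirical surrogate might look like the following sketch. The flag name, helper name, and base64 encoding are illustrative assumptions, not bofire's actual code:

```python
# Optional-dependency guard: cloudpickle is not a mandatory dependency,
# so record its availability at import time and fail with a clear
# message only when a dump actually needs to be loaded.
try:
    import cloudpickle

    CLOUDPICKLE_AVAILABLE = True
except ImportError:
    CLOUDPICKLE_AVAILABLE = False


def loads_callable(dump: str):
    """Deserialize a cloudpickled callable from a base64 string."""
    if not CLOUDPICKLE_AVAILABLE:
        raise ImportError(
            "cloudpickle is required to load a dumped objective; "
            "install it via `pip install cloudpickle`."
        )
    import base64

    return cloudpickle.loads(base64.b64decode(dump))
```

This way, importing the strategy module never fails; only actually calling `loads` without cloudpickle installed does.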

if data_model.dump is not None:
self.loads(data_model.dump)
else:
self.f = None

Is the idea of not raising an error already here to give the user the possibility of providing it at runtime?

if self.f is None:
raise ValueError("No function has been provided for the strategy")
return GenericMCObjective(
objective=get_custom_botorch_objective(

One could think about allowing output constraints here via a flag, as for example in AdditiveSobo. What do you think?

for i, feat in enumerate(outputs.get())
if feat.objective is not None # type: ignore
]
weights = [

The question is whether we should not just ignore the weights. If you go custom, then the weights do not apply. What do you think?


Currently, the callable gets the weights and then has to decide how to handle them? I think this is even the best option, no? Then the user can decide whether he wants to use the weights or not.
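As a dependency-free sketch of that signature (plain floats stand in for the torch Tensors bofire actually passes, and all names are illustrative), the user-provided callable receives the per-output objective callables and the weights, and decides itself how to combine them:

```python
from typing import Callable, List, Optional, Sequence


# Illustrative custom objective with the discussed signature
# (samples, callables, weights, X): the weights are handed through,
# and the callable alone decides whether to use or ignore them.
def weighted_sum_objective(
    samples: Sequence[float],
    callables: List[Callable],
    weights: List[float],
    X: Optional[Sequence[float]] = None,
):
    # Here we choose a weighted sum; a user who wants to ignore the
    # weights simply does not reference them.
    values = [c(samples, X) for c in callables]
    return sum(w * v for w, v in zip(weights, values))


# One callable per output feature, e.g. selecting one value each.
objective_callables = [
    lambda s, X: s[0],
    lambda s, X: s[1],
]
result = weighted_sum_objective([1.0, 2.0], objective_callables, [0.5, 0.5])
# result == 1.5
```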

)
strategy2 = CustomSoboStrategy(data_model=data_model2)
strategy2.loads(f_str)
assert isinstance(strategy2.f, type(f))

Do we want to test here that evaluating the objective of strategy gives the same result as evaluating the loaded objective of strategy2?
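A minimal version of that check, using stdlib pickle as a stand-in for cloudpickle so the sketch stays dependency-free (CustomSoboStrategy itself is not exercised here, only the dump/load/evaluate roundtrip it relies on):

```python
import base64
import pickle  # stand-in for cloudpickle in this dependency-free sketch


def f(samples, callables, weights, X):
    # toy objective on plain nested lists instead of torch Tensors
    return [sum(row) for row in samples]


# dump/load roundtrip mirroring strategy.dumps() -> strategy2.loads(f_str)
f_str = base64.b64encode(pickle.dumps(f)).decode()
f_loaded = pickle.loads(base64.b64decode(f_str))

samples = [[1.0, 2.0], [3.0, 4.0]]
# the loaded objective should evaluate identically to the original
assert f(samples, [], [], None) == f_loaded(samples, [], [], None) == [3.0, 7.0]
```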

@simonsung06 (Contributor, Author)
Thanks for the suggestions. Responses to your comments:

  1. I have placed import cloudpickle behind a try/except.
  2. For not raising an error during the __init__ of CustomSoboStrategy: actually, that's a good suggestion. I personally prefer that, since f is a requirement for making use of the class anyway. What about something like this, with these ValueErrors:
class CustomSoboStrategy(SoboStrategy):
    def __init__(
        self,
        data_model: CustomDataModel,
        f: Optional[
            Callable[
                [
                    Tensor,
                    List[Callable[[Tensor, Optional[Tensor]], Tensor]],
                    List[float],
                    Tensor,
                ],
                Tensor,
            ]
        ] = None,
        **kwargs,
    ):
        super().__init__(data_model=data_model, **kwargs)
        self.use_output_constraints = data_model.use_output_constraints
        if f is None and data_model.dump is None:
            raise ValueError('A function in `f` or cloudpickled function string in `data_model.dump` must be provided')
        
        if f is not None and data_model.dump is not None:
            raise ValueError('Both `f` and `data_model.dump` were provided. You may only use either of these')
        
        if data_model.dump is not None:
            self.loads(data_model.dump)
        else:
            self.f = f
  3. I have added output constraints to the strategy. As it is now, the user just has to be aware of their relative positions when adding them to the domain, so that they can account for the missing objectives in their provided function.
  4. My preference is that the weights are retained, just in case the user happens to want them, even if they may seem redundant since a custom function is provided anyway.
  5. The assertion has been updated to compare the objective outputs too.

@jduerholt (Contributor)
> For not raising an error during the __init__ of CustomSoboStrategy: actually, that's a good suggestion. [...]
From my side, this is not needed; my comment was more an explanation to myself than a suggestion, because you can already set up the data model with f=None and instantiate the strategy with

strategy = CustomSoboStrategy(data_model=data_model)
strategy.f = func

So no change needed.

@jduerholt (Contributor) left a comment

LGTM!

@jduerholt jduerholt merged commit b57d6bc into main Jul 3, 2023
@jduerholt jduerholt deleted the custom_sobo_objetives branch July 3, 2023 10:10