
Reordering of parameters within simulate_scenarios? #454

Closed

AVHopp opened this issue Dec 19, 2024 · 3 comments
Assignees
Labels
bug Something isn't working

Comments

@AVHopp
Collaborator

AVHopp commented Dec 19, 2024

Based on an observation made in #439, I designed a small example (very quickly hand-drafted, so not very clean) demonstrating the error when using continuous variables.

In the following script, the manual loop works as expected, but the subsequent `simulate_scenarios` call breaks, claiming that the constraints are not fulfilled.

```python
from gradio_client import Client

from baybe import Campaign
from baybe.constraints import ContinuousLinearConstraint
from baybe.objectives import SingleTargetObjective
from baybe.parameters import CategoricalParameter, NumericalContinuousParameter
from baybe.searchspace import SearchSpace
from baybe.simulation import simulate_scenarios
from baybe.targets import NumericalTarget

client = Client("AccelerationConsortium/crabnet-hyperparameter")

TOLERANCE = 0.01


def adv_opt(c1, c2, c3, x6, x15, x19, x20):
    result = client.predict(
        0.1,
        0.1,
        0.1,
        0.1,
        0.1,
        x6,
        0.1,
        0.1,
        0.1,
        0.1,
        0.1,
        0.1,
        0.1,
        0.1,
        x15,
        0.1,
        0.1,
        0.1,
        x19,
        x20,
        c1,
        c2,
        c3,
        0.5,
        api_name="/predict",
    )
    return result["data"][0][0]  # return y1 value only


parameters = [
    NumericalContinuousParameter(name="x6", bounds=(0.0, 1.0)),
    NumericalContinuousParameter(name="x15", bounds=(0.0, 1.0)),
    NumericalContinuousParameter(name="x19", bounds=(0.0, 1.0)),
    NumericalContinuousParameter(name="x20", bounds=(0.0, 1.0)),
    CategoricalParameter(name="c1", values=["c1_0", "c1_1"], encoding="INT"),
    CategoricalParameter(name="c2", values=["c2_0", "c2_1"], encoding="INT"),
    CategoricalParameter(name="c3", values=["c3_0", "c3_1", "c3_2"], encoding="INT"),
]

constraints = [
    ContinuousLinearConstraint(
        parameters=["x19", "x20"],
        coefficients=[1.0, -1.0],
        rhs=-TOLERANCE,
        operator="<=",
    ),
    ContinuousLinearConstraint(
        parameters=["x6", "x15"],
        coefficients=[1.0, 1.0],
        rhs=1 - TOLERANCE,
        operator="<=",
    ),
]

searchspace = SearchSpace.from_product(parameters=parameters, constraints=constraints)
objective = SingleTargetObjective(target=NumericalTarget(name="Target", mode="MIN"))

campaign = Campaign(searchspace=searchspace, objective=objective)
for _ in range(3):
    rec = campaign.recommend(batch_size=1)
    assert rec["x19"].item() <= rec["x20"].item()
    assert rec["x6"].item() + rec["x15"].item() <= 1
    res = adv_opt(
        rec["c1"].item(),
        rec["c2"].item(),
        rec["c3"].item(),
        rec["x6"].item(),
        rec["x15"].item(),
        rec["x19"].item(),
        rec["x20"].item(),
    )
    rec["Target"] = res
    campaign.add_measurements(rec)

print("Manual loop successful.")
print(campaign.measurements)

df_result = simulate_scenarios(
    {"Default Recommender": Campaign(searchspace=searchspace, objective=objective)},
    adv_opt,
    batch_size=1,
    n_doe_iterations=3,
    n_mc_iterations=2,
)
```

However, when the parameter x6 is renamed to x06, both loops work as intended. This seems to be related to some sort of reordering, since the printed measurements below show that the column order (x15, x19, x20, x6) does not correspond to the order of the declared parameters.

```
Manual loop successful.
     c1    c2    c3       x15       x19       x20        x6    Target  BatchNr  FitNr
0  c1_0  c2_1  c3_0  0.017021  0.402676  0.698488  0.601754  0.519219        1    1.0
1  c1_1  c2_0  c3_2  0.603483  0.320710  0.580148  0.177987  0.592556        2    2.0
2  c1_0  c2_1  c3_0  0.005322  0.506055  0.671234  0.557078  0.565181        3    NaN
```
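For what it's worth, the column order above is consistent with plain lexicographic sorting of the parameter names, which would also explain why zero-padding the name ("x6" → "x06") makes the problem disappear. A minimal stdlib illustration of that hypothesis (not a confirmed mechanism):

```python
# Lexicographic sorting reproduces the column order seen in the
# printed measurements, and explains why renaming "x6" to "x06" helps.
declared = ["x6", "x15", "x19", "x20"]
renamed = ["x06", "x15", "x19", "x20"]

print(sorted(declared))  # "x6" sorts after "x15", "x19", "x20"
print(sorted(renamed))   # zero-padding restores the declared order
```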
@AdrianSosic
Collaborator

Tagging @ritalyu17 here for transparency.

@Scienfitz
Collaborator

Could the culprit be the deepcopy done inside the simulation utility?

We should confirm the actual order of parameters after creation in (i) the manual campaign and (ii) one of the copied campaigns inside the simulation.
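As a quick sanity check on the deepcopy hypothesis: a deepcopy of a plain Python list preserves element order, so any reordering would have to come from the campaign/searchspace construction itself rather than from the copy. This is only a stdlib-level check, not a test of the actual baybe objects:

```python
from copy import deepcopy

# Parameter names in their declared order, as in the reproduction script.
param_names = ["x6", "x15", "x19", "x20", "c1", "c2", "c3"]
copied = deepcopy(param_names)
assert copied == param_names  # deepcopy alone preserves list order
```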

@Scienfitz Scienfitz added the bug Something isn't working label Dec 19, 2024
@AdrianSosic
Collaborator

I know what the problem is and have verified it by testing. Nothing is wrong with the simulation module and nothing is wrong with the constraints – the lookup is to blame. It's exactly the bad design that was just changed in #441: the parameters are reordered internally, but the lookup is declared with positional arguments and with the original ordering in mind.

Now that #441 is merged, the problem will automatically disappear once the lookup mechanism of the benchmark gets updated, which has to happen during the rebase / backward merge. Hence, I'll mark this issue as resolved, since it is essentially fixed on main.
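One general way to make a lookup robust against such internal reordering is to drive it by column names instead of positional arguments. A minimal sketch, assuming a DataFrame-in/DataFrame-out lookup interface; the helper `make_named_lookup` is hypothetical and not part of baybe:

```python
import pandas as pd


def make_named_lookup(func, param_names, target_name):
    """Wrap a positional function so it is driven by column *names*,
    making it immune to any internal reordering of columns."""

    def lookup(df: pd.DataFrame) -> pd.DataFrame:
        out = df.copy()
        # Pass values as keyword arguments keyed by parameter name,
        # so the column order in df is irrelevant.
        out[target_name] = [
            func(**{name: row[name] for name in param_names})
            for _, row in df.iterrows()
        ]
        return out

    return lookup
```

With this pattern, even if the searchspace sorts its parameters internally, each value is still matched to the correct argument by name.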

@AdrianSosic AdrianSosic self-assigned this Dec 20, 2024