
Mixed alternating optimizer #1987

Closed
Conversation

Balandat
Contributor

Summary: Adds an optimizer for settings with mixed (continuous and discrete) variables that interleaves gradient-based optimization (for fixed discrete variables) with nearest-neighbor search over the discrete variables (for fixed continuous variables). This will need more cleanup and testing.

Differential Revision: D48419691
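
(A minimal sketch of the alternating scheme described in the summary, for illustration only; the function and helper names here are hypothetical and not the actual API in this diff:)

import torch

def optimize_mixed(f, x_cont, x_disc, discrete_choices, num_rounds=5, cont_steps=50, lr=0.1):
    # f(x_cont, x_disc) -> scalar tensor to minimize; discrete_choices[i] lists
    # the allowed values for the i-th discrete variable.
    for _ in range(num_rounds):
        # Phase 1: gradient-based optimization of the continuous block,
        # with the discrete variables held fixed.
        x = x_cont.clone().requires_grad_(True)
        opt = torch.optim.Adam([x], lr=lr)
        for _ in range(cont_steps):
            opt.zero_grad()
            loss = f(x, x_disc)
            loss.backward()
            opt.step()
        x_cont = x.detach()

        # Phase 2: greedy neighbor search over the discrete block (one simple
        # instance of a nearest-neighbor search), continuous variables fixed.
        best_val = f(x_cont, x_disc).item()
        for i, choices in enumerate(discrete_choices):
            for c in choices:
                cand = x_disc.clone()
                cand[i] = c
                val = f(x_cont, cand).item()
                if val < best_val:
                    best_val, x_disc = val, cand
    return x_cont, x_disc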

@facebook-github-bot added the CLA Signed label Aug 17, 2023
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D48419691

"maxiter": options.get("maxiter_continuous", MAX_ITER_CONT),
"tol": options.get("tol", CONVERGENCE_TOL),
"batch_limit": options.get("batch_limit", MAX_BATCH_SIZE),
},
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

options={
    "maxiter": options.get("maxiter_continuous", MAX_ITER_CONT),
    "tol": options.get("tol", CONVERGENCE_TOL),
    "batch_limit": options.get("batch_limit", MAX_BATCH_SIZE),
},

These options get passed as part of the options dictionary to scipy's optimize.minimize(...) in botorch/optim/utils/timeout.py:

return optimize.minimize(
    fun=fun,
    x0=x0,
    args=args,
    method=method,
    jac=jac,
    hess=hess,
    hessp=hessp,
    bounds=bounds,
    constraints=constraints,
    tol=tol,
    callback=wrapped_callback,
    options=options,
)
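
(As an aside, the wrapped_callback above is how the timeout is enforced. A rough, illustrative sketch of that mechanism, not botorch's exact code:)

import time
import numpy as np
from scipy import optimize

class _Timeout(Exception):
    """Raised by the callback once the wall-clock budget is exhausted."""

def minimize_with_timeout(fun, x0, timeout_sec, **kwargs):
    start = time.monotonic()
    last_x = {"x": np.asarray(x0, dtype=float)}

    def wrapped_callback(xk):
        # Record the current iterate and abort if we are over budget.
        last_x["x"] = np.copy(xk)
        if time.monotonic() - start > timeout_sec:
            raise _Timeout

    try:
        return optimize.minimize(fun, x0, callback=wrapped_callback, **kwargs)
    except _Timeout:
        # Return the last iterate seen before the budget ran out.
        return optimize.OptimizeResult(
            x=last_x["x"], fun=fun(last_x["x"]), success=False, message="timeout"
        )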

Back to the point: in the call above, tol is None; it is a separate keyword argument of minimize, not an entry of options. To change the tolerance of the L-BFGS-B minimizer, "gtol" and/or "ftol" should be passed in:

options={
    "maxiter": options.get("maxiter_continuous", MAX_ITER_CONT),
    "gtol": options.get("tol", CONVERGENCE_TOL),
    "ftol": options.get("tol", CONVERGENCE_TOL),
    "batch_limit": options.get("batch_limit", MAX_BATCH_SIZE),
},
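
(A quick standalone check, separate from this PR, that L-BFGS-B picks these tolerances up from options; the toy quadratic objective is just for demonstration:)

import numpy as np
from scipy import optimize

def f(x):
    return float(np.sum(x ** 2))

# "gtol" and "ftol" are recognized L-BFGS-B options and control termination.
res = optimize.minimize(
    f,
    x0=np.array([1.0, -2.0]),
    method="L-BFGS-B",
    options={"gtol": 1e-10, "ftol": 1e-12, "maxiter": 200},
)
print(res.x, res.message)

# Putting "tol" inside options instead only triggers an OptimizeWarning
# ("Unknown solver options: tol") and has no effect on the solver.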


@Balandat closed this Mar 5, 2024
@Balandat deleted the export-D48419691 branch March 5, 2024 14:35
Labels: CLA Signed, fb-exported
3 participants