Restarting optimization #7
This clearly needs better documentation; thanks for raising the issue. To restart the optimization one does not need to define a new optimization object, i.e.

will run Bayesian optimization three times for
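A minimal sketch of this restart pattern, assuming the `BOpt` constructor and `boptimize!` roughly as shown in the BayesianOptimization.jl README (exact arguments may differ between versions):

```julia
using BayesianOptimization, GaussianProcesses

# Sketch only: setup loosely follows the README; argument names are assumptions.
f(x) = sum((x .- 1).^2)                                # toy objective
model = ElasticGPE(2, mean = MeanConst(0.),
                   kernel = SEArd([0., 0.], 5.), logNoise = 0.)
opt = BOpt(f, model, ExpectedImprovement(),
           MAPGPOptimizer(every = 10),
           [-5., -5.], [5., 5.],
           maxiterations = 100, sense = Min)

boptimize!(opt)   # first run
boptimize!(opt)   # continues from the existing state of `opt`
boptimize!(opt)   # same object, no reconstruction needed
```

The point being: the state lives in `opt`, so calling `boptimize!` again resumes rather than restarts from scratch.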
Makes sense, but then that raises the question: what if we already have a
Good point! I'm pretty busy at the moment, but I intend to have a look at this in June. Locally I also experimented with replacing Latin Hypercube Sampling with Sobol sequences, so I may address these issues together.
That sounds good. I frequently need to perform optimization on

Have you considered using https://github.com/MrUrq/LatinHypercubeSampling.jl? The sampling points are optimized under a distance metric; it would be interesting to compare against the current implementation.
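For reference, a sketch of generating an optimized plan with that package, assuming the `LHCoptim` and `scaleLHC` functions as described in its README:

```julia
using LatinHypercubeSampling

# Sketch; assumes LHCoptim(n, d, gens) and scaleLHC as in the package README.
plan, fitness = LHCoptim(20, 2, 1000)   # 20 points in 2 dims, 1000 GA generations
scaled = scaleLHC(plan, [(-5.0, 5.0), (-5.0, 5.0)])   # map onto the search box
```

Comparing the distance criterion of such a plan against the current initializer's points would make the trade-off concrete.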
If I want to restart the optimization, I might do something like (assume the following has been set up with the code in the README):
SETUP
RESTART OPTIMIZATION
This currently gives the following error:
If I set `lhs_iterations=1` and run for a few cycles, it seems that the optimization forgets about the previous optima: it reports `observed_optimum = -0.8571058809745942`, compared to the pre-restart optimum of `observed_optimum = -2.720197115023963`, whose `observed_optimizer` is still in the model.

Which leads to the following: is there a good way to restart optimizations currently (maybe I missed something in the source), and, if this is the best way to do so, can the above issues be fixed?