bboptpy is a library of algorithms for the optimization of black-box functions.
Main advantages:
- a single unified interface for Python with a user-friendly API
- faithful reproductions of classical and modern baselines (many of which are not publicly available elsewhere), with state-of-the-art improvements
- transparent and reproducible implementations that are easy to build upon
The full documentation, including the list of supported algorithms and functions, can be found here.
This package can now be installed directly via pip!

```
pip install bboptpy
```
The following algorithms are currently fully supported with Python wrappers (all sharing the unified interface sketched after this list):
- Univariate:
  - Brent
- Multivariate:
  - Unconstrained:
    - Adaptive Coordinate Descent (ACD)
    - AMaLGaM IDEA
    - Basin Hopping
    - Controlled Random Search (CRS)
    - Covariance Matrix Adaptation Evolution Strategy (CMA-ES)
    - Differential Evolution (DE)
    - Differential Search (DSA)
    - Exponential Natural Evolution Strategy (xNES)
    - LIPO Search with Max Heuristic and Local Search (MAX-LIPO-TR)
    - Novel Self-Adaptive Harmony Search (NSHS)
    - Hessian Evolutionary Strategy (HE-ES)
    - Self-Adaptive Multi-Population JAYA
    - Adaptive Nelder-Mead Method
    - Particle Swarm Optimization (PSO)
    - Powell's Methods
    - PRAXIS
    - Rosenbrock Method
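Every wrapper exposes the same `optimize` entry point used in the examples below, and the objective is an arbitrary Python callable. Here is a minimal sketch, assuming (as the plain-function examples below suggest) that any callable object is accepted; the `CountingObjective` class is illustrative and not part of bboptpy:

```python
import numpy as np
from bboptpy import ActiveCMAES

class CountingObjective:
    """Wraps a black-box objective and counts how often it is queried."""
    def __init__(self):
        self.calls = 0
    def __call__(self, x):
        self.calls += 1
        x = np.asarray(x)
        return float(np.sum(x ** 2))  # sphere function as a stand-in black box

f = CountingObjective()
n = 5
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
sol = alg.optimize(f, lower=-10 * np.ones(n), upper=10 * np.ones(n),
                   guess=np.random.uniform(-10, 10, size=n))
print(f.calls)  # number of objective evaluations used by the optimizer
```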
A simple example of optimizing a univariate function:
```python
import numpy as np
from bboptpy import Brent

# function to optimize
def fx(x):
    return np.sin(x) + np.sin(10 * x / 3)

alg = Brent(mfev=20000, atol=1e-6)
sol = alg.optimize(fx, lower=2.7, upper=7.5, guess=np.random.uniform(2.7, 7.5))
print(sol)
```
This will print the following output:
```
x*: 5.1457349293974861
calls to f: 10
converged: 1
```
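As an independent sanity check (SciPy is not a dependency of bboptpy; this snippet only confirms the result above), the same minimum can be reproduced with SciPy's bounded scalar minimizer:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fx(x):
    return np.sin(x) + np.sin(10 * x / 3)

res = minimize_scalar(fx, bounds=(2.7, 7.5), method='bounded')
print(res.x)  # ~5.14574, matching the bboptpy solution above
```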
A simple example of optimizing a multivariate function:
```python
import numpy as np
from bboptpy import ActiveCMAES

# function to optimize
def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2)
               for x1, x2 in zip(x[:-1], x[1:]))

n = 10  # dimension of problem
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
sol = alg.optimize(fx, lower=-10 * np.ones(n), upper=10 * np.ones(n),
                   guess=np.random.uniform(-10, 10, size=n))
print(sol)
```
This will print the following output:
```
x*: 0.999989 0.999999 1.000001 1.000007 1.000020 1.000029 1.000102 1.000183 1.000357 1.000689
objective calls: 6980
constraint calls: 0
B/B constraint calls: 0
converged: yes
```
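For reference, the objective above is the chained Rosenbrock function, whose global minimum is f = 0 at x = (1, ..., 1). Plugging the printed solution back into fx is a quick sanity check (the x* values below are copied from the output above):

```python
import numpy as np

def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2)
               for x1, x2 in zip(x[:-1], x[1:]))

x_star = np.array([0.999989, 0.999999, 1.000001, 1.000007, 1.000020,
                   1.000029, 1.000102, 1.000183, 1.000357, 1.000689])
print(fx(x_star))  # a small positive value, close to the optimal 0
```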
The following example illustrates how to run bboptpy optimizers incrementally, returning control to the Python interpreter between iterations:
```python
import numpy as np
from bboptpy import ActiveCMAES

# function to optimize
def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2)
               for x1, x2 in zip(x[:-1], x[1:]))

n = 10  # dimension of problem
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
alg.initialize(f=fx, lower=-10 * np.ones(n), upper=10 * np.ones(n),
               guess=np.random.uniform(-10, 10, size=n))
while True:
    alg.iterate()
    print(alg.solution())
```
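Note that the loop above runs indefinitely; in practice you would wrap `iterate` in your own stopping rule. Below is a minimal sketch that caps the iteration count and logs progress periodically, using only the `initialize`, `iterate`, and `solution` calls shown above (the budget and logging interval are illustrative, not bboptpy parameters):

```python
import numpy as np
from bboptpy import ActiveCMAES

def fx(x):
    return sum((100 * (x2 - x1 ** 2) ** 2 + (1 - x1) ** 2)
               for x1, x2 in zip(x[:-1], x[1:]))

n = 10
alg = ActiveCMAES(mfev=10000, tol=1e-4, np=20)
alg.initialize(f=fx, lower=-10 * np.ones(n), upper=10 * np.ones(n),
               guess=np.random.uniform(-10, 10, size=n))

max_iters = 500  # illustrative iteration budget
for it in range(max_iters):
    alg.iterate()
    if it % 50 == 0:
        print(alg.solution())  # inspect the current best between iterations
```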
To cite this repository, either use the link in the sidebar or the following BibTeX entry:
```bibtex
@software{gimelfarb2024bboptpy,
  author = {Gimelfarb, Michael},
  license = {LGPL-2.1+},
  title = {{bboptpy}},
  url = {https://github.com/mike-gimelfarb/bboptpy},
  year = {2024}
}
```
Please also consider citing the original authors of the algorithms you use: their papers are linked in the supported algorithms section above, and full references are given in the comment headers of the respective C++ source files.