This page contains a list of example code written with Optuna.
Simplest Codeblock
import optuna


def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    return x ** 2


if __name__ == "__main__":
    study = optuna.create_study()
    # The optimization finishes after evaluating 1000 times or 3 seconds.
    study.optimize(objective, n_trials=1000, timeout=3)
    print(f"Best params is {study.best_params} with value {study.best_value}")
Note
If you are interested in a quick start with Optuna Dashboard using in-memory storage, please take a look at this example.
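As a rough illustration of that setup, here is a minimal sketch assuming the separately installed optuna-dashboard package; the linked example may differ in its details.

import optuna
from optuna_dashboard import run_server  # requires: pip install optuna-dashboard


def objective(trial):
    x = trial.suggest_float("x", -100, 100)
    return x ** 2


# Keep all trials in in-memory storage so that no database file is needed.
storage = optuna.storages.InMemoryStorage()
study = optuna.create_study(storage=storage)
study.optimize(objective, n_trials=100)

# Serve the dashboard (by default at http://localhost:8080/) until interrupted.
run_server(storage)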
Tip
Couldn't find your use case? The FAQ might help you implement what you want. In this example repository, you can also find examples for the following scenarios (a minimal sketch combining these patterns follows this list):
- Objective function with additional arguments, which is useful when you would like to pass arguments besides trial to your objective function.
- Manually provide trials with sampler, which is useful when you would like to force certain parameters to be sampled.
- Callback to control the termination criterion of study, which is useful when you would like to define your own termination criterion other than n_trials or timeout.
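The snippet below is a minimal sketch of how these three scenarios can be combined in one study. The names offset and stop_after_good_value are illustrative, enqueue_trial is only one way to force certain parameter values, and the linked examples may take different approaches.

import optuna


def objective(trial, offset):
    # `offset` is an additional argument passed besides `trial`.
    x = trial.suggest_float("x", -100, 100)
    return (x - offset) ** 2


def stop_after_good_value(study, trial):
    # User-defined termination: stop the study once the best value is small enough.
    if study.best_value < 1e-3:
        study.stop()


study = optuna.create_study()
# Force the first trial to evaluate x = 5.0.
study.enqueue_trial({"x": 5.0})
# Pass the extra argument via a lambda and register the termination callback.
study.optimize(
    lambda trial: objective(trial, offset=5.0),
    n_trials=1000,
    callbacks=[stop_after_good_value],
)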
Here are links to the example code for the corresponding setups.
Simple Black-box Optimization
Multi-Objective Optimization
Machine Learning (Incl. LightGBMTuner and OptunaSearchCV)
- AllenNLP
- AllenNLP (Jsonnet)
- Catalyst
- CatBoost
- Chainer
- ChainerMN
- Dask-ML
- FastAI
- Haiku
- Keras
- LightGBM
- LightGBM Tuner
- PyTorch
- PyTorch Ignite
- PyTorch Lightning
- PyTorch Lightning (DDP)
- RAPIDS
- Scikit-learn
- Scikit-learn OptunaSearchCV
- Scikit-image
- SKORCH
- TensorFlow
- TensorFlow (eager)
- XGBoost
If you are looking for an example of reinforcement learning, please take a look at the following:
Pruning
The following example demonstrates how to implement pruning logic with Optuna (a minimal sketch of the same pattern also appears after the list below).
In addition, integration modules are available for the following libraries, providing simpler interfaces for pruning.
- Pruning with Catalyst Integration Module
- Pruning with CatBoost Integration Module
- Pruning with Chainer Integration Module
- Pruning with ChainerMN Integration Module
- Pruning with FastAI Integration Module
- Pruning with Keras Integration Module
- Pruning with LightGBM Integration Module
- Pruning with PyTorch Integration Module
- Pruning with PyTorch Ignite Integration Module
- Pruning with PyTorch Lightning Integration Module
- Pruning with PyTorch Lightning Integration Module (DDP)
- Pruning with TensorFlow Integration Module
- Pruning with XGBoost Integration Module
- Pruning with XGBoost Integration Module (Cross Validation Version)
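For reference, here is a minimal sketch of manual pruning with trial.report() and trial.should_prune(); the inner loop is a stand-in for a real training loop, and the linked examples are more realistic.

import optuna


def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    error = 1.0
    for step in range(100):
        error *= 1.0 - lr  # stand-in for one training epoch
        # Report the intermediate value; the pruner decides whether to stop this trial early.
        trial.report(error, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return error


study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)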
Samplers
If you are interested in implementing your own sampler, here is an example:
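As a rough sketch of the required interface: a BaseSampler subclass implements infer_relative_search_space, sample_relative, and sample_independent. The toy sampler below simply delegates independent sampling to RandomSampler; the linked example is more elaborate.

import optuna
from optuna.samplers import BaseSampler, RandomSampler


class MyIndependentSampler(BaseSampler):
    # A toy sampler that samples every parameter independently via RandomSampler.

    def __init__(self, seed=None):
        self._fallback = RandomSampler(seed=seed)

    def infer_relative_search_space(self, study, trial):
        # No joint (relative) sampling in this sketch.
        return {}

    def sample_relative(self, study, trial, search_space):
        return {}

    def sample_independent(self, study, trial, param_name, param_distribution):
        # Custom sampling logic would go here; we delegate to RandomSampler.
        return self._fallback.sample_independent(study, trial, param_name, param_distribution)


study = optuna.create_study(sampler=MyIndependentSampler(seed=0))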
Terminator
Visualization
Distributed Optimization
MLOps Platform
External Projects Using Optuna
- Hugging Face Trainer's Hyperparameter Search
- Allegro Trains
- BBO-Rietveld: Automated Crystal Structure Refinement
- Catalyst
- CuPy
- Hydra's Optuna Sweeper Plugin
- Mozilla Voice STT
- neptune.ai
- OptGBM: A scikit-learn Compatible LightGBM Estimator with Optuna
- Optuna-distributed
- PyKEEN
- RL Baselines Zoo
- Hyperparameter Optimization for Machine Learning, Code Repository for Online Course
- Property-guided molecular optimization using MolMIM with CMA-ES
Important
PRs to add additional real-world examples or projects are welcome!
Our Docker images for most examples are available with the tag ending with -dev.
For example, PyTorch Simple can be run via:
$ docker run --rm -v $(pwd):/prj -w /prj optuna/optuna:py3.11-dev python pytorch/pytorch_simple.py
Additionally, our visualization example can also be run in a Jupyter Notebook by opening localhost:8888 in your browser after executing the following:
$ docker run -p 8888:8888 --rm optuna/optuna:py3.11-dev jupyter notebook --allow-root --no-browser --port 8888 --ip 0.0.0.0 --NotebookApp.token='' --NotebookApp.password=''