A real-time dashboard for Optuna. The code base was originally taken from Goptuna.
You can install optuna-dashboard via PyPI or Anaconda Cloud.
$ pip install optuna-dashboard
You can also install the following optional dependencies to make optuna-dashboard faster.
$ pip install optuna-fast-fanova gunicorn
Then execute the optuna-dashboard command with an Optuna storage URL.
$ optuna-dashboard sqlite:///db.sqlite3
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
More command-line options:
$ optuna-dashboard -h
usage: optuna-dashboard [-h] [--port PORT] [--host HOST] [--server {wsgiref,gunicorn}] [--version] [--quiet] storage
Real-time dashboard for Optuna.
positional arguments:
storage DB URL (e.g. sqlite:///example.db)
optional arguments:
-h, --help show this help message and exit
--port PORT port number (default: 8080)
--host HOST hostname (default: 127.0.0.1)
--server {wsgiref,gunicorn}
server (default: auto)
--version, -v show program's version number and exit
--quiet, -q quiet
Python Interface
run_server(storage: Union[str, BaseStorage], host: str = 'localhost', port: int = 8080) -> None
Starts the optuna-dashboard server and blocks until it terminates. This function uses the wsgiref module, which is not intended for production use.
wsgi(storage: Union[str, BaseStorage]) -> WSGIApplication
This function exposes a WSGI interface for users who want to run the dashboard on production-class WSGI servers such as Gunicorn or uWSGI.
You can also use the official Docker image instead of setting up a Python environment. The Docker image only supports SQLite3, MySQL (PyMySQL), and PostgreSQL (Psycopg2).
SQLite3
$ docker run -it --rm -p 8080:8080 -v `pwd`:/app -w /app \
> ghcr.io/optuna/optuna-dashboard sqlite:///db.sqlite3
MySQL (PyMySQL)
$ docker run -it --rm -p 8080:8080 ghcr.io/optuna/optuna-dashboard mysql+pymysql://username:password@hostname:3306/dbname
PostgreSQL (Psycopg2)
$ docker run -it --rm -p 8080:8080 ghcr.io/optuna/optuna-dashboard postgresql+psycopg2://username:password@hostname:5432/dbname
You can create and delete studies from the dashboard.
Interactive, live-updating graphs for optimization history, parallel coordinates, intermediate values, and hyperparameter importances.
You can walk through trials by filtering and sorting them.
If you want to contribute, please check the Developers Guide.