
autotune

This project provides implementations and a comparison of popular hyperparameter optimization strategies. The following optimizers are currently supported:

  • Grid Search
  • Random Search
  • Genetic Algorithms
  • Gaussian Processes (adapting BayesianOptimization)
  • Tree-structured Parzen Estimators (adapting HyperOpt)

Structure

  • autotune/ - contains the optimizer implementations
  • experiments/ - contains a number of experiments on three datasets (for any experiment, first run the corresponding xy_eval.py, then xy_plot.py; see the commands below)
  • tests/ - contains basic tests that verify the optimizers' functionality against their initially recorded performance
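
For example, to reproduce an experiment, assuming the scripts are run from the repository root (replace xy with the experiment's name):

python experiments/xy_eval.py
python experiments/xy_plot.py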

Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Installing

To start developing and testing, install the requirements, preferably in a virtual environment:

virtualenv -p python3.6 .venv
source .venv/bin/activate
python -m pip install -r requirements.txt

Run the notebooks by calling

jupyter notebook

and selecting the desired notebook.

Example Code

from autotune import param_space
from autotune.optimizers import gp_search

# Define the search space: 'real_var' is sampled uniformly from [-5, 5];
# 'real_var_2' is sampled from [-5, 0] and projected through 10 ** x, i.e.
# it is effectively sampled log-uniformly from [1e-5, 1].
a = param_space.Real([-5, 5], name='real_var', n_points_to_sample=20)
b = param_space.Real([-5, 0], projection_fn=lambda x: 10 ** x, name='real_var_2', n_points_to_sample=20)

# The evaluation function receives a dict mapping parameter names to values.
def sample_eval_fn(params):
    return params['real_var'] * params['real_var_2']

optimizer = gp_search.GaussianProcessOptimizer([a, b], sample_eval_fn, n_iterations=10, random_seed=0)

# maximize() returns a list of (parameter combination, eval_fn output) pairs,
# sorted so the best result comes first.
results = optimizer.maximize()
print("Best value of {0} achieved with parameters {1}".format(results[0][1], dict(results[0][0])))

Built With

  • BayesianOptimization - Framework for Gaussian Process-based optimization
  • HyperOpt - Hyperparameter optimization framework, used for TPEs in this project
  • NumPy - Fundamental package for array computing with Python
  • SciPy - Scientific library for Python
  • Matplotlib - Python plotting package
  • Seaborn - Statistical data visualization package

Authors

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • BayesianOptimization and HyperOpt authors for their great frameworks, which made the development of autotune much easier
