# Optimus

An automated machine learning tool.

## Installation

```
pip install optimus-ml
```

## Example usage

```python
from optimus_ml import ModelOptimizer
from sklearn.svm import SVC
from sklearn.datasets import load_iris
import numpy as np

# Load data
data = load_iris()
X = data.data
y = data.target

# Set up classifier
clf = SVC(probability=True, random_state=3)

# Set up parameter grid
param_grid = {
    "C": np.logspace(-10, 10, num=21, base=2).tolist(),
    "gamma": np.logspace(-10, 0, num=11, base=2).tolist(),
    "kernel": ["linear", "poly", "rbf"],
}

# Set up Model Optimizer
opt = ModelOptimizer(estimator=clf, encoded_params=param_grid, inner_cv=10, max_run_time=1500, n_iter=100)

# Fit the optimizer
opt.fit(X, y)

# Print best parameter setting and corresponding score
print(opt.best_params_, opt.best_score_)
```
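
Continuing from the example above, the selected configuration can be re-evaluated with plain scikit-learn, for instance via cross-validation. This step is not part of Optimus itself; it assumes `best_params_` holds the SVC keyword arguments from the grid above.

```python
from sklearn.model_selection import cross_val_score

# Rebuild the classifier with the best parameters found by Optimus
tuned_clf = SVC(probability=True, random_state=3, **opt.best_params_)

# Re-evaluate it with 10-fold cross-validation on the same data
scores = cross_val_score(tuned_clf, X, y, cv=10)
print(scores.mean())
```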

## Method comparison

Below is a comparison of a few of the available methods. The graph shows the average rank over time (in seconds), where a lower rank is better. Different classifiers are used to estimate the expected improvement (EI) and the running time. The expected improvement is not calculated for Randomized Search, and the running time is only calculated in the EI/s methods.
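
To illustrate the EI/s idea, here is a standalone sketch (not Optimus's internal code): a candidate's expected improvement, computed from a surrogate model's predicted mean and standard deviation, is divided by the candidate's predicted running time, so promising-but-cheap settings are preferred.

```python
# Illustrative sketch of the EI and EI/s acquisition functions.
# The surrogate's mean/std and the runtime prediction are assumed inputs.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_score):
    """EI for maximization, given the surrogate's mean/std at a candidate."""
    if sigma == 0:
        return 0.0
    z = (mu - best_score) / sigma
    return (mu - best_score) * norm.cdf(z) + sigma * norm.pdf(z)

def ei_per_second(mu, sigma, best_score, predicted_runtime):
    """EI/s: expected improvement divided by the predicted running time."""
    return expected_improvement(mu, sigma, best_score) / max(predicted_runtime, 1e-8)

# Example: candidate predicted to score 0.93 +/- 0.02, current best is 0.91,
# predicted to take 4 seconds to evaluate.
print(expected_improvement(0.93, 0.02, 0.91))
print(ei_per_second(0.93, 0.02, 0.91, predicted_runtime=4.0))
```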

Each method was executed three times on the following ten OpenML tasks: 12, 14, 16, 20, 22, 28, 32, 41, 45 and 58, using a Random Forest Classifier. For a fairer comparison of the different methods, each method is pre-seeded with the same knowledge of five hyperparameter settings.
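
A single run of such a comparison could look roughly like the sketch below. The exact `openml` calls, the Random Forest grid values, and the omission of the five pre-seeded settings are assumptions made for illustration; this is not Optimus's published benchmark code.

```python
# Rough sketch of one run over the listed OpenML tasks (illustrative only).
import openml
from sklearn.ensemble import RandomForestClassifier
from optimus_ml import ModelOptimizer

task_ids = [12, 14, 16, 20, 22, 28, 32, 41, 45, 58]

# Example Random Forest grid (values chosen for illustration)
param_grid = {
    "n_estimators": [16, 32, 64, 128, 256],
    "max_features": ["sqrt", "log2", None],
    "min_samples_leaf": [1, 2, 4, 8],
}

for task_id in task_ids:
    task = openml.tasks.get_task(task_id)  # fetch the OpenML task
    X, y = task.get_X_and_y()              # data and labels for the task

    clf = RandomForestClassifier(random_state=3)
    opt = ModelOptimizer(estimator=clf, encoded_params=param_grid,
                         inner_cv=10, max_run_time=1500, n_iter=100)
    opt.fit(X, y)
    print(task_id, opt.best_params_, opt.best_score_)
```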