
[tuner] Regularized Evolution #2802

Merged: 37 commits, merged on Oct 10, 2020. Changes shown from 7 commits.

Commits (37)
afcd8b1  evo nas tuner (tabVersion, Aug 19, 2020)
a779025  code format (tabVersion, Aug 19, 2020)
60fb528  add doc (tabVersion, Aug 22, 2020)
74de3e7  format fix (tabVersion, Aug 22, 2020)
6232ee2  add comments (tabVersion, Aug 23, 2020)
17f3f3f  deal with None in InputChoice (tabVersion, Aug 23, 2020)
179015b  Update evo_nas_tuner.py (tabVersion, Sep 3, 2020)
3147811  Update evo_nas_tuner.py (tabVersion, Sep 3, 2020)
d1a5f3e  move evo nas tuner to NAS folder (tabVersion, Sep 3, 2020)
03b6ece  fix (tabVersion, Sep 3, 2020)
071ba3b  add to builtin (tabVersion, Sep 4, 2020)
275e4f2  add UT (tabVersion, Sep 9, 2020)
1626643  update it (tabVersion, Sep 10, 2020)
b838829  revert test_builtin_tuners (tabVersion, Sep 22, 2020)
11f375e  add ut (tabVersion, Sep 22, 2020)
ea92272  add it trigger (tabVersion, Sep 22, 2020)
32689a4  Merge branch 'master' into evo-nas-tuner (tabVersion, Sep 22, 2020)
24cfd1e  add warning message (tabVersion, Sep 22, 2020)
5cf87e7  fix it (tabVersion, Sep 22, 2020)
1db6f16  fix typo (tabVersion, Sep 22, 2020)
d660d55  rename to RegularizedEvolutionTuner (tabVersion, Sep 23, 2020)
fb3edd9  it update (tabVersion, Sep 23, 2020)
735d488  comments update (tabVersion, Sep 23, 2020)
6b6655a  adjust trial num (tabVersion, Sep 24, 2020)
34d8524  typo fix (tabVersion, Sep 25, 2020)
0ebb4a7  Merge branch 'evo-nas-tuner' of https://github.com/tabVersion/nni int… (tabVersion, Sep 25, 2020)
0045b21  Delete __init__.pye (tabVersion, Sep 25, 2020)
59502a9  extract logic of mutation (tabVersion, Sep 25, 2020)
1f04d3f  reduce it time cost (tabVersion, Sep 25, 2020)
1218d21  remove trailing space (tabVersion, Sep 25, 2020)
fff90d4  change tuner name (tabVersion, Sep 27, 2020)
9849872  remove regularized nas tuner from data import (tabVersion, Sep 28, 2020)
d2241b4  handle failure (tabVersion, Sep 28, 2020)
651ac8a  Merge branch 'evo-nas-tuner' of https://github.com/tabVersion/nni int… (tabVersion, Sep 28, 2020)
57d4660  directory change (Oct 9, 2020)
4e816fc  rename _from to _from_initial (Oct 9, 2020)
c508fae  refactor ut (Oct 9, 2020)
11 changes: 11 additions & 0 deletions docs/en_US/Tuner/EvolutionNasTuner.md
@@ -0,0 +1,11 @@
# Evolution NAS Tuner on NNI

## EvoNasTuner

This is a tuner geared for NNI's Neural Architecture Search (NAS) interface. It uses the [regularized evolution algorithm](https://arxiv.org/pdf/1802.01548.pdf).

The tuner first randomly initializes `population_size` models and evaluates them. After that, each time it produces a new architecture, the tuner randomly samples `sample_size` architectures from the `population`, then mutates the best model in the sample, the parent model, to produce a child model. A mutation is either a hidden state mutation or an op mutation. The hidden state mutation replaces a hidden state with another hidden state from within the cell, subject to the constraint that no loops are formed. The op mutation replaces one op with another op from the op set. Note that keeping the child model identical to its parent is not allowed. After the child model is evaluated, it is appended to the tail of the `population` and the oldest model is popped from the front.

The whole procedure is summarized by the pseudocode below.

![](../../img/EvoNasTuner.png)
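For readers who want something runnable, here is a minimal, self-contained sketch of this aging-evolution loop. It is illustrative only and not part of NNI's API; `random_architecture`, `mutate`, and `train_and_eval` are placeholder callables you would supply:

```python
import collections
import random

def regularized_evolution(random_architecture, mutate, train_and_eval,
                          population_size=100, sample_size=25, cycles=1000):
    """Aging evolution: append the newest model, pop the oldest."""
    population = collections.deque()
    history = []

    # Phase 1: fill the population with randomly generated, evaluated models.
    while len(population) < population_size:
        arch = random_architecture()
        population.append((arch, train_and_eval(arch)))
        history.append(population[-1])

    # Phase 2: sample, mutate the best of the sample, and age out the oldest.
    while len(history) < cycles:
        sample = [random.choice(list(population)) for _ in range(sample_size)]
        parent = max(sample, key=lambda pair: pair[1])
        child = mutate(parent[0])
        population.append((child, train_and_eval(child)))
        history.append(population[-1])
        population.popleft()  # discard the oldest model, regardless of its metric

    return max(history, key=lambda pair: pair[1])
```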
1 change: 1 addition & 0 deletions docs/en_US/builtin_tuner.rst
@@ -14,6 +14,7 @@ Tuner receives metrics from `Trial` to evaluate the performance of a specific pa
Random Search <Tuner/HyperoptTuner>
Anneal <Tuner/HyperoptTuner>
Naive Evolution <Tuner/EvolutionTuner>
Evolution NAS Tuner <Tuner/EvolutionNasTuner>
SMAC <Tuner/SmacTuner>
Metis Tuner <Tuner/MetisTuner>
Batch Tuner <Tuner/BatchTuner>
Binary file added docs/img/EvoNasTuner.png
1 change: 1 addition & 0 deletions src/sdk/pynni/nni/evo_nas_tuner/__init__.py
@@ -0,0 +1 @@
from .evo_nas_tuner import EvoNasTuner
167 changes: 167 additions & 0 deletions src/sdk/pynni/nni/evo_nas_tuner/evo_nas_tuner.py
@@ -0,0 +1,167 @@
import copy
import logging
import random
from collections import deque

from schema import Schema, Optional

import nni
from nni.tuner import Tuner
from nni import ClassArgsValidator
from nni.utils import OptimizeMode, extract_scalar_reward

logger = logging.getLogger(__name__)


class FinishedIndividual:
    def __init__(self, parameter_id, parameters, result):
        """
        Parameters
        ----------
        parameter_id: int
            the index of the parameter
        parameters : dict
            chosen architecture and parameters
        result : float
            final metric of the chosen one
        """
        self.parameter_id = parameter_id
        self.parameters = parameters
        self.result = result


class EvolutionClassArgsValidator(ClassArgsValidator):
    def validate_class_args(self, **kwargs):
        Schema({
            'optimize_mode': self.choices('optimize_mode', 'maximize', 'minimize'),
            Optional('population_size'): self.range('population_size', int, 0, 99999),
            Optional('sample_size'): self.range('sample_size', int, 0, 9999),
        }).validate(kwargs)
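# Illustrative use of the validator with hypothetical values (not from this PR):
#   EvolutionClassArgsValidator().validate_class_args(
#       optimize_mode='maximize', population_size=100, sample_size=25)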


class EvoNasTuner(Tuner):
    """
    EvoNasTuner is a tuner for NNI's NAS interface based on regularized evolution.
    See ``Regularized Evolution for Image Classifier Architecture Search`` for details.

    Parameters
    ----------
    optimize_mode: str
        whether to maximize or minimize the metric. default: 'maximize'
    population_size: int
        the maximum number of models kept in the population
    sample_size: int
        the number of models sampled from the population in each evolution step
    """
    def __init__(self, optimize_mode="maximize", population_size=100, sample_size=25):
        super(EvoNasTuner, self).__init__()
        self.optimize_mode = OptimizeMode(optimize_mode)
        self.population_size = population_size
        self.sample_size = sample_size
        self.initial_population = deque()
        self.population = deque()
        self.history = {}
        self.search_space = None

    def generate_parameters(self, parameter_id, **kwargs):
        """
        Returns a dict of trial (hyper-)parameters, as a serializable object.

        Parameters
        ----------
        parameter_id: int
            the index of the current set of parameters
        """
        if self.initial_population:
            arch = self.initial_population.popleft()
            self.history[parameter_id] = arch
            return arch
        elif self.population:
            # Sample from the population (with replacement), then mutate the best.
            sample = []
            while len(sample) < self.sample_size:
                sample.append(random.choice(list(self.population)))

            candidate = max(sample, key=lambda x: x.result)
            arch = self._mutate_model(candidate)
            self.history[parameter_id] = arch
            return arch
        else:
            raise nni.NoMoreTrialError

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        """
        Record the result from a trial.

        Parameters
        ----------
        parameter_id : int
        parameters : dict
        value : dict/float
            the final metric of the trial; if value is a dict,
            it should have a "default" key.
        """
        reward = extract_scalar_reward(value)
        if parameter_id not in self.history:
            raise RuntimeError('Received parameter_id %s not in history.' % parameter_id)
        params = self.history[parameter_id]

        if self.optimize_mode == OptimizeMode.Minimize:
            reward = -reward

        self.population.append(FinishedIndividual(parameter_id, params, reward))
        if len(self.population) > self.population_size:
            # Aging evolution: discard the oldest individual, not the worst.
            self.population.popleft()

    def update_search_space(self, search_space):
        """
        Update the search space.
        ``search_space`` contains the information that the user pre-defined.

        Parameters
        ----------
        search_space : dict
        """
        logger.info('update search space %s', search_space)
        assert self.search_space is None
        self.search_space = search_space

        for _, val in search_space.items():
            if val['_type'] != 'layer_choice' and val['_type'] != 'input_choice':
                raise ValueError('Unsupported search space type: %s' % (val['_type']))

        self._generate_initial_population()

    def _random_model(self):
        individual = {}
        for key, val in self.search_space.items():
            if val['_type'] == 'layer_choice':
                idx = random.randint(0, len(val['_value']) - 1)
                individual[key] = {'_value': val['_value'][idx], '_idx': idx}
            elif val['_type'] == 'input_choice':
                candidates = val['_value']['candidates']
                n_chosen = val['_value']['n_chosen']
                if n_chosen is None:
                    raise RuntimeError('Key n_chosen must be set in InputChoice.')
                idxs = [random.randint(0, len(candidates) - 1) for _ in range(n_chosen)]
Contributor (review comment): This could generate duplicate models.

Contributor Author (reply): What if all models are evaluated? Are random models generated, or is a NO_MORE_TRIAL exception thrown?
                vals = [candidates[k] for k in idxs]
                individual[key] = {'_value': vals, '_idx': idxs}
        return individual

    def _mutate_model(self, model):
        new_individual = copy.deepcopy(model.parameters)
        # random.choice requires a sequence, so materialize the keys view first.
        mutate_key = random.choice(list(new_individual.keys()))
        mutate_val = self.search_space[mutate_key]
        if mutate_val['_type'] == 'layer_choice':
            idx = random.randint(0, len(mutate_val['_value']) - 1)
            new_individual[mutate_key] = {'_value': mutate_val['_value'][idx], '_idx': idx}
        elif mutate_val['_type'] == 'input_choice':
            candidates = mutate_val['_value']['candidates']
            n_chosen = mutate_val['_value']['n_chosen']
            idxs = [random.randint(0, len(candidates) - 1) for _ in range(n_chosen)]
            vals = [candidates[k] for k in idxs]
            new_individual[mutate_key] = {'_value': vals, '_idx': idxs}
        else:
            raise KeyError('Unsupported search space type: %s' % mutate_val['_type'])
        return new_individual

    def _generate_initial_population(self):
        # Check initial_population, not population; the latter is still empty here,
        # so checking it would loop forever.
        while len(self.initial_population) < self.population_size:
            self.initial_population.append(self._random_model())
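By way of illustration, the snippet below drives the tuner's public methods by hand with a toy two-key search space. In a real experiment NNI's dispatcher makes these calls; the search-space values and the random metric are invented stand-ins, not part of this PR:

```python
import random

from nni.evo_nas_tuner import EvoNasTuner

tuner = EvoNasTuner(optimize_mode='maximize', population_size=4, sample_size=2)
tuner.update_search_space({
    'conv1': {'_type': 'layer_choice', '_value': ['conv3x3', 'conv5x5']},
    'skip': {'_type': 'input_choice',
             '_value': {'candidates': ['layer1', 'layer2'], 'n_chosen': 1}},
})  # also seeds the initial random population

for pid in range(8):
    params = tuner.generate_parameters(pid)  # pops initial models first, then mutates
    metric = random.random()                 # stand-in for a trial's final metric
    tuner.receive_trial_result(pid, params, metric)
```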