
Dev doc fix4 #672

Merged · 7 commits · Jan 28, 2019
10 changes: 5 additions & 5 deletions docs/AnnotationSpec.md
@@ -29,7 +29,7 @@ In NNI, there are mainly four types of annotation:

**Arguments**

- - **sampling_algo**: Sampling algorithm that specifies a search space. User should replace it with a built-in NNI sampling function whose name consists of an `nni.` identification and a search space type specified in [SearchSpaceSpec](SearchSpaceSpec.md) such as `choice` or `uninform`.
+ - **sampling_algo**: Sampling algorithm that specifies a search space. Users should replace it with a built-in NNI sampling function whose name consists of the `nni.` prefix and a search space type specified in [SearchSpaceSpec](SearchSpaceSpec.md), such as `choice` or `uniform`.
- **name**: The name of the variable that the selected value will be assigned to. Note that this argument should match the left-hand side of the following assignment statement.

An example here is:
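A minimal sketch of such an annotation (the candidate values here are illustrative assumptions):

```python
# With annotation enabled, NNI samples learning_rate from the listed
# choices; run standalone, the code falls back to the default below.
'''@nni.variable(nni.choice(0.1, 0.01, 0.001), name=learning_rate)'''
learning_rate = 0.1
```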
@@ -47,8 +47,8 @@ learning_rate = 0.1

**Arguments**

- - **\*functions**: Several functions that are waiting to be selected from. Note that it should be a complete function call with aruguments. Such as `max_pool(hidden_layer, pool_size)`.
- - **name**: The name of the function that will be replace in the following assignment statement.
+ - **\*functions**: Several functions to select from. Note that each should be a complete function call with arguments, such as `max_pool(hidden_layer, pool_size)`.
+ - **name**: The name of the function that will be replaced in the following assignment statement.

An example here is:
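A minimal sketch (assuming `max_pool` and `avg_pool` are defined in the trial code):

```python
# NNI picks one of the listed calls to substitute into the assignment below.
'''@nni.function_choice(max_pool(hidden_layer, pool_size), avg_pool(hidden_layer, pool_size), name=max_pool)'''
h_pooling = max_pool(hidden_layer, pool_size)
```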

@@ -61,10 +61,10 @@ h_pooling = max_pool(hidden_layer, pool_size)

`'''@nni.report_intermediate_result(metrics)'''`

- `@nni.report_intermediate_result` is used to report itermediate result, whose usage is the same as `nni.report_intermediate_result` in [Trials.md](Trials.md)
+ `@nni.report_intermediate_result` is used to report an intermediate result; its usage is the same as `nni.report_intermediate_result` in [Trials.md](Trials.md).
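For example, inside a hypothetical training loop (`train_one_epoch` and the metric name are placeholders):

```python
for epoch in range(10):
    test_acc = train_one_epoch()  # placeholder: returns this epoch's metric
    # Report the metric to NNI after each epoch.
    '''@nni.report_intermediate_result(test_acc)'''
```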

### 4. Annotate final result

`'''@nni.report_final_result(metrics)'''`

- `@nni.report_final_result` is used to report final result of the current trial, whose usage is the same as `nni.report_final_result` in [Trials.md](Trials.md)
+ `@nni.report_final_result` is used to report the final result of the current trial; its usage is the same as `nni.report_final_result` in [Trials.md](Trials.md).
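For example, once training has finished (`test_acc` is again a placeholder metric):

```python
# Report the trial's final metric exactly once, at the end of the trial.
'''@nni.report_final_result(test_acc)'''
```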
6 changes: 3 additions & 3 deletions docs/Builtin_Assessors.md
@@ -1,6 +1,6 @@
- # Builtin Assessors
+ # Built-in Assessors

- NNI provides the-state-of-art tuning algorithm in our builtin-assessors and makes them easy to use. Below is the brief overview of NNI current builtin Assessors:
+ NNI provides state-of-the-art tuning algorithms in its built-in assessors and makes them easy to use. Below is a brief overview of NNI's current built-in Assessors:

|Assessor|Brief Introduction of Algorithm|
|---|---|
@@ -11,7 +11,7 @@ NNI provides the-state-of-art tuning algorithm in our builtin-assessors and make

## Usage of Builtin Assessors

- Use builtin assessors provided by NNI sdk requires to declare the **builtinAssessorName** and **classArgs** in `config.yml` file. In this part, we will introduce the detailed usage about the suggested scenarios, classArg requirements, and example for each assessor.
+ Using the built-in assessors provided by the NNI SDK requires declaring the **builtinAssessorName** and **classArgs** in the `config.yml` file. In this part, we introduce the suggested scenarios, classArg requirements, and an example for each assessor.

Note: Please follow the format when you write your `config.yml` file.
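As a sketch of that format, a `config.yml` fragment for the Medianstop assessor might look like this (the classArgs shown are assumptions; check each assessor's section for its actual arguments):

```yaml
assessor:
  builtinAssessorName: Medianstop
  classArgs:
    # Assumed arguments for illustration only.
    optimize_mode: maximize
    start_step: 5
```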

10 changes: 5 additions & 5 deletions docs/Builtin_Tuner.md
@@ -1,14 +1,14 @@
- # Builtin Tuners
+ # Built-in Tuners

- NNI provides the-state-of-art tuning algorithm in our builtin-tuners and makes them easy to use. Below is the brief overview of NNI current builtin Tuners:
+ NNI provides state-of-the-art tuning algorithms in its built-in tuners and makes them easy to use. Below is a brief overview of NNI's current built-in Tuners:

|Tuner|Brief Introduction of Algorithm|
|---|---|
|**TPE**<br>[(Usage)](#TPE)|The Tree-structured Parzen Estimator (TPE) is a sequential model-based optimization (SMBO) approach. SMBO methods sequentially construct models to approximate the performance of hyperparameters based on historical measurements, and then subsequently choose new hyperparameters to test based on this model.|
|**Random Search**<br>[(Usage)](#Random)|*Random Search for Hyper-Parameter Optimization* shows that Random Search might be surprisingly simple and effective. We suggest using Random Search as the baseline when there is no knowledge about the prior distribution of hyper-parameters.|
|**Anneal**<br>[(Usage)](#Anneal)|This simple annealing algorithm begins by sampling from the prior, but tends over time to sample from points closer and closer to the best ones observed. This algorithm is a simple variation on the random search that leverages smoothness in the response surface. The annealing rate is not adaptive.|
|**Naive Evolution**<br>[(Usage)](#Evolution)|Naive Evolution comes from Large-Scale Evolution of Image Classifiers. It randomly initializes a population based on the search space. For each generation, it chooses the better ones and applies some mutations (e.g., changing a hyperparameter, adding/removing one layer) to them to get the next generation. Naive Evolution requires many trials to work, but it's very simple and easy to extend with new features.|
- |**SMAC**<br>[(Usage)](#SMAC)|SMAC is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO, in order to handle categorical parameters. The SMAC supported by nni is a wrapper on the SMAC3 github repo.|
+ |**SMAC**<br>[(Usage)](#SMAC)|SMAC is based on Sequential Model-Based Optimization (SMBO). It adapts the most prominent previously used model class (Gaussian stochastic process models) and introduces the model class of random forests to SMBO, in order to handle categorical parameters. The SMAC supported by NNI is a wrapper on the SMAC3 GitHub repo.|
|**Batch tuner**<br>[(Usage)](#Batch)|Batch tuner allows users to simply provide several configurations (i.e., choices of hyper-parameters) for their trial code. After finishing all the configurations, the experiment is done. Batch tuner only supports the type choice in search space spec.|
|**Grid Search**<br>[(Usage)](#GridSearch)|Grid Search performs an exhaustive search through a manually specified subset of the hyperparameter space defined in the searchspace file. Note that the only acceptable types of search space are choice, quniform, qloguniform. The number q in quniform and qloguniform has a special meaning (different from the spec in search space spec): it is the number of values that will be sampled evenly from the range between low and high.|
|[Hyperband](https://github.com/Microsoft/nni/tree/master/src/sdk/pynni/nni/hyperband_advisor)<br>[(Usage)](#Hyperband)|Hyperband tries to use limited resources to explore as many configurations as possible and find the promising ones to get the final result. The basic idea is to generate many configurations, run them for a small number of STEPs to find the promising ones, then further train those promising ones and select several of the most promising.|
@@ -19,7 +19,7 @@ NNI provides the-state-of-art tuning algorithm in our builtin-tuners and makes t

## Usage of Builtin Tuners

- Use builtin tuner provided by NNI sdk requires to declare the **builtinTunerName** and **classArgs** in `config.yml` file. In this part, we will introduce the detailed usage about the suggested scenarios, classArg requirments and example for each tuner.
+ Using the built-in tuners provided by the NNI SDK requires declaring the **builtinTunerName** and **classArgs** in the `config.yml` file. In this part, we introduce the suggested scenarios, classArg requirements, and an example for each tuner.

Note: Please follow the format when you write your `config.yml` file.
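As a sketch of that format, a `config.yml` fragment for the TPE tuner might look like this (the classArgs shown are assumptions; see each tuner's usage section):

```yaml
tuner:
  builtinTunerName: TPE
  classArgs:
    # Assumed argument for illustration only.
    optimize_mode: maximize
```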

@@ -109,7 +109,7 @@ tuner:

**Suggested scenario**

- Its requirement of computation resource is relatively high. Specifically, it requires large inital population to avoid falling into local optimum. If your trial is short or leverages assessor, this tuner is a good choice. And, it is more suggested when your trial code supports weight transfer, that is, the trial could inherit the converged weights from its parent(s). This can greatly speed up the training progress.
+ Its computation resource requirement is relatively high. Specifically, it requires a large initial population to avoid falling into a local optimum. If your trial is short or leverages an assessor, this tuner is a good choice. It is even more suitable when your trial code supports weight transfer, that is, when the trial can inherit the converged weights from its parent(s). This can greatly speed up training.

**Requirement of classArg**

8 changes: 4 additions & 4 deletions docs/Customize_Assessor.md
@@ -2,11 +2,11 @@

## Customize Assessor

- NNI also support building a assessor by yourself to adjust your tuning demand.
+ NNI also supports building an assessor yourself to fit your tuning demands.

If you want to implement a customized Assessor, there are three things for you to do:

- 1) Inherit a assessor of a base Assessor class
+ 1) Inherit from the base Assessor class
2) Implement the assess_trial function (see the sketch after this list)
3) Write a script to run the Assessor
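
A minimal sketch of steps 1) and 2), assuming the `nni.assessor` base-class API (`Assessor`, `AssessResult`, and the `assess_trial` hook):

```python
from nni.assessor import Assessor, AssessResult

class CustomizedAssessor(Assessor):
    def assess_trial(self, trial_job_id, trial_history):
        # trial_history is the list of intermediate results the trial has
        # reported so far; return Good to continue or Bad to early-stop it.
        if len(trial_history) < 3:
            return AssessResult.Good
        # Example-only rule: stop trials whose metric has not improved.
        if trial_history[-1] <= trial_history[0]:
            return AssessResult.Bad
        return AssessResult.Good
```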

@@ -57,9 +57,9 @@ def main():
main()
```

- Please noted in **2**. The object `trial_history` are exact the object that Trial send to Assesor by using SDK `report_intermediate_result` function.
+ Please note in **2** that the object `trial_history` is exactly the object that the Trial sends to the Assessor via the SDK function `report_intermediate_result`.

- Also, user could override the `run` function in Assessor to control the process logic.
+ Also, users can override the `run` function in Assessor to control the processing logic.
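
Concretely (the values are made up): each SDK call on the trial side appends to the history that `assess_trial` later receives.

```python
import nni

# Trial side: the assessor-visible history grows with each report.
nni.report_intermediate_result(0.81)  # trial_history == [0.81]
nni.report_intermediate_result(0.86)  # trial_history == [0.81, 0.86]
```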

For more detailed examples, see:
> * [medianstop-assessor](../src/sdk/pynni/nni/medianstop_assessor)
2 changes: 1 addition & 1 deletion docs/Customize_Tuner.md
@@ -2,7 +2,7 @@

## Customize Tuner

- NNI provides the-state-of-art tuning algorithm in our builtin-tuners. But we also support building a tuner by yourself to adjust your tuning demand.
+ NNI provides state-of-the-art tuning algorithms in its built-in tuners. We also support building a tuner yourself to fit your tuning demands.

If you want to implement and use your own tuning algorithm, you can implement a customized Tuner. There are three things for you to do:
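
A minimal sketch of such a tuner, assuming the `nni.tuner.Tuner` base class and its `generate_parameters` / `receive_trial_result` / `update_search_space` hooks:

```python
import random
from nni.tuner import Tuner

class CustomizedTuner(Tuner):
    def __init__(self):
        self.search_space = {}

    def update_search_space(self, search_space):
        # Receives the parsed content of the search space file.
        self.search_space = search_space

    def generate_parameters(self, parameter_id):
        # Return one hyper-parameter configuration as a serializable dict.
        # Example-only strategy: random sampling over 'choice' entries.
        return {
            name: random.choice(spec['_value'])
            for name, spec in self.search_space.items()
            if spec.get('_type') == 'choice'
        }

    def receive_trial_result(self, parameter_id, parameters, value):
        # value is the final metric the trial reported; record it here.
        pass
```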
