Conversation
@PurityFan and @leelaylay, please also help review, thanks!
The long description of properties may not catch users' eyes at first; my suggestion is to cut it from five words to three.
docs/Overview.md
Outdated
@@ -1,49 +1,54 @@
# NNI Overview

- NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning experiments. For each experiment, user only need to define a search space and update a few lines of code, and then leverage NNI build-in algorithms and training services to search the best hyper parameters and/or neural architecture.
+ NNI (Neural Network Intelligence) is a toolkit to help users design and tune machine learning models (e.g., hyperparameters), neural network architectures, or complex system's parameters, in an efficient and automatic way. NNI has several appealing properties: easy-to-use, scalability, flexibility, extensibility, efficiency.
These words `scalability`, `flexibility`, and `extensibility` may express similar meanings, so I suggest keeping only the word `scalability`.
- NNI has several appealing properties: easy-to-use, scalability, flexibility, extensibility, efficiency.
+ NNI has several appealing features: easy-to-use, scalability and efficiency.
Reasonable, I will remove extensibility.
docs/Overview.md
Outdated
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
* **Scalability**: Tuning hyperparameters or neural architecture often demands large amount of computation resource, while NNI is designed to fully leverage different computation resources, such as remote machines, training platforms (e.g., PAI, Kubernetes). NNI could run thousands of parallel trials depends on the capacity of your configured training platforms.
* **Flexibility**: Besides rich built-in algorithms, NNI allows users to customize various hyperparameter tuning algorithms, neural architecture search algorithms, early stopping algorithms, etc.
* **Extensibility**: Users could extend NNI with more training platforms, running trials on those computation resources. For example, connecting NNI with virtual machines, kubernetes service, etc. on the cloud, such as Azure, AWS, Aliyun. Users could also connect NNI to external environments to tune the applications/models on them.
For `Easy-to-use`, I think we may improve the convenience of NNI later because it seems that specific Python and Node versions are required. Shall NNI provide a conda package later (my suggestion)?
For `Scalability`, `Flexibility`, and `Extensibility`, it seems that the `Scalability` and `Extensibility` sections convey platform scalability, while the `Flexibility` section conveys algorithm scalability.
One more thing: we could mention that a Docker environment is provided, which is an appealing feature.
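(To make the "only several lines added to your code" idea concrete, here is a minimal trial sketch; the model and hyperparameter names are hypothetical, and only the `nni.get_next_parameter` / `nni.report_final_result` calls are the NNI-specific additions.)

```python
import nni

def train_and_evaluate(params):
    # Hypothetical training routine; replace with your own model code.
    # Here we simply pretend a smaller learning rate yields higher accuracy.
    return 1.0 - params["learning_rate"]

if __name__ == "__main__":
    # The only NNI-specific lines: fetch a hyperparameter configuration
    # generated by the tuner, then report the final result back to NNI.
    params = nni.get_next_parameter()
    accuracy = train_and_evaluate(params)
    nni.report_final_result(accuracy)
```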
docs/Overview.md
Outdated
>Step 1: [Define search space](SearchSpaceSpec.md)
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
* **Scalability**: Tuning hyperparameters or neural architecture often demands large amount of computation resource, while NNI is designed to fully leverage different computation resources, such as remote machines, training platforms (e.g., PAI, Kubernetes). NNI could run thousands of parallel trials depends on the capacity of your configured training platforms.
Maybe "NNI could run thousands of parallel trials by depending on the capacity of ..."
fixed, thanks.
>Step 3: [Define Experiment](ExperimentConfig.md)
* *Experiment*: An experiment is one task of, for example, finding out the best hyperparameters of a model, finding out the best neural network architecture. It consists of trials and AutoML algorithms.
Why use italics instead of bold? I think bold is more conspicuous. (Just a suggestion.)
NNI provides a set of examples in the package to get you familiar with the above process. In the following example [/examples/trials/mnist], we had already set up the configuration and updated the training codes for you. You can directly run the following command to start an experiment.
Basically, an experiment runs as follows: Tuner receives search space and generates configurations. These configurations will be submitted to training platforms, such as local machine, remote machines, or training clusters. Their performances are reported back to Tuner. Then, new configurations are generated and submitted.
Consider adding a training platform explanation to the key concepts instead of the “such as” example?
Quick question about "SearchSpace": do we want to use it as one term or two words?
Agree with PurityFan on the Training Platform definition. Also, I was thinking Configuration also includes the yaml file's definition for the training platform.
Thanks @PurityFan and @scarlett2018, will add training platform. Let's keep 'Search Space' for now, then discuss it when the whole doc is almost ready.
Here 'Configuration' does not mean the experiment's config file. It means 'the feasible region for tuning the model, for example, the value range of each hyperparameter'. @PurityFan is this confusing?
I can understand what you mean. Need to distinguish between config.yml and configuration.
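(A small sketch of the distinction being discussed, with hypothetical hyperparameter names: the search space describes the feasible region, a configuration is one concrete point drawn from it, and the experiment's config.yml is a separate file altogether.)

```python
# Search space: the feasible region for each hyperparameter.
# (Written here as a Python dict; NNI's search space spec expresses the
# same _type/_value structure in JSON.)
search_space = {
    "learning_rate": {"_type": "choice", "_value": [0.1, 0.01, 0.001]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64]},
}

# Configuration: one concrete instance sampled from the search space,
# i.e. every hyperparameter has a specific value.
configuration = {"learning_rate": 0.01, "batch_size": 32}
```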
docs/Overview.md
Outdated
>Step 1: [Define search space](SearchSpaceSpec.md)
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
There are only several lines to be added to your code?
As the value proposition for NNI has extended to more than just model tuning (in the 1st section), you might also not want to only emphasize NNI's power with "tune your model".
"both a": remove the "a". "view and control" -> "work with".
@SparkSnail fixed, thanks.
@scarlett2018 you are right, how about directly removing 'to tune your model'?
approved with comments.
docs/Overview.md
Outdated
## Key Concepts
For each experiment, user only needs to define a search space and update a few lines of code, and then leverage NNI built-in Tuner/Assessor and training platforms to search the best hyper parameters and/or neural architecture. There are basically 3 steps:
hyperparameters
fixed, thanks.
After user submits the experiment through a command line tool [nnictl](../tools/README.md), a demon process (NNI manager) take care of search process. NNI manager continuously get search settings that generated by tuning algorithms, then NNI manager asks the training service component to dispatch and run trial jobs in a targeted training environment (e.g. local machine, remote servers and cloud). The results of trials jobs such as model accurate will send back to tuning algorithms for generating more meaningful search settings. NNI manager stops the search process after it find the best models.
* *Configuration*: A configuration is an instance from the search space, that is, each hyperparameter has a specific value.
I think config.yml is a configuration?
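(A rough, purely conceptual sketch of the experiment loop described in the quoted doc text above; this is not NNI's internal API, and 'configuration' here means a sampled hyperparameter setting, not the experiment's config.yml file.)

```python
import random

search_space = {
    "learning_rate": {"_type": "choice", "_value": [0.1, 0.01, 0.001]},
}

def generate_configuration(space):
    # Hypothetical random tuner: pick one value per hyperparameter.
    return {name: random.choice(spec["_value"]) for name, spec in space.items()}

def run_trial(configuration):
    # Stand-in for dispatching a trial job to a training platform
    # (local machine, remote machines, or a training cluster).
    return random.random()  # pretend this is the reported accuracy

best = None
for _ in range(10):
    config = generate_configuration(search_space)
    result = run_trial(config)
    # The trial's performance is reported back and used to choose
    # (here: simply remember) better configurations.
    if best is None or result > best[1]:
        best = (config, result)
print("best configuration found:", best)
```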