This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

update doc: overview #555

Merged: 9 commits into microsoft:dev-doc on Jan 21, 2019

Conversation

QuanluZhang (Contributor)

No description provided.

@QuanluZhang (Contributor, Author)

@PurityFan and @leelaylay, please also help review, thanks!

@leelaylay (Contributor) left a comment:

The long list of properties may not catch users' eyes at first, and my suggestion is to cut it from five words to three.

docs/Overview.md Outdated
@@ -1,49 +1,54 @@
# NNI Overview

- NNI (Neural Network Intelligence) is a toolkit to help users run automated machine learning experiments. For each experiment, user only need to define a search space and update a few lines of code, and then leverage NNI build-in algorithms and training services to search the best hyper parameters and/or neural architecture.
+ NNI (Neural Network Intelligence) is a toolkit to help users design and tune machine learning models (e.g., hyperparameters), neural network architectures, or complex system's parameters, in an efficient and automatic way. NNI has several appealing properties: easy-to-use, scalability, flexibility, extensibility, efficiency.
Contributor:

The words scalability, flexibility, and extensibility may express similar meanings, so I suggest keeping just the word scalability.

- NNI has several appealing properties: easy-to-use, scalability, flexibility, extensibility, efficiency.
+ NNI has several appealing features: easy-to-use, scalability and efficiency.

Contributor (Author):

Reasonable, I will remove extensibility.

docs/Overview.md Outdated
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
* **Scalability**: Tuning hyperparameters or neural architecture often demands large amount of computation resource, while NNI is designed to fully leverage different computation resources, such as remote machines, training platforms (e.g., PAI, Kubernetes). NNI could run thousands of parallel trials depends on the capacity of your configured training platforms.
* **Flexibility**: Besides rich built-in algorithms, NNI allows users to customize various hyperparameter tuning algorithms, neural architecture search algorithms, early stopping algorithms, etc.
* **Extensibility**: Users could extend NNI with more training platforms, running trials on those computation resources. For example, connecting NNI with virtual machines, kubernetes service, etc. on the cloud, such as Azure, AWS, Aliyun. Users could also connect NNI to external environments to tune the applications/models on them.
Contributor:

For Easy-to-use, I think we may improve the convenience of NNI later, because it seems that specific Python and Node versions are required. Shall NNI provide a conda package later (my suggestion)?
For Scalability, Flexibility, and Extensibility, it seems that the Scalability and Extensibility sections convey platform scalability, while the Flexibility section conveys algorithm scalability.
One more thing: we could mention that a Docker environment is provided, which is an appealing feature.
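
For context on the "only several lines" claim in the Easy-to-use bullet quoted above: a minimal sketch of a trial script, assuming the standard `nni` Python API installed via `pip install nni`. The training function and the search-space key are hypothetical.

```python
import nni


def train_and_evaluate(learning_rate):
    # Hypothetical placeholder for the user's own training and evaluation code;
    # it should return the metric to optimize (e.g., validation accuracy).
    return 0.0


if __name__ == "__main__":
    # Ask the tuner for the next hyperparameter configuration.
    params = nni.get_next_parameter()
    # "learning_rate" is a made-up key; it must match the search space file.
    accuracy = train_and_evaluate(params["learning_rate"])
    # Report the final metric back so the tuner can propose better configurations.
    nni.report_final_result(accuracy)
```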

docs/Overview.md Outdated

>Step 1: [Define search space](SearchSpaceSpec.md)
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
* **Scalability**: Tuning hyperparameters or neural architecture often demands large amount of computation resource, while NNI is designed to fully leverage different computation resources, such as remote machines, training platforms (e.g., PAI, Kubernetes). NNI could run thousands of parallel trials depends on the capacity of your configured training platforms.
Contributor:

Maybe "NNI could run thousands of parallel trials by depending on the capacity of ..."

Contributor (Author):

fixed, thanks.
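
For reference on Step 1 ("Define search space") quoted above: the search space is typically a JSON file in the format described in SearchSpaceSpec.md. The sketch below is illustrative; the parameter names and ranges are made up.

```json
{
    "learning_rate": {"_type": "loguniform", "_value": [0.0001, 0.1]},
    "batch_size": {"_type": "choice", "_value": [16, 32, 64, 128]},
    "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]}
}
```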


>Step 3: [Define Experiment](ExperimentConfig.md)
* *Experiment*: An experiment is one task of, for example, finding out the best hyperparameters of a model, finding out the best neural network architecture. It consists of trials and AutoML algorithms.
Contributor:

Why use italics instead of bold? I think bold is more conspicuous. (Just a suggestion.)
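
For reference on Step 3 ("Define Experiment") quoted above: the experiment is described in a YAML configuration file (see ExperimentConfig.md). A minimal sketch, assuming the configuration schema of this NNI release; all values are illustrative.

```yaml
authorName: default
experimentName: example_mnist
trialConcurrency: 1            # number of trials to run in parallel
maxTrialNum: 10                # stop after this many trials
trainingServicePlatform: local # or remote, pai, kubeflow, ...
searchSpacePath: search_space.json
useAnnotation: false
tuner:
  builtinTunerName: TPE        # one of NNI's built-in tuners
  classArgs:
    optimize_mode: maximize
trial:
  command: python3 mnist.py    # command that launches one trial
  codeDir: .
  gpuNum: 0
```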


NNI provides a set of examples in the package to get you familiar with the above process. In the following example [/examples/trials/mnist], we had already set up the configuration and updated the training codes for you. You can directly run the following command to start an experiment.
Basically, an experiment runs as follows: Tuner receives search space and generates configurations. These configurations will be submitted to training platforms, such as local machine, remote machines, or training clusters. Their performances are reported back to Tuner. Then, new configurations are generated and submitted.
Contributor:

Consider adding a training platform explanation to the key concepts instead of the “such as” example?

Member:

Quick question about "SearchSpace": do we want to use it as one term or as two words?

Agree with PurityFan on Training Platform's definition. Also, I was thinking Configuration also includes the YAML file's definition of the training platform.

Contributor (Author):

Thanks @PurityFan and @scarlett2018, will add training platform. Let's keep 'Search Space' for now, then discuss it when the whole doc is almost ready.
Here 'Configuration' does not mean the experiment's config file. It means 'the feasible region for tuning the model, for example, the value range of each hyperparameter'. @PurityFan, is this confusing?

Contributor:

I can understand what you mean. Need to distinguish between config.yml and configuration.
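
For reference, the launch command referred to in the quoted example above ("You can directly run the following command to start an experiment") is the `nnictl` create command. A sketch, with an illustrative config path:

```bash
# Start an experiment from its YAML configuration file (the path is illustrative).
nnictl create --config examples/trials/mnist/config.yml
```

Once the experiment is running, `nnictl` also exposes subcommands such as `nnictl stop` to end it.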

docs/Overview.md Outdated

>Step 1: [Define search space](SearchSpaceSpec.md)
* **Easy-to-use**: NNI can be easily installed through python pip. There are only several lines to added to your code in order to use NNI's power to tune your model. And you can use both a commandline tool and WebUI to view and control your experiments.
Contributor:

There are only several lines to be added to your code?

Member:

As the value proposition for NNI has extended to more than just model tuning (in the 1st section), you might also not want to only emphasize NNI's power with "tune your model".

In "both a commandline tool and WebUI", remove the "a". "view and control" -> "work with".

Contributor (Author):

@SparkSnail fixed, thanks.
@scarlett2018 you are right, how about directly removing 'to tune your model'?

@scarlett2018 (Member) left a comment:

approved with comments.


## Key Concepts
For each experiment, user only needs to define a search space and update a few lines of code, and then leverage NNI built-in Tuner/Assessor and training platforms to search the best hyper parameters and/or neural architecture. There are basically 3 steps:

Member:

hyperparameters

Contributor (Author):

fixed, thanks.


After user submits the experiment through a command line tool [nnictl](../tools/README.md), a demon process (NNI manager) take care of search process. NNI manager continuously get search settings that generated by tuning algorithms, then NNI manager asks the training service component to dispatch and run trial jobs in a targeted training environment (e.g. local machine, remote servers and cloud). The results of trials jobs such as model accurate will send back to tuning algorithms for generating more meaningful search settings. NNI manager stops the search process after it find the best models.
* *Configuration*: A configuration is an instance from the search space, that is, each hyperparameter has a specific value.
Contributor:

I think config.yml is a configuration?
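
To make the flow described in the quoted paragraph above easier to follow, here is an illustrative pseudocode sketch of the loop between the tuning algorithm and the training service. The class and method names are invented for explanation and are not NNI's real internal API.

```python
def run_experiment(search_space, tuner, training_service, max_trial_num):
    """Illustrative pseudocode only; these names are not NNI's internal API."""
    for trial_id in range(max_trial_num):
        # The tuning algorithm generates the next search setting (a configuration).
        config = tuner.generate_parameters(trial_id, search_space)
        # The training service dispatches a trial job to the target environment
        # (local machine, remote servers, or a cloud/cluster platform).
        job = training_service.submit_trial(trial_id, config)
        # The trial's result (e.g., model accuracy) is sent back to the tuner,
        # which uses it to generate more meaningful settings next time.
        result = job.wait_for_final_result()
        tuner.receive_trial_result(trial_id, config, result)
    # The NNI manager stops the search once the budget is exhausted or the
    # best model has been found.
```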

@QuanluZhang merged commit e7e87c5 into microsoft:dev-doc on Jan 21, 2019.
@QuanluZhang deleted the update-doc9 branch on February 20, 2020.