Summary: we need to provide a tool that lets a user run multiple training workers with different hyperparameter values.
Subtasks:
Design a syntax for describing a hyperparameter variation strategy (the examples below are only concepts – you should design the syntax yourself and get it approved by the team). We suggest implementing a parent Legion API resource that wraps a regular ModelTrainingTemplate resource together with additional parameters for scalable runs.
I know of three strategies for hyperparameter search:
1.1. Manual – the user hardcodes fixed values for the hyperparameters, reviews the results, and modifies them for the next run (Legion already allows this).
Example:
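A minimal sketch of how the current manual flow could look as a resource; the kind and field names below are illustrative assumptions, not existing Legion syntax:

```yaml
# Concept sketch only: kind and field names are assumptions, not existing Legion syntax.
kind: ModelTraining
spec:
  hyperparameters:        # fixed values hardcoded by the user
    learning_rate: 0.01
    batch_size: 64
    num_epochs: 10
```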
1.2. Random search in a space. Imagine we have X hyperparameters; brute-forcing all combinations of N candidate values per parameter requires N^X runs. To reduce the cost of searching such a large space, values are usually sampled at random from ranges.
Example:
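A concept sketch of how the parent resource might express random search; HyperparameterSearch, maxRuns, range, and values are invented names for illustration and would need team approval:

```yaml
# Concept sketch only: all names here are assumptions pending syntax approval.
kind: HyperparameterSearch      # hypothetical parent resource
spec:
  strategy: random
  maxRuns: 20                   # sample 20 combinations instead of all N^X
  template:
    # regular ModelTrainingTemplate fields go here
    entrypoint: train.py
  hyperparameters:
    learning_rate:
      range: [0.0001, 0.1]      # sampled uniformly from this continuous range
    batch_size:
      values: [16, 32, 64, 128] # sampled from a discrete set
```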
1.3. Varying one hyperparameter over a range while the others are fixed. One hyperparameter is described with range syntax, the others are hardcoded. The system guarantees that every value of the varied hyperparameter will be run.
Example:
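A similar concept sketch for the exhaustive single-parameter sweep; strategy, range, and step are again assumed names:

```yaml
# Concept sketch only: names are assumptions pending syntax approval.
kind: HyperparameterSearch
spec:
  strategy: sweep               # every value of the varied parameter is run
  template:
    entrypoint: train.py
  hyperparameters:
    learning_rate:              # the single varied hyperparameter
      range: [0.001, 0.01]
      step: 0.001               # 0.001, 0.002, ..., 0.01 – all guaranteed to run
    batch_size: 64              # fixed
    num_epochs: 10              # fixed
```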
Theory: https://en.wikipedia.org/wiki/Hyperparameter_optimization