[docs] Add FLAML for efficient hyperparameter optimization (#4013)
* add FLAML for HPO in DOC

* add FLAML for HPO

* revise FLAML phrasing

* Update docs/Parameters-Tuning.rst

Co-authored-by: Nikita Titov <nekit94-08@mail.ru>

* Update README.md

Co-authored-by: Nikita Titov <nekit94-08@mail.ru>

Co-authored-by: Nikita Titov <nekit94-08@mail.ru>
qingyun-wu and StrikerRUS committed Feb 24, 2021
1 parent 3ab6bbf commit 6356e65
Showing 2 changed files with 7 additions and 1 deletion.
5 changes: 4 additions & 1 deletion README.md
@@ -42,7 +42,8 @@ Next you may want to read:
- [**Parameters**](https://github.com/microsoft/LightGBM/blob/master/docs/Parameters.rst) is an exhaustive list of customization you can make.
- [**Distributed Learning**](https://github.com/microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst) and [**GPU Learning**](https://github.com/microsoft/LightGBM/blob/master/docs/GPU-Tutorial.rst) can speed up computation.
- [**Laurae++ interactive documentation**](https://sites.google.com/view/lauraepp/parameters) is a detailed guide for hyperparameters.
-- [**Optuna Hyperparameter Tuner**](https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258) provides automated tuning for LightGBM hyperparameters ([code examples](https://github.com/optuna/optuna/blob/master/examples/)).
+- [**FLAML**](https://www.microsoft.com/en-us/research/project/fast-and-lightweight-automl-for-large-scale-data/articles/flaml-a-fast-and-lightweight-automl-library/) provides automated tuning for LightGBM ([code examples](https://github.com/microsoft/FLAML/blob/main/notebook/flaml_lightgbm.ipynb)).
+- [**Optuna Hyperparameter Tuner**](https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258) provides automated tuning for LightGBM hyperparameters ([code examples](https://github.com/optuna/optuna/tree/master/examples/lightgbm)).

Documentation for contributors:

@@ -59,6 +60,8 @@ Some old update logs are available at [Key Events](https://github.com/microsoft/
External (Unofficial) Repositories
----------------------------------

+FLAML (AutoML library for hyperparameter optimization): https://github.com/microsoft/FLAML

Optuna (hyperparameter optimization framework): https://github.com/optuna/optuna

Julia-package: https://github.com/IQVIA-ML/LightGBM.jl
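
To make the FLAML entry added above concrete, here is a minimal sketch (not part of this commit) of tuning LightGBM through FLAML's `AutoML` interface. The dataset, train/test split, and 60-second budget are illustrative placeholders, and keyword arguments may vary across FLAML versions:

```python
# Minimal FLAML sketch: tune LightGBM on a toy dataset (assumes `pip install flaml`).
from flaml import AutoML
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

automl = AutoML()
automl.fit(
    X_train,
    y_train,
    task="classification",    # or "regression"
    estimator_list=["lgbm"],  # restrict the search to LightGBM
    time_budget=60,           # wall-clock seconds for the search
)
print(automl.best_config)       # the tuned LightGBM hyperparameters
print(automl.predict(X_test)[:5])
```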
3 changes: 3 additions & 0 deletions docs/Parameters-Tuning.rst
@@ -7,6 +7,7 @@ This page contains parameters tuning guides for different scenarios.

- `Parameters <./Parameters.rst>`__
- `Python API <./Python-API.rst>`__
+- `FLAML`_ for automated hyperparameter tuning
- `Optuna`_ for automated hyperparameter tuning

Tune Parameters for the Leaf-wise (Best-first) Tree
@@ -214,3 +215,5 @@ Deal with Over-fitting
- Try increasing ``path_smooth``

.. _Optuna: https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258

+.. _FLAML: https://github.com/microsoft/FLAML
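
For comparison with the FLAML sketch above, the Optuna post linked by the `.. _Optuna:` target describes a step-wise LightGBM Tuner that acts as a drop-in replacement for `lightgbm.train`. A minimal sketch (not part of this commit; the dataset and round count are placeholders, and the early-stopping API differs across Optuna/LightGBM versions):

```python
# Minimal sketch of Optuna's LightGBM Tuner (assumes `pip install optuna lightgbm`).
# Importing the integration module makes `lgb.train` tune key parameters step-wise.
import optuna.integration.lightgbm as lgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=42)

dtrain = lgb.Dataset(X_train, label=y_train)
dval = lgb.Dataset(X_val, label=y_val, reference=dtrain)

params = {"objective": "binary", "metric": "binary_logloss", "verbosity": -1}
model = lgb.train(params, dtrain, valid_sets=[dval], num_boost_round=100)
print(model.params)  # includes the values the tuner settled on
```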
