- HPT-tuned XGBoost always achieves the best performance. However, on the small dataset (i.e., camel 1.2), it takes longer to reach 99% of the highest performance.
- AML and HPT can both achieve 95% of the highest AUC_weighted within 13 iterations, so we do not need to run hyperparameter tuning or AML for many iterations.
- Possible advice for practitioners: run AML for 20 iterations, then hyperparameter-tune the suggested model for the best performance.
- Model interpretation varies between models tuned by AML and HPT. We therefore recommend using the interpretation of the model with the best fit.
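The recommended workflow above (a short search budget, then keeping the best model found) can be sketched as follows. This is a minimal illustration, not the repo's actual code: the specific AML tool and the XGBoost setup used in the study are not shown here, so scikit-learn's `RandomizedSearchCV` with a `GradientBoostingClassifier` stands in, with `n_iter=20` mirroring the 20-iteration advice and AUC as the selection metric.

```python
# Hedged sketch of "search for 20 iterations, then use the best model".
# Assumptions: GradientBoostingClassifier stands in for XGBoost, and
# RandomizedSearchCV stands in for the AML/HPT tooling used in the repo.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Synthetic binary-classification data in place of the defect datasets.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Illustrative hyperparameter ranges, not the ones from the study.
param_dist = {
    "n_estimators": [50, 100, 200],
    "max_depth": [2, 3, 4],
    "learning_rate": [0.05, 0.1, 0.2],
}

# 20 search iterations, per the advice above; AUC as the metric.
search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_dist,
    n_iter=20,
    scoring="roc_auc",
    cv=3,
    random_state=0,
)
search.fit(X_tr, y_tr)
best_model = search.best_estimator_  # deploy / interpret this model
print(round(search.best_score_, 3))
```

In practice the same pattern applies with an AutoML library in place of `RandomizedSearchCV`: cap the budget at roughly 20 iterations, then refine the suggested model with HPT.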
yiikou/AML_vs_HPT
About: AutoML vs. hyperparameter tuning