Hyperparameter search space for Catboost? #144
We have not tried those. Would you like to explore a different search space?
A different search space might yield better results for my current project, yes. I have noticed that the best loss for CatBoost is always worse than for XGBoost and LightGBM. I was wondering if there was a particular reason that CatBoost's search space is smaller, but it sounds like there is not. So I will experiment with a different/larger search space, and if I learn anything interesting, I will report back here.
If you are interested in checking CatBoost hyperparameters, here are most of them:

```python
params = {
    "iterations": 100,  # default 1000; if iterations is decreased, learning_rate
                        # should be increased (tune iterations high, learning_rate low)
}
```
But I would not recommend tuning all of them, because in my experience tuning all the parameters does not yield good results. Instead, I would recommend the official CatBoost tutorial, which gives more information about the feature-related hyperparameters that improve accuracy. This was my first contribution to the open source community; I hope you found it helpful.
The search space for CatBoost is rather limited; it only includes `early_stopping_rounds` and `learning_rate` (see `FLAML/flaml/model.py`, lines 620 to 633 at commit `072e9e4`).
Is there a reason why other hyperparameters are not searched? I was thinking it might be interesting to include:
- `l2_leaf_reg`
- `subsample`
- `bagging_temperature`
- `mvs_reg`
- `random_strength`
- `max_leaves`
- `fold_len_multiplier`
- `model_shrink_rate`
https://catboost.ai/docs/concepts/python-reference_parameters-list.html
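As a sketch of what an expanded CatBoost search space could look like, the snippet below lists a few of the proposed parameters with illustrative ranges and draws one random configuration from them. The `(low, high)` bounds and the `sample` helper are assumptions for demonstration only, not FLAML's defaults or API:

```python
import random

# Illustrative expanded search space for CatBoost.
# The (low, high) bounds below are assumptions for demonstration,
# not values taken from FLAML or CatBoost defaults.
expanded_space = {
    "learning_rate": (1e-3, 0.2),       # typically searched on a log scale
    "l2_leaf_reg": (1.0, 10.0),
    "subsample": (0.5, 1.0),
    "bagging_temperature": (0.0, 1.0),
    "random_strength": (0.0, 10.0),
    "max_leaves": (16, 64),             # integer-valued parameter
}

def sample(space, rng):
    """Draw one random configuration from the space (uniform for simplicity)."""
    config = {}
    for name, (low, high) in space.items():
        value = rng.uniform(low, high)
        # Round parameters whose bounds are both integers (e.g. max_leaves).
        if isinstance(low, int) and isinstance(high, int):
            value = int(round(value))
        config[name] = value
    return config

print(sample(expanded_space, random.Random(0)))
```

In practice one would plug ranges like these into the tuner's own search-space mechanism rather than sampling by hand; uniform sampling is used here only to keep the sketch self-contained.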