Number of estimated parameters in a tree? #5408

For comparison purposes, notably when using AIC/BIC information criteria, I was wondering if there is a natural equivalent of the number of trainable parameters for trees. Intuitively, I would count one parameter for each split (the split threshold) plus one parameter for each final leaf (the value to predict), summed over all trees.

Is my calculation correct? Is this information directly accessible in the library?
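As a quick sanity check of this count (assuming binary splits): a single tree with L leaves has exactly L − 1 internal splits, so under this count it would contribute (L − 1) + L = 2L − 1 parameters; for example, 100 trees with 31 leaves each would give 100 × (30 + 31) = 6,100.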
Comments
I'm not qualified to comment on how you might calculate measures like AIC, intended for parametric models, on non-parametric models like the tree-based ones produced by LightGBM. Maybe others will have a comment on that. But I can directly answer the question "is [information about all the splits for all trees] available in LightGBM?". Yes, it is.
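As a minimal sketch of that (an illustration, not code from this thread): the Python package's `Booster.trees_to_dataframe()` method returns a pandas DataFrame with one row per node across all trees, and rows with a null `split_feature` are leaves.

```python
import lightgbm as lgb
import numpy as np

# Toy regression data, only so there is a fitted model to inspect.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = X[:, 0] + rng.normal(scale=0.1, size=500)

booster = lgb.train(
    {"objective": "regression", "verbosity": -1},
    lgb.Dataset(X, label=y),
    num_boost_round=10,
)

# One row per node (splits and leaves) across all trees.
nodes = booster.trees_to_dataframe()

n_splits = nodes["split_feature"].notna().sum()  # internal (split) nodes
n_leaves = nodes["split_feature"].isna().sum()   # terminal (leaf) nodes
print(f"splits: {n_splits}, leaves: {n_leaves}")
```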
There are also other options to examine the structure of a fitted LightGBM model, like `Booster.dump_model()`, which returns the full tree structure as a Python dictionary, or `lightgbm.plot_tree()` for drawing an individual tree.
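A sketch of the `dump_model()` route, reusing the `booster` fitted above and relying on the fact that a binary tree with L leaves has exactly L − 1 internal splits:

```python
# Reuses `booster` from the previous snippet.
model_dict = booster.dump_model()

# "tree_info" holds one entry per tree, each with its leaf count.
n_leaves = sum(tree["num_leaves"] for tree in model_dict["tree_info"])
# A binary tree with L leaves always has exactly L - 1 internal splits.
n_splits = sum(tree["num_leaves"] - 1 for tree in model_dict["tree_info"])

# The parameter count proposed in the question: one threshold per split
# plus one output value per leaf, summed over all trees.
print(f"splits: {n_splits}, leaves: {n_leaves}, proposed count: {n_splits + n_leaves}")
```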
I've found an authoritative answer to the technical question on Stats SE that extends to how such a model should be evaluated. The issue seems to be closable.
Thanks for coming back to share this and close this issue!
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.