Bugs in Metrics #6731
I wanted to use the metrics from XGBoost for PyTorch, but for now I use only sklearn.metrics for all models!
I need to take a closer look.
The bug in gamma deviance is fixed. Better documentation for the other metrics will be a different topic. Thanks for raising the issue!
Just a quick note for everyone who has been following this thread. I believe these metrics and objectives are derived from the generalized linear model.
All metrics are calculated for each result separately. The loss is calculated for each individual prediction, but some metrics must be calculated over the entire result matrix. Therefore, the metrics in XGBoost are approximate.
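To illustrate the per-element aggregation point (my own sketch, not XGBoost code): a decomposable metric such as MAE gives the same answer whether you average per-element (or per-batch) values or compute it over the whole vector, while a metric such as R², which needs a global statistic, does not.

```python
# Sketch (mine, not XGBoost code): per-element aggregation is exact for
# decomposable metrics like MAE, but only approximate for metrics that
# need the whole result vector at once, like R^2.

def mae(y, p):
    # Mean absolute error: a plain average of per-element errors.
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def r2(y, p):
    # R^2 depends on the global mean of y, so it cannot be rebuilt
    # from independent per-element contributions.
    m = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, p))
    ss_tot = sum((a - m) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

y = [1.0, 2.0, 3.0, 4.0]
p = [1.5, 1.5, 3.5, 3.0]

# MAE decomposes: the mean of per-half MAEs equals the whole-vector MAE.
assert (mae(y[:2], p[:2]) + mae(y[2:], p[2:])) / 2 == mae(y, p)
# R^2 does not: averaging per-half values gives a different number.
assert (r2(y[:2], p[:2]) + r2(y[2:], p[2:])) / 2 != r2(y, p)
```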
See:
The weird logloss you see is just a way to work around numerical issues:

- gamma-nloglik: c == 0
- logloss: std::log(1.0f - eps) == std::log(1.0) == 0
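To make the workaround concrete, here is a sketch of an eps-clipped logloss (my own illustration, not XGBoost's actual implementation). The point quoted above is that for an eps this small, single-precision rounding turns `1.0f - eps` back into `1.0f`, so that log term is exactly 0:

```python
import math
import struct

def f32(x):
    # Round a Python float (double) to single precision, mimicking a C float.
    return struct.unpack('f', struct.pack('f', x))[0]

def logloss(y, p, eps=1e-16):
    # Clipping p into [eps, 1 - eps] avoids log(0) = -inf at the boundaries.
    p = min(max(p, eps), 1.0 - eps)
    return -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))

# The identity from the comment above: in single precision,
# 1.0f - eps rounds back to 1.0f, whose log is exactly 0.
print(math.log(f32(f32(1.0) - f32(1e-16))))  # 0.0
```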
gamma-deviance needs to be removed because the formula is not correct! (#6728)
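For reference, the textbook unit gamma deviance from GLM theory, which is also what sklearn's `mean_gamma_deviance` averages, looks like this (my own sketch, not a claim about what XGBoost currently computes):

```python
import math

def gamma_deviance(y, mu):
    """Unit gamma deviance d(y, mu) = 2 * (log(mu / y) + y / mu - 1).

    Standard GLM formula: zero when mu == y, positive otherwise.
    A per-element value like this would then be averaged over the data.
    """
    return 2.0 * (math.log(mu / y) + y / mu - 1.0)
```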
I don't understand what math formula was used in poisson-nloglik.
-log( Poisson_regression )
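For what it's worth, taking the negative log of the Poisson likelihood directly gives the following (my own sketch of the standard derivation, not a claim about XGBoost's exact code):

```python
import math

def poisson_nloglik(y, mu, eps=1e-16):
    """Negative log-likelihood of a Poisson with mean mu:

    P(y | mu) = mu**y * exp(-mu) / y!
    -log P(y | mu) = mu - y * log(mu) + log(y!)
    """
    mu = max(mu, eps)  # guard against log(0)
    return mu - y * math.log(mu) + math.lgamma(y + 1.0)
```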
With tweedie-nloglik it is also unclear, and the test is missing.
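For comparison, the Tweedie quasi-log-likelihood for variance power 1 < rho < 2 is commonly written as below, dropping terms that depend only on y. This is my reading of the standard formula, not a claim about what XGBoost implements; note it is minimised at mu == y, as a proper loss should be:

```python
def tweedie_nloglik(y, mu, rho=1.5):
    """Negative Tweedie quasi-log-likelihood for 1 < rho < 2,
    dropping terms that depend only on y:

    -[ y * mu**(1 - rho) / (1 - rho) - mu**(2 - rho) / (2 - rho) ]

    Its derivative in mu is mu**(-rho) * (mu - y), so it is
    minimised exactly at mu == y.
    """
    return -(y * mu ** (1.0 - rho) / (1.0 - rho)
             - mu ** (2.0 - rho) / (2.0 - rho))
```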
Tests for regression metrics with weights (#6729).
If metrics are used in forest creation...