Add test for xgboost model builder #1359
Conversation
/intelci: run
@razdoburdin, please review this PR
@@ -0,0 +1,119 @@
#===============================================================================
# Copyright 2014 Intel Corporation
Suggested change:
- # Copyright 2014 Intel Corporation
+ # Copyright 2023 Intel Corporation
@avolkov-intel - let's not use examples as a means of testing - this for sure doesn't warrant creating a separate example
@avolkov-intel please add a unit test for that feature. Let's do it in a separate PR
@avolkov-intel please also update doc/daal4py/model-builders.rst for this example.
Please address the CI failures
# Training
xgb_clf = xgb.XGBClassifier(**params)
xgb_clf.fit(X_train, y_train, eval_set=[(X_test, y_test)])
Do we really want the validation metrics printed to stdout at each boosting stage?
You should provide eval_set to fit xgboost with early_stopping_rounds (this feature is used to avoid overfitting, so a dataset different from the training set should be provided)
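To make the point above concrete, here is a minimal pure-Python sketch of the early-stopping idea itself (not XGBoost's implementation; the function name and loss values are illustrative): training stops once the metric on the held-out eval set has not improved for `early_stopping_rounds` consecutive rounds, which is why the eval set must differ from the training set.

```python
def early_stop_round(eval_losses, early_stopping_rounds):
    """Return the boosting round at which training would stop, or None.

    eval_losses: per-round loss on the held-out evaluation set.
    Stops when the best loss has not improved for
    `early_stopping_rounds` consecutive rounds.
    """
    best_loss = float('inf')
    best_round = 0
    for rnd, loss in enumerate(eval_losses):
        if loss < best_loss:
            best_loss, best_round = loss, rnd
        elif rnd - best_round >= early_stopping_rounds:
            return rnd  # no improvement for too long: stop here
    return None  # early stopping never triggered

# Held-out losses that improve, then plateau: stops 3 rounds past the best.
print(early_stop_round([0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.64],
                       early_stopping_rounds=3))
```

If the training set itself were used as the eval set, the loss would keep decreasing and this criterion would rarely fire, defeating its purpose.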
I suppose I will just remove the example and add a test in this PR, since we do not need to add a new example, as Nikolay said
/intelci: run
/intelci: run
@razdoburdin, please review the PR
The test itself is fine. The problem is the failure of the private CI. XGBoost is not installed in the private-ci environment, so we need to skip the test in that case somehow.
@razdoburdin @homksei - might it be worth just adding the xgboost package internally?
it would be the best option for us
/intelci: run
tests/test_xgboost_mb.py
Outdated
hasattr(d4p, 'gbt_classification_prediction'),
daal_check_version(((2021, 'P', 1)))]), reason)
@unittest.skipUnless(importlib.util.find_spec('xgboost')
                     is not None, 'xgoost library is not installed')
Suggested change:
- is not None, 'xgoost library is not installed')
+ is not None, 'xgboost library is not installed')
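For readers following the review, the skip mechanism under discussion can be illustrated in a self-contained form (stdlib only; the test-case and method names below are placeholders, not the PR's actual test): `importlib.util.find_spec` probes for a package without importing it, and `unittest.skipUnless` turns an absent dependency into a skip rather than a failure.

```python
import importlib.util
import unittest

# Probe for the optional dependency without importing it; find_spec
# returns None when the package cannot be found.
HAS_XGBOOST = importlib.util.find_spec('xgboost') is not None

class DependencyGatedTest(unittest.TestCase):
    # Skipped (not failed) in environments without xgboost, which is
    # exactly what the private CI needs.
    @unittest.skipUnless(HAS_XGBOOST, 'xgboost library is not installed')
    def test_needs_xgboost(self):
        import xgboost  # only executed when the spec was found
        self.assertTrue(hasattr(xgboost, 'XGBClassifier'))

if __name__ == '__main__':
    unittest.main()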
/intelci: run
I don't see that these tests passed:
test_earlystop (test_xgboost_mb.XgboostModelBuilder) ... skipped 'xgboost library is not installed'
Please address this issue first
Sorry, I see that this test passed on public CI.
Looks good to me.
Description
In PR #1350, a fix for the xgboost model builder was presented. The issue concerned inference with a model fitted using the early_stopping_rounds parameter. In this PR I present a test showing that there is no discrepancy between xgboost and the xgboost model builder.
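The shape of such a parity test can be sketched as follows. This is a hedged illustration, not the PR's actual test: the daal4py calls (`get_gbt_model_from_xgboost`, `gbt_classification_prediction`) follow the model-builders documentation but are assumptions here, and the synthetic data and hyperparameters are invented. The check is guarded so it is a no-op where the dependencies are missing.

```python
import importlib.util

def xgboost_vs_daal4py_parity():
    """Train xgboost with early stopping, convert via daal4py model
    builder, and return the max difference between predicted labels
    (or None when the dependencies are unavailable)."""
    if not (importlib.util.find_spec('xgboost')
            and importlib.util.find_spec('daal4py')):
        return None  # dependencies missing: nothing to compare
    import numpy as np
    import xgboost as xgb
    import daal4py as d4p

    # Synthetic binary-classification data (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 5))
    y = (X[:, 0] > 0).astype(int)
    dtrain = xgb.DMatrix(X[:150], label=y[:150])
    dtest = xgb.DMatrix(X[150:], label=y[150:])

    # Fit with early_stopping_rounds against a held-out eval set,
    # the scenario the PR #1350 fix addressed.
    booster = xgb.train({'objective': 'binary:logistic', 'max_depth': 3},
                        dtrain, num_boost_round=50,
                        evals=[(dtest, 'test')], early_stopping_rounds=5,
                        verbose_eval=False)

    # Convert to a daal4py model and predict class labels.
    daal_model = d4p.get_gbt_model_from_xgboost(booster)
    d4p_labels = d4p.gbt_classification_prediction(nClasses=2) \
        .compute(X[150:], daal_model).prediction.ravel()
    xgb_labels = (booster.predict(dtest) > 0.5).astype(float)
    return np.max(np.abs(d4p_labels - xgb_labels))
```

With the #1350 fix in place, the returned difference is expected to be zero, i.e. the converted model honors the early-stopped tree count.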