Load back saved parameters with save_model to Booster object #2613
Closed in favor of being in #2302. We decided to keep all feature requests in one place. Welcome to contribute this feature! Please re-open this issue (or post a comment if you are not a topic starter) if you are actively working on implementing this feature.
Is there any update on this issue?
@zyxue, thanks for your interest in LightGBM! If you're interested in working on this feature and contributing, let us know and we'd be happy to answer any questions you have. Otherwise, you can subscribe to notifications on this issue for updates.
Hey @jameslamb, I'm interested in giving it a try. Do you have guidance on where to start?
Thanks @zyxue! I'd start by reading the issues @StrikerRUS mentioned at #2613 (comment), just to get a better understanding of this part of the code base. Next, I'd add a test to https://github.com/microsoft/LightGBM/blob/da98f24711a2faab17f94e5b2a636e6609c93fa6/tests/python_package_test/test_basic.py using the reproducible example provided by @everdark. That test should fail until your changes are made. Then, try to work through the changes on the C++ side based on @StrikerRUS's statement in #2613 (comment).
Here's the relevant Python code that's called: LightGBM/python-package/lightgbm/basic.py, lines 2635 to 2648 (at da98f24).
I believe you'll need to create a proposal for extracting the relevant section (line 459 at da98f24).
"Config" is the word we use in LightGBM's C++ code to refer to an object that holds all parameters (see, e.g., #4724 (review)). Here's the relevant code: LightGBM/src/boosting/gbdt_model_text.cpp, lines 571 to 596 (at d517ba1).
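As a rough illustration of what the loader side would need to do, the sketch below pulls the `[key: value]` lines out of the `parameters:` section that `save_model()` appends to the model text. This is an assumption about the file layout based on the reporter's observation that the parameters are written at the tail of the file; it is plain Python, not the proposed C++ implementation.

```python
def parse_model_file_params(model_str):
    """Collect [key: value] pairs found between the 'parameters:'
    and 'end of parameters' markers of a saved model string."""
    params = {}
    in_section = False
    for raw in model_str.splitlines():
        line = raw.strip()
        if line == "parameters:":
            in_section = True
        elif line == "end of parameters":
            in_section = False
        elif in_section and line.startswith("[") and line.endswith("]"):
            key, _, value = line[1:-1].partition(": ")
            params[key] = value
    return params

# Synthetic model tail for demonstration (not a real dump):
sample = """tree
...
parameters:
[boosting: gbdt]
[objective: binary]
end of parameters
"""
print(parse_model_file_params(sample))
# → {'boosting': 'gbdt', 'objective': 'binary'}
```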
I'll re-open this issue for now since you're planning to work on it. We have a policy in this repo of keeping feature request issues marked "closed" if no one is working on them, so if for any reason you decide not to work on this feature for now, please let me know so we can re-close it. And if you are interested in contributing but feel that this feature is not right for you, now that you know more about it, let me know what you're looking to work on and I'd be happy to suggest another one. Thanks again for your help!
Thank you @jameslamb for the informative guide! I'll try to get to it.
Hey @jameslamb, do you have any feedback on my PR above, please? I wonder if that's the right direction for loading back the saved params?
Thanks for starting on the work @zyxue! We will get to reviewing it as soon as possible. The other maintainers and I work on LightGBM in our spare time, so we can sometimes be slow to respond (especially for larger features like this one, which require more effort to review). Thanks for your patience.
This was locked accidentally. I just unlocked it. We'd still welcome contributions related to this feature!
Environment info
Operating System: Windows 10 (same result on both Windows and WSL)
CPU/GPU model: Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz
C++/Python/R version: Python 3.7
LightGBM version or commit hash: 2.3.1, installed by pip
Error message
Reproducible examples
The coding example above is borrowed directly from the official example advanced_example.py.
I've confirmed the parameters have been written to model file.
Here is the trailing of the file:
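(The original snippet of the file tail was not preserved here. Reconstructed sketch of what `save_model()` writes at the end of the file; the exact keys depend on the training parameters used:)

```text
parameters:
[boosting: gbdt]
[objective: binary]
...
end of parameters
```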
Is this behavior by design?
I found this because I'm using `shap` with a saved model, and it failed to compute SHAP values because `shap` needs to access `objective` in the params, which is gone if the Booster is a pre-trained and re-loaded one. As of now, my workaround is to also pass params to the Booster when loading:
However, I don't think this is good practice, since there is no way to make sure the passed params are consistent with the saved model.