Fixed loading preprocessing params from pretrained weights #1473
Conversation
Looks great, I just left 2 non-blocking comments; you can always ignore them.
About the tests: is it possible to set pretrained_weights to a local path? I never tried and would need to dig into the code to check, but maybe it would work?
In that case we could just quickly train a model locally and use its checkpoint path.
@Louis-Dupont I've added a test that saves a temporary checkpoint with preprocessing params and then ensures that predict() works as expected.
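For reference, a minimal pytest-style sketch of such a test, assuming a SuperGradients-style checkpoint layout where the weights and preprocessing params are stored side by side. The key names ("net", "processing_params"), the get_processing_params accessor, and the yolox_n/coco choice are illustrative assumptions, not the project's confirmed API:

```python
import numpy as np
import torch

from super_gradients.training import models


def test_predict_with_preprocessing_params_from_local_checkpoint(tmp_path):
    # Start from a known pretrained model (model name is just an example).
    model = models.get("yolox_n", pretrained_weights="coco")

    # Save a temporary checkpoint that also carries preprocessing params.
    # The key names below and the accessor are assumptions about the
    # checkpoint layout, not the project's confirmed API.
    checkpoint = {
        "net": model.state_dict(),
        "processing_params": model.get_processing_params(),  # hypothetical accessor
    }
    checkpoint_path = tmp_path / "checkpoint.pth"
    torch.save(checkpoint, checkpoint_path)

    # The PR adds file:/// support, so the local checkpoint can be passed
    # through pretrained_weights directly.
    restored = models.get("yolox_n", pretrained_weights=f"file://{checkpoint_path}")

    # If the preprocessing params were restored, predict() should run on a
    # raw image without any manual preprocessing setup.
    dummy_image = np.zeros((480, 640, 3), dtype=np.uint8)
    restored.predict(dummy_image)
```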
LGTM
* Fixed loading preprocessing params from pretrained weights
* Added support for file:/// in pretrained weights
* Added test to ensure we load preprocessing params

(cherry picked from commit 96df027)
If the pretrained weights checkpoint contains preprocessing params, ensure they are properly loaded.
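As a hedged sketch of the usage this enables (the model name and paths are placeholders; the file:/// form comes from the change list above):

```python
from super_gradients.training import models

# pretrained_weights can now point at a local checkpoint via a file:/// URL.
# The checkpoint is assumed to contain preprocessing params next to the weights.
model = models.get("yolox_n", pretrained_weights="file:///path/to/checkpoint.pth")

# Because the preprocessing params are loaded along with the weights,
# predict() works on a raw image with no extra preprocessing setup.
model.predict("path/to/image.jpg").show()
```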