I need this code change as well! I'm doing transfer learning, and I want to support both loading the original model with its original weights and modifying it for a new task. `on_load_checkpoint` would let me redo the modifications I made for transfer learning, so that the `state_dict` of the transferred model can be restored properly.
At present I have to add non-network code to the model to handle this logic, which is ugly and prone to bugs.
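To make the use case concrete, here is a minimal sketch (the module name, layer sizes, and the way `num_classes` is recovered are hypothetical, not from this issue) of how `on_load_checkpoint` could redo such a modification, assuming the hook ran before `load_state_dict`:

```python
import torch.nn as nn
import pytorch_lightning as pl


class TransferModel(pl.LightningModule):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Linear(32, 64)
        # Head replaced when adapting the model to a new task.
        self.head = nn.Linear(64, num_classes)

    def on_load_checkpoint(self, checkpoint):
        # Redo the transfer-learning modification so parameter shapes
        # match the checkpoint before its state_dict is restored.
        # This only helps if the hook fires before load_state_dict.
        num_classes = checkpoint["state_dict"]["head.weight"].shape[0]
        self.head = nn.Linear(64, num_classes)
```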
🐛 Bug
The docstring of the `on_load_checkpoint` hook says that it is called before trying to `load_state_dict`:
https://github.com/PyTorchLightning/pytorch-lightning/blob/cea5f1f53876399dfaa0d37accdc527af7ca39af/pytorch_lightning/core/saving.py#L203-L206
However, in `LightningModule.load_from_checkpoint`, it is called after `load_state_dict`:
https://github.com/PyTorchLightning/pytorch-lightning/blob/cea5f1f53876399dfaa0d37accdc527af7ca39af/pytorch_lightning/core/saving.py#L195-L199
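Paraphrased, the current loading path does roughly the following (a simplified sketch, not the exact source):

```python
# current order inside load_from_checkpoint (simplified)
model = cls(*cls_args, **cls_kwargs)
model.load_state_dict(checkpoint["state_dict"])  # weights restored first
model.on_load_checkpoint(checkpoint)             # hook fires only afterwards
```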
Additional context
Related discussion on Slack: https://pytorch-lightning.slack.com/archives/CQXV8BRH9/p1602168345184000
I think the docstring is correct, and the call to `on_load_checkpoint` should be moved right before `load_state_dict` to give the model a chance to call `setup`.
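Concretely, the proposal amounts to swapping the two calls (again a simplified sketch of the loading path):

```python
# proposed order inside load_from_checkpoint (simplified)
model = cls(*cls_args, **cls_kwargs)
model.on_load_checkpoint(checkpoint)  # model can rebuild layers / call setup here
model.load_state_dict(checkpoint["state_dict"])
```

With this order, a model like the `TransferModel` sketch above can recreate its task-specific layers before their weights are loaded.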