drop unused variable in API #6308
Conversation
Codecov Report

```
@@           Coverage Diff           @@
##           master   #6308    +/-  ##
=======================================
- Coverage      93%     93%     -0%
=======================================
  Files         160     160
  Lines       11399   11398      -1
=======================================
- Hits        10640   10610     -30
- Misses        759     788     +29
```
```python
model.sequential_module = load_sequential_from_saved_layers(self.gpus_per_model)
save_model_fn(last_filepath, trainer)
model.sequential_module = current_layers
```
not sure how much this makes sense since the model is not used...
cc: @tchaton
Yes, it is used in save_model_fn, as the trainer will access lightning_module. So we re-create the full model only temporarily. But your changes should be fine.
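The swap-save-restore pattern discussed above can be sketched as follows. This is an illustrative stand-in, not the actual PyTorch Lightning implementation: the names load_sequential_from_saved_layers and save_model_fn mirror the diff, but their bodies here are placeholders.

```python
# Sketch: a sharded sequential module is temporarily replaced by the
# rebuilt full module so the checkpoint routine sees a complete model,
# then the shards are restored afterwards.

class Model:
    def __init__(self):
        # Placeholder for the per-GPU shard this process actually holds.
        self.sequential_module = ["shard-0"]

def load_sequential_from_saved_layers(gpus_per_model):
    # Stand-in: would normally rebuild the full nn.Sequential from
    # layers saved by each pipeline stage.
    return [f"layer-{i}" for i in range(gpus_per_model)]

def save_model_fn(filepath, model):
    # Stand-in for the trainer's checkpoint routine, which reads the
    # (temporarily restored) full module off the model.
    return (filepath, list(model.sequential_module))

model = Model()
current_layers = model.sequential_module                 # remember the shards
model.sequential_module = load_sequential_from_saved_layers(2)
checkpoint = save_model_fn("last.ckpt", model)           # sees the full model
model.sequential_module = current_layers                 # restore the shards
```

The key point is that the full model exists only inside this window; after saving, the process goes back to holding just its own shard.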
The changes here to …
tchaton left a comment:
LGTM!
akihironitta left a comment:
Could we remove pl_module from the following private methods, too?
https://github.com/PyTorchLightning/pytorch-lightning/blob/424b71e145125a0c68ae1c56b3bd4c20d71cee85/pytorch_lightning/callbacks/model_checkpoint.py#L541
https://github.com/PyTorchLightning/pytorch-lightning/blob/424b71e145125a0c68ae1c56b3bd4c20d71cee85/pytorch_lightning/callbacks/model_checkpoint.py#L520
Also, there are still outdated `# Todo: required argument pl_module is not used` comments in model_checkpoint.py.
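The kind of change being requested can be sketched like this. The class and method names are hypothetical (not the exact ones in model_checkpoint.py); the point is only that dropping a never-read parameter from a private method, and updating its internal call sites, leaves behavior unchanged.

```python
# Before: a private callback method takes pl_module but never reads it.
class CheckpointCallbackBefore:
    def _save_last(self, trainer, pl_module, filepath):
        # pl_module is unused; trainer carries everything needed.
        return f"saving {filepath} at step {trainer['step']}"

# After: the unused argument is removed from the signature.
class CheckpointCallbackAfter:
    def _save_last(self, trainer, filepath):
        return f"saving {filepath} at step {trainer['step']}"

trainer = {"step": 100}
before = CheckpointCallbackBefore()._save_last(trainer, None, "last.ckpt")
after = CheckpointCallbackAfter()._save_last(trainer, "last.ckpt")
```

Since these are private methods (leading underscore), the signature change does not touch the public API, which is why the PR description can claim it "does not affect users".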
What does this PR do?
Dropping some unused arguments from the API; this does not affect users.
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:
Did you have fun?
Make sure you had fun coding 🙃