Add useful errors when model is not configured correctly #1199
Conversation
Hello @SkafteNicki! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2020-04-02 14:38:44 UTC
This pull request is now in conflict... :(
I like VERY MUCH this warning check grouping, just pls have a look at my formatting proposals
(I made the suggestion just on the first one but it applies to all :])
@neggert I guess you were already solving the DDP pickle issue, right?
@SkafteNicki may you pls resolve the conflict in the changelog?
@Borda I updated the code with your recommendations and have solved the conflict. Only the pickle issue seems to be blocking a merge.
this is great @SkafteNicki, thanks for your work on this! can you add tests to ensure that the exceptions are raised when the model is misconfigured?
@jeremyjordan I have added the test that you requested
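For readers following along, a rough sketch of what such a misconfiguration test could look like. This is not the PR's actual test: `BrokenModel` is a hypothetical class written here for illustration, and the import path of `MisconfigurationException` has moved between Lightning versions.

```python
# Hypothetical test sketch: fitting a model that forgets `training_step`
# should raise a misconfiguration error before training starts.
import pytest
import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl
from pytorch_lightning.utilities.exceptions import MisconfigurationException


class BrokenModel(pl.LightningModule):
    """A model that deliberately omits training_step."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def train_dataloader(self):
        return DataLoader(TensorDataset(torch.randn(8, 4), torch.randn(8, 1)))

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


def test_missing_training_step_raises():
    trainer = pl.Trainer(fast_dev_run=True)
    with pytest.raises(MisconfigurationException):
        trainer.fit(BrokenModel())
```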
This looks good :) - minor weirdness in the CHANGELOG (see comment)
@SkafteNicki @Borda @PyTorchLightning/core-contributors Wondering if the required methods should be made proper abstract methods in LightningModule so that IDEs will prompt to implement them?
some related work going on in #1279
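For illustration, a toy sketch of the abstract-method idea raised above. This is hypothetical and not how `LightningModule` is actually defined: declaring the required hooks abstract makes IDEs and type checkers prompt for implementations, but it also makes any subclass that omits them impossible to instantiate, which is stricter than the runtime checks this PR adds.

```python
# Hypothetical sketch only; LightningModule is not defined this way.
from abc import ABC, abstractmethod


class AbstractLightningModule(ABC):
    @abstractmethod
    def training_step(self, batch, batch_idx):
        """Required hook: IDEs flag subclasses that do not implement it."""

    @abstractmethod
    def configure_optimizers(self):
        """Required hook: instantiation fails with TypeError if it is missing."""
```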
This pull request is now in conflict... :(
This pull request is now in conflict... :(
Awesome!
@SkafteNicki nice job, could you pls resolve conflicts...
@Borda conflicts are solved now, some (unrelated) tests are still failing...
@SkafteNicki I have restarted the GitHub checks, but the GPU tests are failing because of …
I thought we were fixing it... @neggert?
This pull request is now in conflict... :(
Awesome, apparently we duplicated work with #1317
We need to still allow a LightningModule to function like a regular PyTorch module. So, instead of raising errors, we need to give warnings.
Want to merge my changes into this PR instead?
The main other thing in my PR is that configure_optimizers no longer returns a default Adam.
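A minimal sketch of the warn-versus-raise distinction being discussed here. The helper and its messages are illustrative assumptions, not the code that was merged.

```python
# Illustrative only: the same misconfiguration check can either hard-fail or
# merely warn, which keeps a LightningModule usable as a plain torch.nn.Module.
import warnings


def complain(message: str, strict: bool = False) -> None:
    if strict:
        raise RuntimeError(message)        # blocks training entirely
    warnings.warn(message, UserWarning)    # informs, but execution continues


# Example: warn instead of silently creating a default optimizer.
# complain("`configure_optimizers` is not overridden; no optimizer will be created.")
```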
could we merge this one and then Will's as a follow-up?
As the checks implemented in this PR are only called within … If you want to merge #1317 into this, fine by me. I will update …
This pull request is now in conflict... :(
@SkafteNicki awesome!
@williamFalcon @SkafteNicki there is still a bug and now it is in master
http://35.192.60.23/PyTorchLightning/pytorch-lightning/954/1/2
probably that code call …
Yeah, `self.__code__ = self.__call__.__code__`. Seems that the code object isn't pickleable. We need everything to be pickleable for DDP to work.
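A standalone demonstration of the problem (illustrative, outside the Lightning codebase): code objects cannot be pickled, so storing one as an attribute makes the whole object, and therefore the model that DDP ships to worker processes, unpicklable.

```python
# Standalone illustration: code objects are not picklable, so an attribute
# holding one (as `self.__code__ = self.__call__.__code__` does) makes the
# object fail to pickle, which DDP relies on to send the model to workers.
import pickle


class Holder:
    def __init__(self):
        # Grab the code object of a bound method, mirroring the problematic line.
        self.__code__ = self.__init__.__code__


try:
    pickle.dumps(Holder())
except TypeError as err:
    print(f"pickling failed: {err}")  # e.g. "cannot pickle 'code' object"
```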
…I#1199)
* add check_model_configuration method
* trying to fix errors
* trying to fix tests
* added test_epoch_end to lightning template
* fix tests
* fix new test after rebase
* fix spelling
* added more checks
* updated formating
* added tests
* fixed CHANGELOG
* Apply suggestions from code review
* move test to new module
* change check on configure_optimizers

Co-authored-by: Nicki Skafte <nugginea@gmail.com>
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
It appears that …
Before submitting
What does this PR do?
Fixes #1171 and #508
This PR adds checks that ensure the user's model is correctly configured before training. This includes checking that:
- `training_step` has been overridden
- `training_dataloader` has been overridden
- `configure_optimizers` has not been overridden: tell the user that the program is using the default optimizer (Adam with lr=0.0001)
- `validation_dataloader` is overridden but no `validation_step` is defined (and vice versa)
- `test_dataloader` is overridden but no `test_step` is defined (and vice versa)
- `validation_dataloader` and `validation_step` are overridden but no `validation_epoch_end`
- `test_dataloader` and `test_step` are overridden but no `test_epoch_end`
The most fundamental change is the requirement of `validation_step` and `test_step` when their respective dataloaders are defined. This will probably not be backward compatible with some users' code.
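To make the list above concrete, here is a rough, simplified sketch of what such pre-fit checks could look like. The hook names follow the description above, `is_overridden` is a hypothetical stand-in helper, and none of this is the merged implementation; the method name `check_model_configuration` comes from the commit history, but its signature and messages here are assumptions.

```python
# Illustrative sketch only; not the merged implementation.
import warnings


def is_overridden(model, method_name, base_class) -> bool:
    """True if `model`'s class replaces `method_name` inherited from `base_class`."""
    return getattr(type(model), method_name, None) is not getattr(base_class, method_name, None)


def check_model_configuration(model, base_class):
    # Training requires both a step and a dataloader.
    if not is_overridden(model, "training_step", base_class):
        raise RuntimeError("No `training_step` defined; the model cannot be trained.")
    if not is_overridden(model, "training_dataloader", base_class):
        raise RuntimeError("No `training_dataloader` defined; the model cannot be trained.")

    # A missing configure_optimizers only triggers a notice about the default optimizer.
    if not is_overridden(model, "configure_optimizers", base_class):
        warnings.warn("`configure_optimizers` not defined; using default Adam with lr=0.0001.")

    # Validation and test hooks must come in matching pairs.
    for loader, step in (("validation_dataloader", "validation_step"),
                         ("test_dataloader", "test_step")):
        if is_overridden(model, loader, base_class) and not is_overridden(model, step, base_class):
            raise RuntimeError(f"`{loader}` is defined but `{step}` is missing.")
        if is_overridden(model, step, base_class) and not is_overridden(model, loader, base_class):
            raise RuntimeError(f"`{step}` is defined but `{loader}` is missing.")
```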