Avoid patching DataHooks
#10498
Comments
Sounds good to me.
If both the DataModule and the LightningModule provide implementations for the same hook, should this be an error when both are overridden and used together? What if the DataModule has separate logic for …?
A question about context: do we know why these hooks were added to both the DataModule and the LightningModule? And if there is a reason, why did we not do this as:
…which would avoid the question of "what should happen when both are implemented".
@carmocca only one of the datamodule or the LightningModule can actually move the batch to the device, so … Other than that, I share your questions about what the context for these hooks and this mixin sharing was.
+1 for your question. @carmocca
Situation 3 may not be reasonably addressed; I'm open to better ideas. However, in summary, none of the three situations needs hook patching.
Yes, if dataloaders are present in the datamodule, then we rely on the implementation provided by the datamodule; otherwise, the LightningModule.
@carmocca @ananthsub I'd say a warning would be good enough in such a case, and we can prioritize the datamodule. The reason being, a user can import a …
@ninginthecloud if we go with this, then in the second case it would be hard to know whether to use the datamodule or the LightningModule for these hooks.
Hi @rohitgr7, for the second situation, since the temporary datamodule does not override hooks like …, my initial idea is based on what @carmocca suggested about executing hooks from the datamodule and the LightningModule in order. How about the following:
Here, the datamodule could come from user input. This way, there are several pros: 1) it avoids patching and de-patching hooks onto the LightningModule; 2) the way we process the datamodule and the dataloaders stays consistent; 3) for every hook, we know which implementation gets executed.
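A minimal sketch of that resolution order, assuming datamodule-first priority plus a warning when both sides override the same hook (the helper `_call_data_hook` and its use of `is_overridden` are illustrative, not the snippet from the original comment):

```python
import warnings

from pytorch_lightning.utilities.model_helpers import is_overridden


def _call_data_hook(trainer, hook_name, *args, **kwargs):
    """Illustrative only: run a data hook from the datamodule if the user
    overrode it there, otherwise fall back to the LightningModule."""
    datamodule = trainer.datamodule  # user datamodule, or a temporary one wrapping raw dataloaders
    lightning_module = trainer.lightning_module

    dm_overridden = datamodule is not None and is_overridden(hook_name, datamodule)
    lm_overridden = is_overridden(hook_name, lightning_module)

    if dm_overridden and lm_overridden:
        warnings.warn(
            f"`{hook_name}` is overridden in both the datamodule and the LightningModule; "
            "using the datamodule implementation."
        )
    if dm_overridden:
        return getattr(datamodule, hook_name)(*args, **kwargs)
    return getattr(lightning_module, hook_name)(*args, **kwargs)
```

Calling e.g. `_call_data_hook(trainer, "on_before_batch_transfer", batch, dataloader_idx)` then always resolves to exactly one implementation, so nothing ever needs to be patched onto the LightningModule.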
Sounds good to me 😃. I'll add the logic now in the attached PR; we can integrate a temp datamodule in another one, maybe.
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, PyTorch Lightning Team!
Hi @rohitgr7, I'd like to follow up on this issue. Are there any updates? I'm wondering if there's anything I can do here.
Hey @ninginthecloud! Regarding your proposal of creating a temp datamodule if dataloaders are passed explicitly, it can be configured easily with some additional changes once this issue gets fixed.
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, PyTorch Lightning Team!
Proposed refactoring or deprecation
We need to avoid patching some of these datahooks:
https://github.com/PyTorchLightning/pytorch-lightning/blob/09cf167237e867f1ec67a5db87e5a02c2cea4b69/pytorch_lightning/trainer/connectors/data_connector.py#L238-L242
We have already removed the patching for dataloader-related hooks here: #9764
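For context, the patching in question copies the datamodule's hook implementations onto the LightningModule instance at runtime. A paraphrased sketch of the pattern (the hook names are the real batch-transfer hooks; the function name `patch_data_hooks` and its body are illustrative, not the exact code behind the permalink):

```python
from pytorch_lightning import LightningDataModule, LightningModule
from pytorch_lightning.utilities.model_helpers import is_overridden

# Hooks involved in moving a batch to the device.
BATCH_TRANSFER_HOOKS = ("on_before_batch_transfer", "transfer_batch_to_device", "on_after_batch_transfer")


def patch_data_hooks(model: LightningModule, datamodule: LightningDataModule) -> None:
    """Illustrative only: if the datamodule overrides a batch-transfer hook, copy its
    bound method onto the model instance, shadowing the model's own definition."""
    for hook_name in BATCH_TRANSFER_HOOKS:
        if is_overridden(hook_name, datamodule):
            setattr(model, hook_name, getattr(datamodule, hook_name))
```

Because the patched attribute is a method bound to the datamodule and lives on the model instance, it shadows anything the user defined on the LightningModule and has to be de-patched again afterwards.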
Motivation
Patching can fail in some cases. One example:
(code example with its printed vs. expected output not preserved)
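As a rough reconstruction of the kind of failure this refers to (the class names and prints are illustrative, not the issue's actual example), suppose both the LightningModule and the DataModule override the same batch-transfer hook:

```python
from pytorch_lightning import LightningDataModule, LightningModule


class MyDataModule(LightningDataModule):
    def on_before_batch_transfer(self, batch, dataloader_idx):
        print("datamodule: on_before_batch_transfer")
        return batch


class MyModel(LightningModule):
    def on_before_batch_transfer(self, batch, dataloader_idx):
        # The user expects this to run, but once the datamodule's hook is
        # patched onto the model instance, this method is silently skipped.
        print("model: on_before_batch_transfer")
        return batch
```

With patching, attaching `MyDataModule` replaces the model's hook in place, so only the datamodule's print appears and nothing warns the user that their LightningModule implementation was ignored.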
Pitch
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @rohitgr7 @ninginthecloud