AttributeError with LightningModule forward without Trainer #9716
Comments
self.log_dict({"key": 0})
My use case is to re-use the same LightningModule without a Trainer attached, e.g. for standalone inference.
Previous answer on this:
Seems reasonable. Just curious: if you are logging inside the forward function, how are you differentiating the log keys, since I assume forward will be used for training/validating/testing?
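For illustration only, one way to keep the keys distinct is to pass the current stage into the shared forward and prefix the keys with it; the class and key names below are hypothetical and not taken from this thread:

```python
import torch
from pytorch_lightning import LightningModule


class SharedForwardModel(LightningModule):
    """Hypothetical sketch (not from this thread): prefix log keys with the
    stage so a forward shared across training/validation/testing still
    produces distinct keys."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 2)

    def forward(self, x, stage: str = "predict"):
        self.log_dict({f"{stage}/key": 0})
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        out = self(batch, stage="train")
        return out.sum()

    def validation_step(self, batch, batch_idx):
        self(batch, stage="val")
```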
Actually, calling …
Closing this as using …
It will use the …
🐛 Bug
LightningModule with Lightning 1.4 assumes that `self.trainer` is always not `None`. There were no issues with 1.3.

To Reproduce
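The original reproduction snippet is not preserved above; a minimal sketch of the reported scenario, with illustrative module and key names, might look like this:

```python
import torch
from pytorch_lightning import LightningModule


class BoringModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 2)

    def forward(self, x):
        # Logging inside forward so the same code path can also be used
        # for standalone inference without a Trainer.
        self.log_dict({"key": 0})
        return self.layer(x)


model = BoringModel()
# No Trainer is attached, so model.trainer is None.
# On 1.4 the log_dict call inside forward raises an AttributeError; on 1.3 it did not.
model(torch.rand(1, 4))
```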
Outputs:
Expected behavior
If the Trainer and/or logger are not defined, the `log` and `log_dict` calls should simply be ignored.
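For illustration, a user-side workaround sketch that gives this behavior, assuming `self.trainer` is simply `None` when no Trainer is attached (as in 1.4); this is not the actual Lightning implementation:

```python
from pytorch_lightning import LightningModule


class SafeLoggingModule(LightningModule):
    """Sketch of the expected behavior: silently skip logging when no
    Trainer (and hence no logger) is attached."""

    def log(self, *args, **kwargs):
        if self.trainer is None:
            return  # nothing to log to; ignore instead of raising
        super().log(*args, **kwargs)

    def log_dict(self, *args, **kwargs):
        if self.trainer is None:
            return
        super().log_dict(*args, **kwargs)
```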
Environment
Additional context
The problem is caused by this line:
https://github.com/PyTorchLightning/pytorch-lightning/blob/ab069876cb19bb9de0179f74c6f83764876a73ff/pytorch_lightning/core/lightning.py#L398
Related PR: #7891
@carmocca