self.log isn't logging anything when combining two pl.LightningModule into one main pl.LightningModule #10402
Labels: bug (Something isn't working), help wanted (Open to be worked on), logging (Related to the `LoggerConnector` and `log()`), priority: 1 (Medium priority task)
🐛 Bug
self.log does not log anything when a pl.LightningModule is composed of several sub-pl.LightningModules.
To Reproduce
Sometimes we need to develop submodules (one loss output per submodule) and combine them into a final ensemble module so that everything is trained end to end. Here's the pseudo code:
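(The original code snippet is not included in this extract; below is a minimal sketch reconstructing the described setup. Class and metric names such as ModelA, ModelB, Ensemble, train_loss_A and train_loss_B are assumptions based on the issue text, not the reporter's exact code.)

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl


class ModelA(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # logs fine when ModelA is trained on its own
        self.log("train_loss_A", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


class ModelB(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # logs fine when ModelB is trained on its own
        self.log("train_loss_B", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


class Ensemble(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.model_a = ModelA()
        self.model_b = ModelB()

    def training_step(self, batch, batch_idx):
        # delegate to the submodules; the self.log calls inside them
        # no longer reach TensorBoard once they are nested in the parent
        loss_a = self.model_a.training_step(batch, batch_idx)
        loss_b = self.model_b.training_step(batch, batch_idx)
        return loss_a + loss_b

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())
```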
When training only ModelA or ModelB, self.log logs the loss to TensorBoard. But after combining both of them into the final Ensemble model, nothing is logged to TensorBoard.
Expected behavior
I would expect that when training the Ensemble, I could see both train_loss_A and train_loss_B in TensorBoard.
Environment
pytorch-lightning==1.5.0
torch==1.10.0
Additional context
I think it is related to this pull request: #9733