Support Lightning Logging without Trainer #8509

@tchaton

Description

🚀 Feature

Motivation

To ease the conversion from pure PyTorch to Lightning, users might start by creating only their LightningModule.

However, their code would break as soon as they call self.log, since no Trainer is attached.

Currently, we have 2 options:

  • make self.log a no-op in the absence of a Trainer (see the sketch below)
  • add support for logging without a Trainer.
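
For the first option, a minimal sketch of the guard inside self.log (purely illustrative behaviour, not existing Lightning API):

import warnings

class LightningModule:
    def log(self, name, value, **kwargs):
        # hypothetical no-op path: warn and skip when no Trainer is attached
        if getattr(self, "trainer", None) is None:
            warnings.warn(f"self.log({name!r}) was called without a Trainer; the value will be ignored.")
            return
        ...  # normal logging path through the Trainer's ResultCollection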

Here is pseudo-code to explain how we could support it.

The ResultCollection object is pretty self-contained and is used to store logged values.

class LightningModule:
    def __init__(self):
        self._lightning_results = ResultCollection()
        self._current_fx = None
        self.training_step = self._training_step_wrapper(self.training_step)

    @property
    def _results(self):
        # use the Trainer's results when a Trainer is attached, otherwise the local one
        if getattr(self, "trainer", None) is not None:
            return self.trainer._results
        return self._lightning_results

    def _training_step_wrapper(self, training_step_fn):
        def wrapper(*args, **kwargs):
            # record which hook is running so self.log can attribute values to it
            self._current_fx = "training_step"
            output = training_step_fn(*args, **kwargs)
            self._current_fx = None
            return output
        return wrapper

    def training_step(self, batch, batch_idx):
        self.log(...)

class Model(LightningModule):
    ...

model = Model()

for _ in range(epochs):
    for batch_idx, batch in enumerate(dataloader):
        loss = model.training_step(batch, batch_idx)
        ...
        logged_metrics = model.get_logged_metrics()

reduced_metrics = model.get_callback_metrics(epoch=True)
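
For illustration, the values collected this way could then be forwarded to any plain PyTorch logging tool. Here, get_logged_metrics is the hypothetical accessor from the loop above, and SummaryWriter is the standard torch.utils.tensorboard writer:

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("runs/manual_loop")
global_step = 0

for _ in range(epochs):
    for batch_idx, batch in enumerate(dataloader):
        loss = model.training_step(batch, batch_idx)
        # hypothetical accessor returning {name: value} for the last hook call
        for name, value in model.get_logged_metrics().items():
            writer.add_scalar(name, value, global_step)
        global_step += 1

writer.close()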

Drawback: every LightningModule hook used for logging would need to be wrapped to set the _current_fx attribute (a sketch of how this could be generalized follows below).
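
To reduce that boilerplate, the wrapping could be generated generically. A minimal sketch, assuming a hypothetical _LOGGING_HOOKS list and _wrap_hook helper (neither exists in Lightning today); ResultCollection is the same object as in the pseudo-code above:

import functools

_LOGGING_HOOKS = ("training_step", "validation_step", "test_step", "training_epoch_end")

class LightningModule:
    def __init__(self):
        self._lightning_results = ResultCollection()  # as in the pseudo-code above
        self._current_fx = None
        # wrap every hook that may call self.log so _current_fx is always set
        for hook_name in _LOGGING_HOOKS:
            hook_fn = getattr(self, hook_name, None)
            if callable(hook_fn):
                setattr(self, hook_name, self._wrap_hook(hook_name, hook_fn))

    def _wrap_hook(self, hook_name, hook_fn):
        @functools.wraps(hook_fn)
        def wrapper(*args, **kwargs):
            self._current_fx = hook_name
            try:
                return hook_fn(*args, **kwargs)
            finally:
                self._current_fx = None
        return wrapper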

Pitch

Alternatives

Additional context

If you enjoy PL, check out our other projects:

  • Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
  • Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, finetuning and solving problems with deep learning
  • Bolts: Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch
  • Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.

cc @Borda @tchaton @justusschock @awaelchli @rohitgr7 @akihironitta @carmocca @edward-io @ananthsub @kamil-kaczmarek @Raalsky @Blaizzy

Labels

  • design: Includes a design discussion
  • discussion: In a discussion stage
  • feature: Is an improvement or enhancement
  • logging: Related to the `LoggerConnector` and `log()`
  • priority: 2 (Low priority task)
