Proposed refactor
Move this logic to individual progress bar callback implementations:
https://github.com/PyTorchLightning/pytorch-lightning/blob/6369e3b77fa3f38613b661517f6361f842f611c9/pytorch_lightning/trainer/trainer.py#L1273-L1275
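For context, the check being referenced is roughly of the following shape (a paraphrase for illustration, not the exact linked lines; `is_global_zero` and `progress_bar_callback` are existing `Trainer` attributes):

```python
# Rough paraphrase of the trainer-side rank check linked above (not the exact source):
# the Trainer silences the progress bar callback on all non-zero ranks.
if not self.is_global_zero and self.progress_bar_callback is not None:
    self.progress_bar_callback.disable()
```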
Motivation
- Simplifies the trainer
- Avoids duplication of this logic between the trainer & spawning plugins. For example, it is replicated in the TPU Spawn strategy: https://github.com/PyTorchLightning/pytorch-lightning/blob/6369e3b77fa3f38613b661517f6361f842f611c9/pytorch_lightning/plugins/training_type/tpu_spawn.py#L158-L159
- Avoids duplication of this logic across the different running stages:
  - Training: listed above
  - Evaluation: https://github.com/PyTorchLightning/pytorch-lightning/blob/6369e3b77fa3f38613b661517f6361f842f611c9/pytorch_lightning/trainer/trainer.py#L1287-L1289
  - Prediction: ???
- Allows custom progress bars to collect information from all ranks.
We underwent a very similar refactor for loggers to remove rank-0 restrictions: #8589, #8608
Pitch
`enable` and `disable` can be internal implementations of the progress bar callback. These flags can be set any time after the `setup` hook runs:
```python
def setup(self, trainer, pl_module, stage) -> None:
    if not trainer.is_global_zero:
        self.disable()
    ...
```
This way, the trainer doesn't have to do any special checks for progress bars in the middle of the training control flow.
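Under this pitch, a custom progress bar would own its enabled/disabled state entirely. A minimal sketch, assuming the current `ProgressBarBase` API (the class name, the `_enabled` attribute, and the print-based rendering are illustrative, not part of Lightning):

```python
from pytorch_lightning.callbacks import ProgressBarBase


class RankAwareBar(ProgressBarBase):
    """Illustrative progress bar that applies the rank-zero restriction itself."""

    def __init__(self):
        super().__init__()
        self._enabled = True  # hypothetical internal flag toggled by enable()/disable()

    def setup(self, trainer, pl_module, stage=None):
        # the rank check moves from the Trainer into the callback
        if not trainer.is_global_zero:
            self.disable()

    def disable(self):
        self._enabled = False

    def enable(self):
        self._enabled = True

    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx):
        super().on_train_batch_end(trainer, pl_module, outputs, batch, batch_idx, dataloader_idx)
        if self._enabled:
            print(f"batch {self.train_batch_idx}/{self.total_train_batches}", end="\r")
```

With the check living in `setup`, the same behaviour applies to training, evaluation, and prediction without the Trainer (or the TPU spawn plugin) having to repeat it.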
Additional context
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @Borda @justusschock @awaelchli @akihironitta @SeanNaren @kaushikb11