four4fish changed the title from “Registry Strategy with strategy distributed backend/ strategy base name” to “Registry Strategy with strategy base name” on Jan 20, 2022
Proposed refactor
Now we have `{strategy_name}+{parameter}` registries, but no plain `{strategy_name}` entry for most strategies. For example, “ddp_find_unused_parameters_false” is registered but “ddp” is not. The accelerator_connector currently uses both the strategy_registry and enums to match strings to strategies.
part of #10416
Motivation
Unify strategy registry behavior, remove the enums, and enable simplification of the accelerator_connector logic.
Pitch
Step 1: Define `distributed_backend` for every strategy, like in `DDPStrategy`: https://github.com/PyTorchLightning/pytorch-lightning/blob/f41d1e5e5ebb7040a39d137695e818cada9a9234/pytorch_lightning/strategies/ddp.py#L83
Step 2: Register each strategy under its `strategy.distributed_backend` name.
Step 3 (part of #11449): Simplify the accelerator_connector logic.
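The three steps above could look roughly like this. This is only a sketch with hypothetical names (the registry class, `register`, and `get_strategy` are illustrative, not the actual PyTorch Lightning API); the point is that every strategy declares a base name and the connector resolves any string through the registry alone, without enums:

```python
class _StrategyRegistry(dict):
    """Illustrative registry: maps a string name to a strategy class plus default kwargs."""

    def register(self, name, cls, **init_params):
        self[name] = (cls, init_params)

    def get_strategy(self, name):
        cls, init_params = self[name]
        return cls(**init_params)


STRATEGY_REGISTRY = _StrategyRegistry()


class DDPStrategy:
    # Step 1: every strategy defines its base name
    # (`distributed_backend`, as in the linked DDPStrategy source).
    distributed_backend = "ddp"

    def __init__(self, find_unused_parameters=True):
        self.find_unused_parameters = find_unused_parameters


# Step 2: register the base name too, not only the name+parameter variants.
STRATEGY_REGISTRY.register(DDPStrategy.distributed_backend, DDPStrategy)
STRATEGY_REGISTRY.register(
    "ddp_find_unused_parameters_false", DDPStrategy, find_unused_parameters=False
)

# Step 3: accelerator_connector can now map any string to a strategy
# through the registry alone, so the enum lookup path can be dropped.
strategy = STRATEGY_REGISTRY.get_strategy("ddp")
```

With both “ddp” and “ddp_find_unused_parameters_false” registered, the string-to-strategy matching collapses into a single registry lookup.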
Additional context
If you enjoy Lightning, check out our other projects! ⚡
Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
Lite: Enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.
cc @justusschock @awaelchli @akihironitta @rohitgr7