Remove AcceleratorConnector.parallel_device_ids and deprecate Trainer.data_parallel_device_ids #12051
Proposed refactor
Remove the property AcceleratorConnector.parallel_device_ids and migrate its usage:
https://github.com/PyTorchLightning/pytorch-lightning/blob/550d3a640d1bfeb731a16b2996d3160f0f7eb071/pytorch_lightning/trainer/connectors/accelerator_connector.py#L830-L832
Deprecate the property Trainer.data_parallel_device_ids (a rough sketch follows below):
https://github.com/PyTorchLightning/pytorch-lightning/blob/550d3a640d1bfeb731a16b2996d3160f0f7eb071/pytorch_lightning/trainer/trainer.py#L2043-L2047
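As a hypothetical illustration of what deprecating the property could look like (the stub class, warning message, and device-filtering logic are assumptions for the sketch, not Lightning's actual implementation):

```python
from typing import List, Optional

import torch
from pytorch_lightning.utilities import rank_zero_deprecation


class _TrainerStub:
    """Illustrative stand-in for Trainer; the real property would live on Trainer itself."""

    def __init__(self, parallel_devices: Optional[List[torch.device]]) -> None:
        self._parallel_devices = parallel_devices

    @property
    def data_parallel_device_ids(self) -> Optional[List[int]]:
        rank_zero_deprecation(
            "`Trainer.data_parallel_device_ids` is deprecated."
            " Derive the GPU indices from `Trainer.strategy.parallel_devices` instead."
        )
        if self._parallel_devices is None:
            return None
        # Keep only CUDA devices and return their indices.
        return [d.index for d in self._parallel_devices if d.type == "cuda"]


stub = _TrainerStub([torch.device("cuda", 0), torch.device("cuda", 1)])
print(stub.data_parallel_device_ids)  # [0, 1], with a deprecation warning emitted
```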
Alternative
parallel_device_ids is meant to return GPU device indices; the logic below can be used instead.
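A minimal sketch of such logic, assuming a trainer whose strategy exposes parallel_devices as a list of torch.device objects; the accelerator/devices arguments are only illustrative:

```python
from pytorch_lightning import Trainer

# Illustrative configuration; any multi-GPU setup works the same way.
trainer = Trainer(accelerator="gpu", devices=2)

# Equivalent of the old `parallel_device_ids` / `data_parallel_device_ids`:
# the indices of the CUDA devices the strategy is configured to use.
parallel_devices = trainer.strategy.parallel_devices or []
device_ids = [d.index for d in parallel_devices if d.type == "cuda"]
print(device_ids)  # e.g. [0, 1]
```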
Motivation
Part of the follow-up work in #11449 is to deprecate unused properties in accelerator_connector.
The accelerator_connector properties are not meant to be public, so their internal usage should be migrated. The only internal usage, in logger_connector, is planned for removal in v1.6:
https://github.com/PyTorchLightning/pytorch-lightning/blob/bc191af178474eccdd912059f728cb4db22dd0d8/pytorch_lightning/trainer/connectors/logger_connector/logger_connector.py#L226
Trainer.data_parallel_device_ids would also be deprecated in favor of getting the GPU device indices from Trainer.strategy.parallel_devices directly. Its only internal usage is planned for removal in v1.6; the other usages are in tests:
https://github.com/PyTorchLightning/pytorch-lightning/blob/bc191af178474eccdd912059f728cb4db22dd0d8/pytorch_lightning/callbacks/gpu_stats_monitor.py#L136
Pitch
https://github.com/PyTorchLightning/pytorch-lightning/blob/550d3a640d1bfeb731a16b2996d3160f0f7eb071/pytorch_lightning/trainer/trainer.py#L2043-L2047
https://github.com/PyTorchLightning/pytorch-lightning/blob/4c4b9d540f065c41ad288a78110cecbf45b96409/pytorch_lightning/trainer/connectors/logger_connector/logger_connector.py#L226
cc @justusschock @awaelchli @rohitgr7 @tchaton @akihironitta