Add an is_distributed property to Trainer or LightningModule
#7225
richarddwang started this conversation in Idea pool
Replies: 1 comment 1 reply
-
We already have such an attribute; however, it is not well exposed to the user:
-
I am writing code that should run whether or not data parallelism is used, without changing a single character. So I have to put the cross-entropy loss logic in training_step_end instead of training_step: its input is either the output of training_step, or the outputs gathered from all devices when data parallelism is used. Something like this would be very helpful.
Note: we can't use isinstance(all_device_outputs, list) to check whether training is distributed. If the output of xxx_step is itself a list and DP is not used, it would be mistaken for a DP run.
If that is OK, I would be happy to make a PR.
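To illustrate the pitfall described above, here is a minimal sketch. The names `naive_is_dp` and `TrainerSketch` are hypothetical stand-ins, not part of the Lightning API; the sketch only shows why a type check on the step output is ambiguous, and how an explicit property avoids that.

```python
def naive_is_dp(step_output):
    # Naive check: treats any list-valued step output as gathered
    # multi-device outputs. This is the check the question warns against.
    return isinstance(step_output, list)


class TrainerSketch:
    """Hypothetical stand-in for a trainer that knows its device count."""

    def __init__(self, num_devices: int):
        self.num_devices = num_devices

    @property
    def is_distributed(self) -> bool:
        # Explicit flag derived from configuration, independent of
        # whatever type training_step happens to return.
        return self.num_devices > 1


# A single-device run whose training_step returns per-sample losses as a list:
single_device_output = [0.5, 0.7, 0.3]
print(naive_is_dp(single_device_output))        # True, even though DP is NOT in use
print(TrainerSketch(1).is_distributed)          # False: correct for one device
print(TrainerSketch(2).is_distributed)          # True: correct for multiple devices
```

The point of the property is that the distribution mode is a configuration fact, so it should be exposed directly rather than inferred from output shapes.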