This repository has been archived by the owner on Oct 31, 2023. It is now read-only.
Is higher compatible with distributed data parallel (DDP)? #98
Labels: wontfix (This will not be worked on)
Comments
related: tristandeleu/pytorch-meta#100
related: #99
I'm afraid not. See answer to #99.
related: tristandeleu/pytorch-meta#116
@albanD is there progress on this? (sorry for the direct tag)
I don't know of any change in DDP towards this recently. But I'm not aware of all that happens there.
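The answers above say `higher` and DDP don't work together; a commonly cited reason is that DDP's gradient-synchronization hooks act on the module's real parameters, while `higher` replaces them with functional copies during the inner loop. The sketch below is an assumption-laden workaround, not an official recipe from either library: run the differentiable inner loop per process without wrapping the model in DDP, then average the outer (meta) gradients manually with `all_reduce`. For simplicity it uses a hand-rolled linear model and a single-process `gloo` group; in real use each process would run under `torchrun` with its own data shard.

```python
import os
import torch
import torch.distributed as dist


def meta_step():
    # Single-process group for illustration (gloo, world_size=1).
    # Under torchrun, rank/world_size would come from the environment.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    torch.manual_seed(0)
    w = torch.randn(3, 1, requires_grad=True)  # meta-parameter

    x_support, y_support = torch.randn(8, 3), torch.randn(8, 1)
    x_query, y_query = torch.randn(8, 3), torch.randn(8, 1)

    # Inner loop: one differentiable SGD step, done with plain autograd.
    # This is the part higher.innerloop_ctx automates; note the model is
    # deliberately NOT wrapped in DDP here.
    inner_loss = ((x_support @ w - y_support) ** 2).mean()
    (g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
    w_fast = w - 0.1 * g  # adapted ("fast") weights, still differentiable

    # Outer loop: meta-loss evaluated through the adapted weights;
    # backward() populates w.grad with the meta-gradient.
    outer_loss = ((x_query @ w_fast - y_query) ** 2).mean()
    outer_loss.backward()

    # Manual gradient sync replaces DDP's backward hooks: sum the
    # meta-gradients across processes, then average.
    dist.all_reduce(w.grad, op=dist.ReduceOp.SUM)
    w.grad /= dist.get_world_size()

    dist.destroy_process_group()
    return w.grad


if __name__ == "__main__":
    print(meta_step().shape)
```

With `world_size=1` the `all_reduce` is a no-op, so the script runs standalone; the point of the structure is that only the outer gradients ever cross process boundaries, which sidesteps DDP's hooks entirely.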
This was referenced Sep 28, 2021
The question referenced the official DDP tutorial: https://pytorch.org/tutorials/intermediate/ddp_tutorial.html