This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Is higher compatible with Distributed Data Parallel (DDP)? #98

Closed
brando90 opened this issue Feb 17, 2021 · 8 comments
Labels
wontfix This will not be worked on

Comments

@brando90

Like this one: https://pytorch.org/tutorials/intermediate/ddp_tutorial.html

@brando90

related: tristandeleu/pytorch-meta#100

@brando90

related: #99

@egrefen

egrefen commented Feb 23, 2021

I'm afraid not. See answer to #99.

@egrefen closed this as completed on Feb 23, 2021
@egrefen added the wontfix label on Feb 23, 2021
@brando90

brando90 commented Sep 9, 2021

related: tristandeleu/pytorch-meta#116

@brando90

brando90 commented Sep 9, 2021

@albanD is there progress on this? (sorry for the direct tag)

@albanD

albanD commented Sep 9, 2021

I don't know of any recent changes in DDP towards this, but I'm not aware of everything that happens there.
cc @mrshenli anything that I missed?

@brando90

brando90 commented Sep 28, 2021

> I don't know of any change in DDP towards this recently. But I'm not aware of all that happens there. cc @mrshenli anything that I missed?

Hi!

has there been any progress on this? @mrshenli
