[Fix] Fix PyTorch1.11 Dist Remove _sync_params #1816
Conversation
We should involve downstream repos to test this PR.
if (getattr(self, 'require_forward_param_sync', False)
        and self.require_forward_param_sync):
`require_forward_param_sync` has been an attribute of DDP since torch 1.2.0, so do we still need to use `getattr` to get it rather than `self.require_forward_param_sync`?
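For context, a minimal sketch contrasting the two access styles under discussion; the helper names are hypothetical and not part of mmcv:

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel


def needs_param_sync_defensive(module: nn.Module) -> bool:
    # Style used in the diff: falls back to False if the attribute is
    # missing (e.g. a non-DDP wrapper or a very old torch version).
    return bool(getattr(module, 'require_forward_param_sync', False))


def needs_param_sync_direct(module: DistributedDataParallel) -> bool:
    # Style suggested in the review: DDP has exposed
    # `require_forward_param_sync` since torch 1.2.0, so direct access
    # works on any supported torch version.
    return module.require_forward_param_sync
```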
Please @luopeichao have a look.
It works in parrots. LGTM.
And parrots also supports `self.require_forward_param_sync`.
mmcv/mmcv/parallel/distributed.py, line 72 in 1a2f174
Motivation
PyTorch 1.11 removed DDP's `_sync_params` method, so the code in `mmcv/parallel/distributed.py` that still calls it fails on PyTorch >= 1.11.
Modification
As above.
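A minimal sketch of the kind of version guard this fix implies, assuming PyTorch >= 1.11 handles buffer sync through the private `_check_sync_bufs_pre_fwd`/`_sync_buffers` pair instead of `_sync_params`; this is illustrative, not the exact patch:

```python
from torch.nn.parallel import DistributedDataParallel

from mmcv.utils import TORCH_VERSION, digit_version


class _SyncSketch(DistributedDataParallel):
    """Illustrative subclass only; not the actual MMDistributedDataParallel."""

    def _sync_before_forward(self):
        if ('parrots' not in TORCH_VERSION
                and digit_version(TORCH_VERSION) >= digit_version('1.11.0')):
            # PyTorch >= 1.11 removed DDP._sync_params; buffer sync is
            # assumed to go through the private _sync_buffers helper.
            if self._check_sync_bufs_pre_fwd():
                self._sync_buffers()
        else:
            # Older PyTorch (and parrots) still expose _sync_params.
            if (getattr(self, 'require_forward_param_sync', False)
                    and self.require_forward_param_sync):
                self._sync_params()
```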
Checklist
Before PR:
After PR: