Support auto_fp16 using torch.cuda.amp when PyTorch >= 1.6.0 #951
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #951      +/-   ##
==========================================
- Coverage   65.64%   64.97%   -0.67%
==========================================
  Files         149      151       +2
  Lines        9455     9674     +219
  Branches     1722     1755      +33
==========================================
+ Hits         6207     6286      +79
- Misses       2928     3062     +134
- Partials      320      326       +6
```
Benchmark of MMDetection3D with PyTorch 1.8:
Benchmark of MMDetection3D with PyTorch 1.5:
Benchmark of MMDetection with PyTorch 1.8:
Benchmark of MMDetection with PyTorch 1.5:
Verified in MMSeg
"""Copy updated params from fp32 weight copy to fp16 model.""" | ||
for fp16_param, fp32_param in zip(fp16_net.parameters(), fp32_weights): | ||
fp16_param.data.copy_(fp32_param.data) | ||
if TORCH_VERSION != 'parrots' and TORCH_VERSION >= '1.6.0': |
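For context, a minimal sketch of the manual fp16 update cycle this helper belongs to (the pre-1.6 code path). The companion `copy_grads_to_fp32` and the surrounding `manual_fp16_step` are reconstructed for illustration and need not match mmcv's exact code:

```python
def copy_grads_to_fp32(fp16_net, fp32_weights):
    """Companion of the helper above: copy gradients from the fp16 model
    onto the fp32 master weights before the optimizer step."""
    for fp32_param, fp16_param in zip(fp32_weights, fp16_net.parameters()):
        if fp16_param.grad is not None:
            if fp32_param.grad is None:
                fp32_param.grad = fp32_param.data.new(fp32_param.size())
            fp32_param.grad.copy_(fp16_param.grad)


def manual_fp16_step(fp16_net, fp32_weights, optimizer, loss, loss_scale=512.):
    # Backward through the fp16 model with a scaled loss, so small
    # gradients do not underflow in half precision.
    (loss * loss_scale).backward()
    copy_grads_to_fp32(fp16_net, fp32_weights)
    # Un-scale in fp32, step on the master weights, then sync them back.
    for param in fp32_weights:
        if param.grad is not None:
            param.grad.div_(loss_scale)
    optimizer.step()
    copy_params_to_fp16(fp16_net, fp32_weights)
```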
Last comment: The indentation is a little bit annoying. Can we create two hooks first, namely `PT16Fp16OptimizerHook` and `PT15Fp16OptimizerHook`, then assign one of them to `Fp16OptimizerHook` according to the version?
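A sketch of that suggestion, using the hook names from the comment; the class bodies are placeholders rather than mmcv's implementation, and `TORCH_VERSION` is stubbed from `torch.__version__`:

```python
import torch

TORCH_VERSION = torch.__version__  # stand-in for mmcv.utils.TORCH_VERSION


class PT16Fp16OptimizerHook:
    """fp16 training backed by torch.cuda.amp (PyTorch >= 1.6.0)."""

    def __init__(self, loss_scale=512.):
        # GradScaler provides loss scaling and inf/nan step-skipping natively.
        self.loss_scaler = torch.cuda.amp.GradScaler(init_scale=loss_scale)


class PT15Fp16OptimizerHook:
    """Manual fp16 training with fp32 master weights (PyTorch < 1.6.0)."""

    def __init__(self, loss_scale=512.):
        self.loss_scale = loss_scale


# A single top-level assignment replaces indenting both class bodies under
# the version check. The string comparison mirrors the diff above; a real
# implementation would parse the version properly.
if TORCH_VERSION != 'parrots' and TORCH_VERSION >= '1.6.0':
    Fp16OptimizerHook = PT16Fp16OptimizerHook
else:
    Fp16OptimizerHook = PT15Fp16OptimizerHook
```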
This PR enables PyTorch's official implementation of automatic mixed-precision training (`torch.cuda.amp`). It replaces the original pull request due to an accident.
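For reference, the standard `torch.cuda.amp` training pattern this approach builds on (plain PyTorch usage; the model, optimizer, and data below are stand-ins):

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(16, 4).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scaler = torch.cuda.amp.GradScaler()

for _ in range(10):
    inputs = torch.randn(8, 16, device='cuda')
    targets = torch.randn(8, 4, device='cuda')
    optimizer.zero_grad()
    # autocast runs the forward pass in mixed precision.
    with torch.cuda.amp.autocast():
        loss = F.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()  # scale loss to avoid fp16 gradient underflow
    scaler.step(optimizer)         # unscales grads, skips the step on inf/nan
    scaler.update()                # adjusts the scale factor for the next iter
```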