[1.3.7] - 2021-06-22
Fixed
- Fixed a bug where skipping an optimizer while using AMP caused an assertion error (#7975); see the first sketch after this list
- Fixed deprecation messages not showing due to an incorrect `stacklevel` (#8002, #8005)
- Fixed setting a `DistributedSampler` when using a distributed plugin in a custom accelerator (#7814)
- Improved `PyTorchProfiler` Chrome trace names (#8009)
- Fixed moving the best score to device in the `EarlyStopping` callback for TPU devices (#7959); see the second sketch after this list
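
Below is a minimal sketch of the skipped-optimizer scenario referenced by #7975, assuming that "skipping an optimizer" means returning `None` from `training_step` so Lightning skips the backward/optimizer step for that batch; the model, the skip condition, and the trainer flags are illustrative only, not part of the fix itself.

```python
# Sketch of a training_step that sometimes skips the optimizer step while
# native AMP (precision=16) is enabled; this combination previously raised
# an assertion error inside AMP and now runs through.
import torch
from torch import nn
import pytorch_lightning as pl


class SometimesSkips(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        # Hypothetical skip condition: contribute no loss on every other batch.
        if batch_idx % 2 == 0:
            return None  # returning None skips backward/step for this batch
        x, y = batch
        return nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


trainer = pl.Trainer(gpus=1, precision=16, max_epochs=1)
# trainer.fit(SometimesSkips(), train_dataloader)  # DataLoader is user-provided
```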
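
And a minimal sketch of the `EarlyStopping`-on-TPU setup touched by #7959, where the callback's best score is now moved to the TPU device before comparison; the monitored metric name, patience, and `tpu_cores` value are assumptions made for illustration.

```python
# Sketch of an EarlyStopping + TPU run; metric name, patience, and tpu_cores
# are illustrative assumptions.
import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping

early_stop = EarlyStopping(monitor="val_loss", mode="min", patience=3)
trainer = pl.Trainer(tpu_cores=8, callbacks=[early_stop], max_epochs=5)
# trainer.fit(model, datamodule=dm)  # model and datamodule are user-provided
```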
Contributors
@yifuwang @kaushikb11 @ajtritt @carmocca @tchaton