Standard weekly patch release

@SeanNaren released this on 16 Mar 2021 at 20:29

[1.2.4] - 2021-03-16

Changed

  • Changed the default of find_unused_parameters back to True in DDP and DDP Spawn (#6438); see the sketch below for overriding it
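
The snippet below is a minimal sketch, assuming the 1.2-era plugin API, of how to opt back out of unused-parameter detection through DDPPlugin when every parameter is guaranteed to receive a gradient and the detection overhead is unwanted:

```python
# Minimal sketch, assuming the PyTorch Lightning 1.2 plugin API.
import pytorch_lightning as pl
from pytorch_lightning.plugins import DDPPlugin

trainer = pl.Trainer(
    gpus=2,
    accelerator="ddp",
    # find_unused_parameters defaults to True again as of 1.2.4;
    # set it to False for a small speedup when every parameter is
    # guaranteed to receive a gradient in each backward pass.
    plugins=[DDPPlugin(find_unused_parameters=False)],
)
```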

Fixed

  • Exposed DeepSpeed loss-scaling parameters to let users address loss instability (#6115); see the first sketch after this list
  • Fixed DP reduction with collection (#6324)
  • Fixed an issue where the tuner would not tune the learning rate if also tuning the batch size (#4688)
  • Fixed broadcast to use PyTorch's broadcast_object_list and added reduce_decision (#6410)
  • Fixed logger creating directory structure too early in DDP (#6380)
  • Fixed DeepSpeed additional memory use on rank 0 when default device not set early enough (#6460)
  • Fixed DummyLogger.log_hyperparams raising a TypeError when running with fast_dev_run=True (#6398)
  • Fixed an issue with Tuner.scale_batch_size not finding the batch size attribute in the datamodule (#5968); see the tuner sketch after this list
  • Fixed an exception in the layer summary when the model contains torch.jit scripted submodules (#6511)
  • Fixed the train loop configuration being run during Trainer.predict (#6541)
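
For the DeepSpeed change (#6115), the sketch below assumes the loss-scaling arguments exposed on DeepSpeedPlugin mirror DeepSpeed's FP16 config (loss_scale, initial_scale_power, loss_scale_window, hysteresis, min_loss_scale); treat the exact names as an assumption and check the plugin signature in your release:

```python
# Minimal sketch; the loss-scaling argument names are assumed to mirror
# DeepSpeed's FP16 config and may differ in other releases.
import pytorch_lightning as pl
from pytorch_lightning.plugins import DeepSpeedPlugin

trainer = pl.Trainer(
    gpus=2,
    precision=16,
    plugins=[
        DeepSpeedPlugin(
            loss_scale=0,            # 0 enables dynamic loss scaling
            initial_scale_power=16,  # starting scale of 2**16
            loss_scale_window=1000,  # steps between scale adjustments
            hysteresis=2,
            min_loss_scale=1,
        )
    ],
)
```

For the tuner fixes (#5968 and #4688), here is a minimal sketch of a datamodule carrying a batch_size attribute, which is where Tuner.scale_batch_size now looks, with batch-size and learning-rate tuning enabled together:

```python
# Minimal sketch of the tuner fixes; RandomDataModule is a hypothetical
# example class, not part of the library.
import torch
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class RandomDataModule(pl.LightningDataModule):
    def __init__(self, batch_size: int = 32):
        super().__init__()
        # Attribute that Tuner.scale_batch_size locates and scales (#5968).
        self.batch_size = batch_size

    def train_dataloader(self):
        dataset = TensorDataset(torch.randn(512, 32), torch.randn(512, 2))
        return DataLoader(dataset, batch_size=self.batch_size)


# Tuning the batch size no longer prevents the learning-rate finder
# from running (#4688). The model must expose an `lr` or `learning_rate`
# attribute for auto_lr_find to adjust.
trainer = pl.Trainer(auto_scale_batch_size="power", auto_lr_find=True)
# trainer.tune(model, datamodule=RandomDataModule())
```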

Contributors

@awaelchli, @kaushikb11, @Palzer, @SeanNaren, @tchaton

If we forgot someone because their commit email didn't match their GitHub account, let us know :]