Remove unused on_train_epoch_end hook in accelerator (#9035)
ananthsub authored Aug 22, 2021
1 parent 930b81f commit 8a93173
Showing 2 changed files with 3 additions and 4 deletions.
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -166,6 +166,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Removed deprecated `connect_precision_plugin` and `connect_training_type_plugin` from `Accelerator` ([#9019](https://github.com/PyTorchLightning/pytorch-lightning/pull/9019))


- Removed `on_train_epoch_end` from `Accelerator` ([#9035](https://github.com/PyTorchLightning/pytorch-lightning/pull/9035))


### Fixed

- Ensure the existence of `DDPPlugin._sync_dir` in `reconciliate_processes` ([#8939](https://github.com/PyTorchLightning/pytorch-lightning/pull/8939))
4 changes: 0 additions & 4 deletions pytorch_lightning/accelerators/accelerator.py
@@ -479,10 +479,6 @@ def restore_checkpoint_after_pre_dispatch(self) -> bool:
def update_global_step(self, total_batch_idx: int, current_global_step: int) -> int:
return self.training_type_plugin.update_global_step(total_batch_idx, current_global_step)

def on_train_epoch_end(self) -> None:
"""Hook to do something on the end of an training epoch."""
pass

def on_train_start(self) -> None:
"""Called when train begins."""
return self.training_type_plugin.on_train_start()

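For context, the `Accelerator` hooks that remain (such as `on_train_start` in the hunk above) are thin delegations to the training-type plugin, whereas the removed `on_train_epoch_end` delegated nothing — its body was just `pass` — so deleting it changes no behavior. A minimal sketch of that delegation pattern, using hypothetical simplified classes rather than the real pytorch_lightning API:

```python
class TrainingTypePlugin:
    """Hypothetical stand-in for a pytorch_lightning training-type plugin."""

    def __init__(self) -> None:
        self.calls = []  # record which hooks actually reach the plugin

    def on_train_start(self) -> None:
        self.calls.append("on_train_start")


class Accelerator:
    """Sketch: each accelerator hook simply forwards to the plugin."""

    def __init__(self, training_type_plugin: TrainingTypePlugin) -> None:
        self.training_type_plugin = training_type_plugin

    def on_train_start(self) -> None:
        # Delegates to the plugin, mirroring the pattern in accelerator.py.
        return self.training_type_plugin.on_train_start()


accel = Accelerator(TrainingTypePlugin())
accel.on_train_start()
print(accel.training_type_plugin.calls)
```

A hook that forwards nothing (an empty `pass` body, as in the removed `on_train_epoch_end`) never shows up in `calls`, which is why dropping it is safe.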