update changelog and deprecation test
DuYicong515 committed Mar 25, 2022
1 parent aa84b4c commit 23a2f5e
Showing 3 changed files with 13 additions and 1 deletion.
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -600,6 +600,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Deprecated passing only the callback state to `Callback.on_load_checkpoint(callback_state)` in favor of passing the callback state to `Callback.load_state_dict` and in 1.8, passing the entire checkpoint dictionary to `Callback.on_load_checkpoint(checkpoint)` ([#11887](https://github.com/PyTorchLightning/pytorch-lightning/pull/11887))


- Deprecated `Trainer.tpu_cores` in favor of `Trainer.num_devices` ([#12437](https://github.com/PyTorchLightning/pytorch-lightning/pull/12437))
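The deprecation pattern this entry describes can be sketched as follows. This is a minimal, hypothetical stand-in, not the library's actual implementation: the old property emits a `DeprecationWarning` and delegates to the new attribute.

```python
import warnings


class Trainer:
    """Hypothetical stand-in; the real pytorch_lightning.Trainer is far larger."""

    def __init__(self, num_devices: int = 1) -> None:
        self.num_devices = num_devices

    @property
    def tpu_cores(self) -> int:
        # Deprecation shim: warn on access, then delegate to the new API.
        warnings.warn(
            "`Trainer.tpu_cores` is deprecated in v1.6 and will be removed in v1.8."
            " Please use `Trainer.num_devices` instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return self.num_devices
```

Callers that migrate to `Trainer.num_devices` get the same value without the warning.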


### Removed

- Removed deprecated parameter `method` in `pytorch_lightning.utilities.model_helpers.is_overridden` ([#10507](https://github.com/PyTorchLightning/pytorch-lightning/pull/10507))
@@ -811,6 +814,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Removed `AcceleratorConnector.parallel_devices` property ([#12075](https://github.com/PyTorchLightning/pytorch-lightning/pull/12075))


- Removed `AcceleratorConnector.tpu_cores` property ([#12437](https://github.com/PyTorchLightning/pytorch-lightning/pull/12437))


### Fixed

- Fixed an issue where `ModelCheckpoint` could delete older checkpoints when `dirpath` has changed during resumed training ([#12045](https://github.com/PyTorchLightning/pytorch-lightning/pull/12045))
2 changes: 1 addition & 1 deletion pytorch_lightning/callbacks/xla_stats_monitor.py
@@ -76,7 +76,7 @@ def on_train_start(self, trainer: "pl.Trainer", pl_module: "pl.LightningModule")
        if not isinstance(trainer.accelerator, TPUAccelerator):
            raise MisconfigurationException(
                "You are using XLAStatsMonitor but are not running on TPU."
-                f" The Trainer accelerator type is set to {trainer.accelerator.name().upper()}."
+                f" The accelerator type is set to {trainer.accelerator.name().upper()}."
            )

device = trainer.strategy.root_device
6 changes: 6 additions & 0 deletions tests/accelerators/test_tpu.py
@@ -103,6 +103,12 @@ def test_accelerator_tpu(accelerator, devices):
    assert isinstance(trainer.accelerator, TPUAccelerator)
    assert isinstance(trainer.strategy, TPUSpawnStrategy)
    assert trainer.num_devices == 8
+    with pytest.deprecated_call(
+        match="`Trainer.tpu_cores` is deprecated in v1.6 and will be removed in v1.8. "
+        "Please use `Trainer.devices` instead."
+    ):
+        assert trainer.tpu_cores == 8



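The added test relies on `pytest.deprecated_call`, which also works as a plain context manager outside a test session: it fails if the enclosed code does not raise a `DeprecationWarning` (or `PendingDeprecationWarning`) whose message matches the `match` regex. A self-contained sketch, where `legacy_tpu_cores` is a hypothetical helper standing in for the deprecated property:

```python
import warnings

import pytest


def legacy_tpu_cores() -> int:
    # Hypothetical stand-in for the deprecated `Trainer.tpu_cores` accessor.
    warnings.warn(
        "`tpu_cores` is deprecated. Please use `num_devices` instead.",
        DeprecationWarning,
    )
    return 8


# The context manager raises Failed if no matching DeprecationWarning is emitted.
with pytest.deprecated_call(match="deprecated"):
    assert legacy_tpu_cores() == 8
```

Asserting on the returned value inside the `with` block (rather than evaluating a bare comparison) checks both the warning and the delegated result in one place.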
@RunIf(tpu=True)
Expand Down
