Commit bed73b8
Merge branch 'lite' into lite-deepspeed-hack
awaelchli authored Oct 18, 2021
2 parents 0f4b790 + baedcf7 commit bed73b8
Showing 43 changed files with 728 additions and 319 deletions.
10 changes: 10 additions & 0 deletions CHANGELOG.md
@@ -194,6 +194,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added `strategy` argument to Trainer ([#8597](https://github.com/PyTorchLightning/pytorch-lightning/pull/8597))


- LightningLite:
    * Added `PrecisionPlugin.forward_context`, making it the default implementation for all `{train,val,test,predict}_step_context()` methods ([#9988](https://github.com/PyTorchLightning/pytorch-lightning/pull/9988))
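As an illustration of the entry above, here is a minimal sketch of the pattern, with method names taken from the entry itself; this is not the actual Lightning source:

```python
import contextlib

import torch


class PrecisionPlugin:
    """Sketch: forward_context is the single default that all
    {train,val,test,predict}_step_context() methods fall back to."""

    @contextlib.contextmanager
    def forward_context(self):
        # Default: run the forward pass in full precision (no-op context).
        yield

    # Each per-phase context delegates to forward_context(), so a
    # subclass only needs to override one method.
    def train_step_context(self):
        return self.forward_context()

    def val_step_context(self):
        return self.forward_context()

    def test_step_context(self):
        return self.forward_context()

    def predict_step_context(self):
        return self.forward_context()


class NativeMixedPrecisionPlugin(PrecisionPlugin):
    """Sketch: a mixed-precision plugin customizes only forward_context."""

    @contextlib.contextmanager
    def forward_context(self):
        with torch.cuda.amp.autocast():
            yield
```

A caller would then wrap the forward pass in `with plugin.train_step_context(): ...`, and the precision behavior follows from whichever plugin is active.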


### Changed

- Setting `Trainer(accelerator="ddp_cpu")` now does not spawn a subprocess if `num_processes` is kept `1` along with `num_nodes > 1` ([#9603](https://github.com/PyTorchLightning/pytorch-lightning/pull/9603)).
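For illustration, the configuration this entry refers to looks roughly like the following sketch:

```python
from pytorch_lightning import Trainer

# With num_processes=1 and num_nodes > 1, each node now runs its single
# process directly instead of spawning a subprocess first.
trainer = Trainer(accelerator="ddp_cpu", num_processes=1, num_nodes=2)
```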
@@ -527,11 +531,15 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed `broadcast` in `DDPPlugin` and `DDPSpawnPlugin` to respect the `src` input ([#9691](https://github.com/PyTorchLightning/pytorch-lightning/pull/9691))


- Fixed `self.log(on_epoch=True)` for the `on_batch_start` and `on_train_batch_start` hooks ([#9780](https://github.com/PyTorchLightning/pytorch-lightning/pull/9780))


- Fixed restoring training state during `trainer.fit` only ([#9413](https://github.com/PyTorchLightning/pytorch-lightning/pull/9413))


- Fixed DeepSpeed and Lightning both calling the scheduler ([#9788](https://github.com/PyTorchLightning/pytorch-lightning/pull/9788))


- Fixed missing arguments when saving hyperparameters from the parent class but not from the child class ([#9800](https://github.com/PyTorchLightning/pytorch-lightning/pull/9800))
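A sketch of the parent/child pattern this fix concerns (class and argument names are illustrative):

```python
from pytorch_lightning import LightningModule


class Parent(LightningModule):
    def __init__(self, parent_arg: int = 1, **kwargs):
        super().__init__()
        # save_hyperparameters() is called only in the parent; the fix
        # ensures arguments passed to the child constructor are captured too.
        self.save_hyperparameters()


class Child(Parent):
    def __init__(self, child_arg: int = 2, **kwargs):
        super().__init__(**kwargs)


model = Child(child_arg=5)
print(model.hparams)  # expected to include child_arg after the fix
```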


@@ -546,6 +554,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Fixed issue with non-init dataclass fields in `apply_to_collection` ([#9963](https://github.com/PyTorchLightning/pytorch-lightning/issues/9963)) (see the sketch below)

- Reset `val_dataloader` in `tuner/batch_size_scaling` for binsearch ([#9975](https://github.com/PyTorchLightning/pytorch-lightning/pull/9975))
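A sketch of the dataclass case the `apply_to_collection` fix addresses (names are illustrative):

```python
from dataclasses import dataclass, field

import torch
from pytorch_lightning.utilities.apply_func import apply_to_collection


@dataclass
class Batch:
    x: torch.Tensor
    # A non-init field like this previously made apply_to_collection fail
    # when it tried to reconstruct the dataclass.
    size: int = field(init=False, default=0)


batch = Batch(x=torch.zeros(2, 3))
result = apply_to_collection(batch, torch.Tensor, lambda t: t.float())
```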


## [1.4.9] - 2021-09-30

2 changes: 1 addition & 1 deletion docs/source/advanced/sequences.rst
@@ -1,6 +1,6 @@

Sequential Data
-================
+===============

Truncated Backpropagation Through Time
--------------------------------------
65 changes: 65 additions & 0 deletions docs/source/api_references.rst
@@ -67,6 +67,71 @@ Loggers API
test_tube
wandb

Loop API
--------

Base Classes
^^^^^^^^^^^^

.. currentmodule:: pytorch_lightning.loops

.. autosummary::
    :toctree: api
    :nosignatures:
    :template: classtemplate.rst

    ~base.Loop
    ~dataloader.dataloader_loop.DataLoaderLoop


Default Loop Implementations
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Training
""""""""

.. currentmodule:: pytorch_lightning.loops

.. autosummary::
    :toctree: api
    :nosignatures:
    :template: classtemplate.rst

    FitLoop
    ~epoch.TrainingEpochLoop
    ~batch.TrainingBatchLoop
    ~optimization.OptimizerLoop
    ~optimization.ManualOptimization


Validation and Testing
""""""""""""""""""""""

.. currentmodule:: pytorch_lightning.loops

.. autosummary::
    :toctree: api
    :nosignatures:
    :template: classtemplate.rst

    ~dataloader.EvaluationLoop
    ~epoch.EvaluationEpochLoop


Prediction
""""""""""

.. currentmodule:: pytorch_lightning.loops

.. autosummary::
    :toctree: api
    :nosignatures:
    :template: classtemplate.rst

    ~dataloader.PredictionLoop
    ~epoch.PredictionEpochLoop
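To make the relationship between these classes concrete, here is a minimal sketch of the loop contract (`reset`/`advance`/`done`/`run`), inferred from the base-class naming above rather than taken from the Lightning source:

```python
from abc import ABC, abstractmethod


class Loop(ABC):
    """Sketch of the base contract: run() drives reset/advance until done."""

    @property
    @abstractmethod
    def done(self) -> bool:
        """Whether the loop has finished."""

    @abstractmethod
    def reset(self) -> None:
        """Reset internal state before (re)running the loop."""

    @abstractmethod
    def advance(self) -> None:
        """Run one iteration of the loop."""

    def run(self) -> None:
        self.reset()
        while not self.done:
            self.advance()


class CountingLoop(Loop):
    """Toy loop: advances a counter until a limit is reached."""

    def __init__(self, limit: int) -> None:
        self.limit = limit
        self.count = 0

    @property
    def done(self) -> bool:
        return self.count >= self.limit

    def reset(self) -> None:
        self.count = 0

    def advance(self) -> None:
        self.count += 1


loop = CountingLoop(limit=3)
loop.run()
print(loop.count)  # 3
```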


Plugins API
-----------

(The remaining 40 changed files are not shown.)
