
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Mar 27, 2023
1 parent 2b82b98 commit c048fba
Showing 1 changed file with 17 additions and 28 deletions.
45 changes: 17 additions & 28 deletions src/lightning_fabric/CHANGELOG.md
@@ -4,8 +4,7 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

-
-## [1.9.4] - 2023-03-01
+## \[1.9.4\] - 2023-03-01

### Added

@@ -17,21 +16,18 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed DDP spawn hang on TPU Pods ([#16844](https://github.com/Lightning-AI/lightning/pull/16844))
- Fixed an error when passing `find_usable_cuda_devices(num_devices=-1)` ([#16866](https://github.com/Lightning-AI/lightning/pull/16866))
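For context on the `find_usable_cuda_devices(num_devices=-1)` fix: `-1` is a sentinel meaning "every usable device". A minimal pure-Python sketch of resolving that sentinel — the function body and the `available` parameter are illustrative stand-ins, not Fabric's actual implementation:

```python
def find_usable_devices(num_devices=-1, available=None):
    """Return indices of usable devices; -1 means 'all that can be found'."""
    # `available` stands in for the device indices a CUDA probe would discover.
    available = available if available is not None else [0, 1]
    if num_devices == -1:
        # The essence of the fix: resolve the -1 sentinel into a concrete
        # count before it is used anywhere as a number of devices.
        num_devices = len(available)
    if num_devices > len(available):
        raise RuntimeError(
            f"Requested {num_devices} devices but only {len(available)} are usable."
        )
    return available[:num_devices]

print(find_usable_devices(-1, available=[0, 1, 2]))  # [0, 1, 2]
print(find_usable_devices(2, available=[0, 1, 2]))   # [0, 1]
```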

-
-## [1.9.3] - 2023-02-21
+## \[1.9.3\] - 2023-02-21

### Fixed

- Fixed an issue causing a wrong environment plugin to be selected when `accelerator=tpu` and `devices > 1` ([#16806](https://github.com/Lightning-AI/lightning/pull/16806))
- Fixed parsing of defaults for `--accelerator` and `--precision` in Fabric CLI when `accelerator` and `precision` are set to non-default values in the code ([#16818](https://github.com/Lightning-AI/lightning/pull/16818))

-
-## [1.9.2] - 2023-02-15
+## \[1.9.2\] - 2023-02-15

- Fixed an attribute error and improved input validation for invalid strategy types being passed to Fabric ([#16693](https://github.com/Lightning-AI/lightning/pull/16693))
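The entry above describes replacing an opaque `AttributeError` with explicit validation of the `strategy` argument. A rough sketch of that validation pattern, assuming a hypothetical supported-strategy set (not Fabric's real registry):

```python
# Illustrative subset only; the real set of strategies is defined by Fabric.
SUPPORTED_STRATEGIES = {"ddp", "ddp_spawn", "deepspeed", "dp"}

def validate_strategy(strategy):
    """Reject bad strategy values with a clear error instead of a late AttributeError."""
    if isinstance(strategy, str):
        if strategy not in SUPPORTED_STRATEGIES:
            raise ValueError(
                f"Unknown strategy {strategy!r}. "
                f"Choose from: {sorted(SUPPORTED_STRATEGIES)}"
            )
        return strategy
    # Non-string, non-Strategy inputs fail fast with a descriptive type error.
    raise TypeError(
        f"`strategy` must be a string, got {type(strategy).__name__}."
    )

print(validate_strategy("ddp"))  # ddp
```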

-
-## [1.9.1] - 2023-02-10
+## \[1.9.1\] - 2023-02-10

### Fixed

@@ -41,8 +37,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Fixed the batch_sampler reference for DataLoaders wrapped with XLA's MpDeviceLoader ([#16571](https://github.com/Lightning-AI/lightning/pull/16571))
- Fixed an import error when `torch.distributed` is not available ([#16658](https://github.com/Lightning-AI/lightning/pull/16658))

-
-## [1.9.0] - 2023-01-17
+## \[1.9.0\] - 2023-01-17

### Added

@@ -54,12 +49,12 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Added basic support for LightningModules ([#16048](https://github.com/Lightning-AI/lightning/issues/16048))
- Added support for managing callbacks via `Fabric(callbacks=...)` and emitting events through `Fabric.call()` ([#16074](https://github.com/Lightning-AI/lightning/issues/16074))
- Added Logger support ([#16121](https://github.com/Lightning-AI/lightning/issues/16121))
-  * Added `Fabric(loggers=...)` to support different Logger frameworks in Fabric
-  * Added `Fabric.log` for logging scalars using multiple loggers
-  * Added `Fabric.log_dict` for logging a dictionary of multiple metrics at once
-  * Added `Fabric.loggers` and `Fabric.logger` attributes to access the individual logger instances
-  * Added support for calling `self.log` and `self.log_dict` in a LightningModule when using Fabric
-  * Added access to `self.logger` and `self.loggers` in a LightningModule when using Fabric
+  - Added `Fabric(loggers=...)` to support different Logger frameworks in Fabric
+  - Added `Fabric.log` for logging scalars using multiple loggers
+  - Added `Fabric.log_dict` for logging a dictionary of multiple metrics at once
+  - Added `Fabric.loggers` and `Fabric.logger` attributes to access the individual logger instances
+  - Added support for calling `self.log` and `self.log_dict` in a LightningModule when using Fabric
+  - Added access to `self.logger` and `self.loggers` in a LightningModule when using Fabric
- Added `lightning_fabric.loggers.TensorBoardLogger` ([#16121](https://github.com/Lightning-AI/lightning/issues/16121))
- Added `lightning_fabric.loggers.CSVLogger` ([#16346](https://github.com/Lightning-AI/lightning/issues/16346))
- Added support for a consistent `.zero_grad(set_to_none=...)` on the wrapped optimizer regardless of which strategy is used ([#16275](https://github.com/Lightning-AI/lightning/issues/16275))
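The logging additions in the 1.9.0 hunk above amount to a fan-out pattern: `Fabric.log` and `Fabric.log_dict` forward the same metrics to every configured logger, and `Fabric.logger` is a convenience accessor. A pure-Python sketch of that pattern — `FabricSketch` and `ListLogger` are stand-ins, not Fabric's real classes:

```python
class ListLogger:
    """Stand-in logger that records metrics in memory."""

    def __init__(self):
        self.history = []

    def log_metrics(self, metrics, step=None):
        self.history.append((step, dict(metrics)))


class FabricSketch:
    """Sketch of the Fabric(loggers=...) / log / log_dict surface."""

    def __init__(self, loggers=()):
        self.loggers = list(loggers)

    @property
    def logger(self):
        # Convenience accessor for the first (often only) logger.
        return self.loggers[0]

    def log(self, name, value, step=None):
        # A scalar log is just a one-entry dict log.
        self.log_dict({name: value}, step=step)

    def log_dict(self, metrics, step=None):
        # Fan the same metrics out to every configured logger.
        for logger in self.loggers:
            logger.log_metrics(metrics, step=step)


fabric = FabricSketch(loggers=[ListLogger(), ListLogger()])
fabric.log("loss", 0.25, step=0)
print(fabric.logger.history)  # [(0, {'loss': 0.25})]
```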
@@ -83,39 +78,33 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
- Restored sampling parity between PyTorch and Fabric dataloaders when using the `DistributedSampler` ([#16101](https://github.com/Lightning-AI/lightning/issues/16101))
- Fixes an issue where the error message wouldn't tell the user the real value that was passed through the CLI ([#16334](https://github.com/Lightning-AI/lightning/issues/16334))

-
-## [1.8.6] - 2022-12-21
+## \[1.8.6\] - 2022-12-21

- minor cleaning

-
-## [1.8.5] - 2022-12-15
+## \[1.8.5\] - 2022-12-15

- minor cleaning

-
-## [1.8.4] - 2022-12-08
+## \[1.8.4\] - 2022-12-08

### Fixed

- Fixed `shuffle=False` having no effect when using DDP/DistributedSampler ([#15931](https://github.com/Lightning-AI/lightning/issues/15931))
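The `shuffle=False` fix above concerns distributed sharding: each rank must honor the shuffle flag while still drawing disjoint shards. A simplified sketch of DistributedSampler-style index partitioning that respects `shuffle` — the function is illustrative, not PyTorch's implementation:

```python
import random

def distributed_indices(dataset_len, num_replicas, rank, shuffle, seed=0):
    """Sketch of DistributedSampler-style sharding that honors `shuffle`."""
    indices = list(range(dataset_len))
    if shuffle:
        # Every rank shuffles with the same seed so the shards stay disjoint.
        random.Random(seed).shuffle(indices)
    # shuffle=False must leave the order untouched -- the behavior the fix
    # restored -- and each rank then takes every num_replicas-th index.
    return indices[rank::num_replicas]

print(distributed_indices(8, num_replicas=2, rank=0, shuffle=False))  # [0, 2, 4, 6]
```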

-
-## [1.8.3] - 2022-11-22
+## \[1.8.3\] - 2022-11-22

### Changed

- Temporarily removed support for Hydra multi-run ([#15737](https://github.com/Lightning-AI/lightning/pull/15737))

-
-## [1.8.2] - 2022-11-17
+## \[1.8.2\] - 2022-11-17

### Fixed

- Fixed the automatic fallback from `LightningLite(strategy="ddp_spawn", ...)` to `LightningLite(strategy="ddp", ...)` when on an LSF cluster ([#15103](https://github.com/PyTorchLightning/pytorch-lightning/issues/15103))

-
-## [1.8.1] - 2022-11-10
+## \[1.8.1\] - 2022-11-10

### Fixed

