
releasing 2.0.8
Borda authored and lantiga committed Aug 30, 2023
1 parent efa83da commit 8345689
Showing 6 changed files with 9 additions and 32 deletions.
@@ -55,7 +55,7 @@ Organizing your code into Lightning components offers these benefits:
if you know what you are doing, Lightning gives you full control to manage your own
scaling logic, fault-tolerance and even pre-provisioning, all from Python. We even give you
-full flexibility to use tools like `terraform <../../cloud/customize_a_lightning_cluster.html>`_ to optimize cloud clusters for your Lightning apps.
+full flexibility to use tools like :doc:`terraform <../../cloud/customize_a_lightning_cluster>` to optimize cloud clusters for your Lightning apps.

.. collapse:: Integrate into your current workflow tools

7 changes: 2 additions & 5 deletions src/lightning/app/CHANGELOG.md
@@ -4,16 +4,13 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

-## [UnREleased] - 2023-08-DD
+## [2.0.8] - 2023-08-29

-## Canaged
+## Changed

- Change top folder ([#18212](https://github.com/Lightning-AI/lightning/pull/18212))


- Remove `_handle_is_headless` calls in app run loop ([#18362](https://github.com/Lightning-AI/lightning/pull/18362))


### Fixed

- Refactor path to root, preventing a circular import ([#18357](https://github.com/Lightning-AI/lightning/pull/18357))
11 changes: 2 additions & 9 deletions src/lightning/fabric/CHANGELOG.md
@@ -4,24 +4,17 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

-## [UnReleased] - 2023-08-DD
+## [2.0.8] - 2023-08-29

-### Chnaged
+### Changed

- On XLA, avoid setting the global rank before processes have been launched as this will initialize the PJRT computation client in the main process ([#16966](https://github.com/Lightning-AI/lightning/pull/16966))


### Fixed

- Fixed model parameters getting shared between processes when running with `strategy="ddp_spawn"` and `accelerator="cpu"`; this has a necessary memory impact, as parameters are replicated for each process now ([#18238](https://github.com/Lightning-AI/lightning/pull/18238))


- Removed false positive warning when using `fabric.no_backward_sync` with XLA strategies ([#17761](https://github.com/Lightning-AI/lightning/pull/17761))


- Fixed issue where Fabric would not initialize the global rank, world size, and rank-zero-only rank after initialization and before launch ([#16966](https://github.com/Lightning-AI/lightning/pull/16966))


- Fixed FSDP full-precision `param_dtype` training (`16-mixed`, `bf16-mixed` and `32-true` configurations) to avoid FSDP assertion errors with PyTorch < 2.0 ([#18278](https://github.com/Lightning-AI/lightning/pull/18278))


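For context, a hypothetical usage sketch (not part of this commit) of the configuration that #18238 above affects: `strategy="ddp_spawn"` with `accelerator="cpu"`. Only public Fabric API calls are used; the model and data are made up.

```python
import torch
from lightning.fabric import Fabric


def train(fabric: Fabric) -> None:
    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    # After #18238, these parameters are replicated in each spawned process
    # instead of being shared with the main process (the noted memory impact).
    model, optimizer = fabric.setup(model, optimizer)
    batch = fabric.to_device(torch.randn(8, 4))
    loss = model(batch).sum()
    fabric.backward(loss)
    optimizer.step()


# "ddp_spawn" on CPU is the combination fixed above.
fabric = Fabric(accelerator="cpu", strategy="ddp_spawn", devices=2)
fabric.launch(train)
```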
17 changes: 2 additions & 15 deletions src/lightning/pytorch/CHANGELOG.md
@@ -5,33 +5,20 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).


-## [UnRaleased] - 2023-08-DD
+## [2.0.8] - 2023-08-29

-### Chnaged
+### Changed

- On XLA, avoid setting the global rank before processes have been launched as this will initialize the PJRT computation client in the main process ([#16966](https://github.com/Lightning-AI/lightning/pull/16966))


- Fix inefficiency in rich progress bar ([#18369](https://github.com/Lightning-AI/lightning/pull/18369))


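As a usage aside (assumed standard API, not shown in the diff): the bar touched by #18369 above is the `RichProgressBar` callback, enabled like so:

```python
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import RichProgressBar

# RichProgressBar replaces the default TQDM bar; its per-batch refresh
# logic is what #18369 above makes cheaper.
trainer = Trainer(callbacks=[RichProgressBar()])
```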
### Fixed

- Fixed FSDP full-precision `param_dtype` training (`16-mixed` and `bf16-mixed` configurations) to avoid FSDP assertion errors with PyTorch < 2.0 ([#18278](https://github.com/Lightning-AI/lightning/pull/18278))


- Fixed an issue that prevented the use of custom logger classes without an `experiment` property defined ([#18093](https://github.com/Lightning-AI/lightning/pull/18093))


- Fixed setting the tracking uri in `MLFlowLogger` for logging artifacts to the MLFlow server ([#18395](https://github.com/Lightning-AI/lightning/pull/18395))


- Fixed redundant `iter()` call to dataloader when checking dataloading configuration ([#18415](https://github.com/Lightning-AI/lightning/pull/18415))


- Fixed model parameters getting shared between processes when running with `strategy="ddp_spawn"` and `accelerator="cpu"`; this has a necessary memory impact, as parameters are replicated for each process now ([#18238](https://github.com/Lightning-AI/lightning/pull/18238))


- Properly manage `fetcher.done` with `dataloader_iter` ([#18376](https://github.com/Lightning-AI/lightning/pull/18376))


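And a minimal sketch of the case re-enabled by #18093 above: a custom logger with no `experiment` property. The class name and print-based sinks are invented for illustration.

```python
from lightning.pytorch.loggers import Logger
from lightning.pytorch.utilities import rank_zero_only


class PrintLogger(Logger):
    """A bare-bones logger that deliberately defines no `experiment` property."""

    @property
    def name(self) -> str:
        return "print"

    @property
    def version(self) -> str:
        return "0"

    @rank_zero_only
    def log_hyperparams(self, params) -> None:
        print("hparams:", params)

    @rank_zero_only
    def log_metrics(self, metrics, step=None) -> None:
        print(f"step {step}: {metrics}")
```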
2 changes: 1 addition & 1 deletion src/lightning/pytorch/_graveyard/_torchmetrics.py
@@ -17,7 +17,7 @@ def compare_version(package: str, op: Callable, version: str, use_base_version:
 # https://github.com/Lightning-AI/metrics/blob/v0.7.3/torchmetrics/metric.py#L96
 with contextlib.suppress(AttributeError):
     if hasattr(torchmetrics.utilities.imports, "_compare_version"):
-        torchmetrics.utilities.imports._compare_version = compare_version
+        torchmetrics.utilities.imports._compare_version = compare_version  # type: ignore[assignment]

 with contextlib.suppress(AttributeError):
     if hasattr(torchmetrics.metric, "_compare_version"):
Expand Down
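For readers without the full file: a self-contained sketch of the guard-and-patch idiom used above. The `compare_version` body is a placeholder; only the patching pattern is the point.

```python
import contextlib
from typing import Callable

import torchmetrics


def compare_version(package: str, op: Callable, version: str, use_base_version: bool = False) -> bool:
    ...  # placeholder body for this sketch


# Patch the private torchmetrics hook only if it exists, and swallow
# AttributeError in case the module layout changes in a future release.
with contextlib.suppress(AttributeError):
    if hasattr(torchmetrics.utilities.imports, "_compare_version"):
        # The `# type: ignore[assignment]` added by this commit silences the
        # mypy error raised by assigning over the module-level attribute.
        torchmetrics.utilities.imports._compare_version = compare_version  # type: ignore[assignment]
```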
2 changes: 1 addition & 1 deletion src/version.info
@@ -1 +1 @@
-2.0.7
+2.0.8
