
Merge branch 'master' into bugfix/logger-attributeerror
awaelchli authored Aug 18, 2022
2 parents 7d5016d + 047f0aa commit 5880e28
Showing 8 changed files with 24 additions and 150 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci-pytorch-test-conda.yml
@@ -27,7 +27,7 @@ jobs:
- {python-version: "3.8", pytorch-version: "1.10"}
- {python-version: "3.9", pytorch-version: "1.11"}
- {python-version: "3.9", pytorch-version: "1.12"}
-      timeout-minutes: 30
+      timeout-minutes: 40

steps:
- name: Workaround for https://github.com/actions/checkout/issues/760
2 changes: 1 addition & 1 deletion .github/workflows/docs-deploy.yml
@@ -1,7 +1,7 @@
name: "Deploy Docs"
on:
push:
-    branches: [master]
+    branches: ["release/app"]

jobs:
# https://github.com/marketplace/actions/deploy-to-github-pages
@@ -58,64 +58,3 @@ Run the Lightning App on the cloud:
.. code:: bash

    lightning run app app.py --cloud
----

*************************************
Build a Lightning App from a template
*************************************
If you didn't find a Lightning App similar to the one you need (in the `Lightning App gallery <https://lightning.ai/apps>`_), another option is to start from a template.
The Lightning CLI can generate a template with built-in testing that can be easily published to the
Lightning App Gallery.

Generate a Lightning App with our template generator:

.. code:: bash

    lightning init app your-app-name
You'll see a print-out like this:

.. code:: bash

    ➜ lightning init app your-app-name
    /Users/Your/Current/dir/your-app-name
    INFO: laying out app template at /Users/Your/Current/dir/your-app-name
    INFO:
    Lightning app template created!
    /Users/Your/Current/dir/your-app-name

    run your app with:
        lightning run app your-app-name/app.py

    run it on the cloud to share with your collaborators:
        lightning run app your-app-name/app.py --cloud
----

Modify the Lightning App template
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The command above generates a Lightning App file like this:

.. code:: python

    from your_app_name import ComponentA, ComponentB

    import lightning as L


    class LitApp(L.LightningFlow):
        def __init__(self) -> None:
            super().__init__()
            self.component_a = ComponentA()
            self.component_b = ComponentB()

        def run(self):
            self.component_a.run()
            self.component_b.run()


    app = L.LightningApp(LitApp())
Now you can add your own components as you wish!
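
As a sketch of where you might go from here, the following variant adds a hypothetical third component; ``ComponentC`` and its body are illustrative, not part of the generated template:

.. code:: python

    import lightning as L


    class ComponentC(L.LightningWork):
        # hypothetical component: replace the print with your own logic
        def run(self):
            print("running my own custom work")


    class LitApp(L.LightningFlow):
        def __init__(self) -> None:
            super().__init__()
            self.component_c = ComponentC()

        def run(self):
            self.component_c.run()


    app = L.LightningApp(LitApp())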
@@ -151,50 +151,3 @@ run the app
.. code:: bash

    lightning run app app.py
----

*******************************************
Build a Lightning component from a template
*******************************************
If you'd prefer a component template with built-in testing that can be easily published to the
Lightning component gallery, generate it with our template generator:

.. code:: bash

    lightning init component your-component-name
You'll see a print-out like this:

.. code:: bash

    ➜ lightning init component your-component-name
    INFO: laying out component template at /Users/williamfalcon/Developer/opensource/_/lightning/scratch/hello-world
    INFO:
    ⚡ Lightning component template created!
    /Users/williamfalcon/Developer/opensource/_/lightning/scratch/hello-world
    ...
----

Modify the component template
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The command above generates a component file like this:

.. code:: python

    import lightning as L


    class TemplateComponent(L.LightningWork):
        def __init__(self) -> None:
            super().__init__()
            self.value = 0

        def run(self):
            self.value += 1
            print("welcome to your work component")
            print("this is running inside a work")
Now you can modify the component as you wish!
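
For example, a small and purely illustrative modification could make the counter observable across runs (the message format is an assumption, not generated code):

.. code:: python

    import lightning as L


    class TemplateComponent(L.LightningWork):
        def __init__(self) -> None:
            super().__init__()
            self.value = 0

        def run(self):
            # illustrative tweak: track and report how often this work ran
            self.value += 1
            print(f"this work has now run {self.value} time(s)")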
2 changes: 1 addition & 1 deletion requirements/pytorch/base.txt
@@ -6,7 +6,7 @@ torch>=1.9.*, <=1.12.0
tqdm>=4.57.0, <4.65.0
PyYAML>=5.4, <=6.0
fsspec[http]>=2021.05.0, !=2021.06.0, <2022.6.0
-tensorboard>=2.9.1, <2.10.0
+tensorboard>=2.9.1, <2.11.0
torchmetrics>=0.7.0, <0.9.3 # needed for using fixed compare_version
pyDeprecate>=0.3.1, <=0.3.2
packaging>=17.0, <=21.3
50 changes: 16 additions & 34 deletions src/pytorch_lightning/CHANGELOG.md
@@ -8,15 +8,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

 ### Added
 
-- Added `FullyShardedNativeNativeMixedPrecisionPlugin` to handle precision for `DDPFullyShardedNativeStrategy` ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
-
 - Added prefix to log message in `seed_everything` with rank info ([#13290](https://github.com/Lightning-AI/lightning/issues/13290))
 
-- Added profiling to these hooks: `on_before_batch_transfer`, `transfer_batch_to_device`, `on_after_batch_transfer`, `configure_gradient_clipping`, `clip_gradients` ([#14069](https://github.com/Lightning-AI/lightning/pull/14069))
-
 -
@@ -28,17 +23,9 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Raised a `MisconfigurationException` if batch transfer hooks are overridden with `IPUAccelerator` ([#13961](https://github.com/Lightning-AI/lightning/pull/13961))
 
-- Updated compatibility for LightningLite to run with the latest DeepSpeed 0.7.0 ([13967](https://github.com/Lightning-AI/lightning/pull/13967))
-
 - Replaced the unwrapping logic in strategies with direct access to unwrapped `LightningModule` ([#13738](https://github.com/Lightning-AI/lightning/pull/13738))
 
-- The `WandbLogger.name` property no longer returns the name of the experiment, and instead returns the project's name ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
-
-- The default project name in `WandbLogger` is now "lightning_logs" ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
 
 ### Deprecated
@@ -77,46 +64,41 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

 ### Fixed
 
-- Fixed a bug that caused spurious `AttributeError` when multiple `DataLoader` classes are imported ([#14117](https://github.com/Lightning-AI/lightning/pull/14117))
+- Fixed an assertion error when using a `ReduceOnPlateau` scheduler with the Horovod strategy ([#14215](https://github.com/Lightning-AI/lightning/pull/14215))
 
-- Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
+- Fixed an `AttributeError` when accessing `LightningModule.logger` and the Trainer has multiple loggers ([#14234](https://github.com/Lightning-AI/lightning/pull/14234))
 
-- Fixed resuming from a checkpoint when using Stochastic Weight Averaging (SWA) ([#9938](https://github.com/Lightning-AI/lightning/pull/9938))
+## [1.7.2] - 2022-08-17
+
+### Added
+
-- Fixed the device placement when `LightningModule.cuda()` gets called without specifying a device index and the current cuda device was not 0 ([#14128](https://github.com/Lightning-AI/lightning/pull/14128))
+- Added `FullyShardedNativeNativeMixedPrecisionPlugin` to handle precision for `DDPFullyShardedNativeStrategy` ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
+- Added profiling to these hooks: `on_before_batch_transfer`, `transfer_batch_to_device`, `on_after_batch_transfer`, `configure_gradient_clipping`, `clip_gradients` ([#14069](https://github.com/Lightning-AI/lightning/pull/14069))
+
+### Changed
+
-- Avoided false positive warning about using `sync_dist` when using torchmetrics ([#14143](https://github.com/Lightning-AI/lightning/pull/14143))
+- The `WandbLogger.name` property no longer returns the name of the experiment, and instead returns the project's name ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
+- The default project name in `WandbLogger` is now "lightning_logs" ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
+- Updated compatibility for LightningLite to run with the latest DeepSpeed 0.7.0 ([13967](https://github.com/Lightning-AI/lightning/pull/13967))
+
+### Fixed
+
+- Fixed a bug that caused spurious `AttributeError` when multiple `DataLoader` classes are imported ([#14117](https://github.com/Lightning-AI/lightning/pull/14117))
+- Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
+- Fixed resuming from a checkpoint when using Stochastic Weight Averaging (SWA) ([#9938](https://github.com/Lightning-AI/lightning/pull/9938))
+- Fixed the device placement when `LightningModule.cuda()` gets called without specifying a device index and the current cuda device was not 0 ([#14128](https://github.com/Lightning-AI/lightning/pull/14128))
+- Avoided false positive warning about using `sync_dist` when using torchmetrics ([#14143](https://github.com/Lightning-AI/lightning/pull/14143))
 - Avoid `metadata.entry_points` deprecation warning on Python 3.10 ([#14052](https://github.com/Lightning-AI/lightning/pull/14052))
-
-- Fixed epoch-end logging results not being reset after the end of the epoch ([#14061](https://github.com/Lightning-AI/lightning/pull/14061))
-
 - Avoid raising the sampler warning if num_replicas=1 ([#14097](https://github.com/Lightning-AI/lightning/pull/14097))
-
 - Fixed saving hyperparameters in a composition where the parent class is not a `LightningModule` or `LightningDataModule` ([#14151](https://github.com/Lightning-AI/lightning/pull/14151))
-
 - Avoided requiring the FairScale package to use precision with the fsdp native strategy ([#14092](https://github.com/Lightning-AI/lightning/pull/14092))
-
-- Fixed an `AttributeError` when accessing `LightningModule.logger` and the Trainer has multiple loggers ([#14234](https://github.com/Lightning-AI/lightning/pull/14234))
-
 - Fixed an issue in which the default name for a run in `WandbLogger` would be set to the project name instead of a randomly generated string ([#14145](https://github.com/Lightning-AI/lightning/pull/14145))
-
 - Fixed not preserving set attributes on `DataLoader` and `BatchSampler` when instantiated inside `*_dataloader` hooks ([#14212](https://github.com/Lightning-AI/lightning/pull/14212))
 
 ## [1.7.1] - 2022-08-09
 
 ### Fixed
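As context for the `LightningModule.logger` entry ([#14234]) carried by this branch, a minimal sketch of the situation it addresses might look like the following; the model and logger setup are illustrative, not taken from the PR:

    import pytorch_lightning as pl
    from pytorch_lightning.loggers import CSVLogger, TensorBoardLogger


    class DemoModule(pl.LightningModule):
        def on_train_start(self) -> None:
            # Before #14234 this property raised AttributeError whenever the
            # Trainer was configured with more than one logger; afterwards it
            # returns the first logger, and `self.loggers` lists all of them.
            print(self.logger)
            print(self.loggers)


    trainer = pl.Trainer(
        logger=[TensorBoardLogger("logs/"), CSVLogger("logs/")],
        max_epochs=1,
    )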
5 changes: 3 additions & 2 deletions src/pytorch_lightning/serve/servable_module_validator.py
@@ -47,7 +47,7 @@ def __init__(
        server: Literal["fastapi", "ml_server", "torchserve", "sagemaker"] = "fastapi",
        host: str = "127.0.0.1",
        port: int = 8080,
-       timeout: int = 10,
+       timeout: int = 20,
        exit_on_failure: bool = True,
    ):
        super().__init__()
@@ -109,7 +109,8 @@ def on_train_start(self, trainer: "pl.Trainer", servable_module: "pl.LightningModule")
            except requests.exceptions.ConnectionError:
                pass
            if time.time() - t0 > self.timeout:
-               raise Exception(f"The Server didn't start in {self.timeout}")
+               process.kill()
+               raise Exception(f"The server didn't start within {self.timeout} seconds.")
            time.sleep(0.1)

        payload = servable_module.configure_payload()
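The hunk above follows a common pattern: poll the server until it answers or a deadline passes, and kill the child process before raising so it isn't left orphaned. A standalone sketch of that pattern, where the function name and URL are illustrative rather than the module's API:

    import subprocess
    import time

    import requests


    def wait_for_server(process: subprocess.Popen, url: str, timeout: float = 20.0) -> None:
        """Poll `url` until it responds, killing `process` and raising on timeout."""
        t0 = time.time()
        while True:
            try:
                if requests.get(url).status_code == 200:
                    return
            except requests.exceptions.ConnectionError:
                # server not accepting connections yet; keep polling
                pass
            if time.time() - t0 > timeout:
                process.kill()  # avoid leaking the half-started server
                raise Exception(f"The server didn't start within {timeout} seconds.")
            time.sleep(0.1)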
5 changes: 2 additions & 3 deletions src/pytorch_lightning/strategies/horovod.py
@@ -31,7 +31,6 @@
from pytorch_lightning.utilities.exceptions import MisconfigurationException
from pytorch_lightning.utilities.imports import _HOROVOD_AVAILABLE
from pytorch_lightning.utilities.rank_zero import rank_zero_only
-from pytorch_lightning.utilities.types import _LRScheduler

if _HOROVOD_AVAILABLE:
    import horovod.torch as hvd
@@ -114,8 +113,8 @@ def _unpack_lightning_optimizer(opt: Optimizer) -> Optimizer:
        lr_scheduler_configs = self.lr_scheduler_configs
        for config in lr_scheduler_configs:
            scheduler = config.scheduler
-           assert isinstance(scheduler, _LRScheduler)
-           scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]
+           if hasattr(scheduler, "base_lrs"):
+               scheduler.base_lrs = [lr * self.world_size for lr in scheduler.base_lrs]  # type: ignore[union-attr]

        assert self.lightning_module is not None
        # Horovod: broadcast parameters & optimizer state to ensure consistent initialization
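The `hasattr` guard matters because not every scheduler carries `base_lrs`: in these PyTorch versions `ReduceLROnPlateau` does not subclass `_LRScheduler`, so the old `assert isinstance(...)` failed for it (the assertion error fixed in #14215). A self-contained sketch of the new behavior, where the optimizer and world size are illustrative:

    import torch
    from torch.optim.lr_scheduler import ReduceLROnPlateau, StepLR

    params = [torch.nn.Parameter(torch.zeros(1))]
    opt = torch.optim.SGD(params, lr=0.1)
    world_size = 4  # stand-in for hvd.size()

    for scheduler in (StepLR(opt, step_size=10), ReduceLROnPlateau(opt)):
        # StepLR exposes `base_lrs`; ReduceLROnPlateau does not, so the
        # hasattr check silently skips it instead of asserting.
        if hasattr(scheduler, "base_lrs"):
            scheduler.base_lrs = [lr * world_size for lr in scheduler.base_lrs]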
