Avoid circular imports when lightning-habana or lightning-graphcore is installed #18226
Conversation
def _lightning_graphcore_available() -> bool:
    # This is defined as a function instead of a constant to avoid circular imports,
    # because `lightning_graphcore` also imports Lightning
    return RequirementCache("lightning-graphcore") and _try_import_module("lightning_graphcore")
All of this could probably be replaced with RequirementCache("lightning-graphcore", "lightning_graphcore") which is already lazy
The `_try_import_module` check came from #17829, which relaxes the import error to a warning, so I assumed we would want to keep it.
The second argument to RequirementCache will try to import the module by name if the package check fails. It was added for cases like this one, where the package is properly installed but not importable, or vice versa. You'd want to test it, though.
Sounds good. I tested it, and this still results in errors. Until these packages fix the broken API calls or add logic to stay compatible with multiple Lightning versions, I think we need to go with the relaxed import for now.
/Users/adrian/miniconda3/envs/lightning/bin/python /Users/adrian/repositories/lightning/examples/pytorch/bug_report/bug_report_model.py
Traceback (most recent call last):
File "/Users/adrian/repositories/lightning/examples/pytorch/bug_report/bug_report_model.py", line 66, in <module>
run()
File "/Users/adrian/repositories/lightning/examples/pytorch/bug_report/bug_report_model.py", line 52, in run
trainer = Trainer(
File "/Users/adrian/repositories/lightning/src/lightning/pytorch/utilities/argparse.py", line 69, in insert_env_defaults
return fn(self, **kwargs)
File "/Users/adrian/repositories/lightning/src/lightning/pytorch/trainer/trainer.py", line 401, in __init__
self._accelerator_connector = _AcceleratorConnector(
File "/Users/adrian/repositories/lightning/src/lightning/pytorch/trainer/connectors/accelerator_connector.py", line 123, in __init__
_register_external_accelerators_and_strategies()
File "/Users/adrian/repositories/lightning/src/lightning/pytorch/trainer/connectors/accelerator_connector.py", line 698, in _register_external_accelerators_and_strategies
from lightning_habana import HPUAccelerator, HPUParallelStrategy, SingleHPUStrategy
File "/Users/adrian/miniconda3/envs/lightning/lib/python3.10/site-packages/lightning_habana/__init__.py", line 19, in <module>
from lightning_habana.pytorch.accelerator import HPUAccelerator
File "/Users/adrian/miniconda3/envs/lightning/lib/python3.10/site-packages/lightning_habana/pytorch/__init__.py", line 20, in <module>
from lightning_habana.pytorch.strategies.deepspeed import HPUDeepSpeedStrategy
File "/Users/adrian/miniconda3/envs/lightning/lib/python3.10/site-packages/lightning_habana/pytorch/strategies/__init__.py", line 15, in <module>
from lightning_habana.pytorch.strategies.deepspeed import HPUDeepSpeedStrategy
File "/Users/adrian/miniconda3/envs/lightning/lib/python3.10/site-packages/lightning_habana/pytorch/strategies/deepspeed.py", line 42, in <module>
from lightning.pytorch.overrides.base import _LightningModuleWrapperBase, _LightningPrecisionModuleWrapperBase
ModuleNotFoundError: No module named 'lightning.pytorch.overrides.base'
@awaelchli could you please check and eventually cherry-pick this into #18420
What does this PR do?
Fixes #17897
Fixes #17810
Our CI has several warnings about the circular import. To reproduce:

pip install lightning-habana   # or lightning-graphcore
import lightning as L
This PR makes the import checks lazy, so they don't run at import time, only when they are actually needed.
cc @carmocca @Borda