Update pytorch lightning utilities #15310
Conversation
What about an option where you assign to …
I want to confirm what you mean by this; I guess I'm a little confused. This snippet is from ddpm — do you mean something like this?

```python
from torchvision.utils import make_grid
pytorch_lightning.utilities.distributed = pytorch_lightning.utilities.rank_zero
from pytorch_lightning.utilities.rank_zero import rank_zero_only
from omegaconf import ListConfig
```

If so, this doesn't work because … Another way I tried was this, but it doesn't seem to work either:

```python
import pytorch_lightning.utilities.rank_zero as rank_zero
import pytorch_lightning.utilities.distributed as distributed

distributed = rank_zero
distributed.rank_zero_only()
```

However, this still requires some initial modification of the …

Perhaps I am thinking about or doing this incorrectly. Here is another way I thought of, but it doesn't work:

```python
import pytorch_lightning.utilities.rank_zero as rank_zero

def distributed():
    return rank_zero

from pytorch_lightning.utilities.distributed import rank_zero_only
```

Something like this would work, but it does change the code (and is technically unnecessary, since it doesn't do anything with …):

```python
import pytorch_lightning.utilities.rank_zero as distributed
from pytorch_lightning.utilities.rank_zero import rank_zero_only
```

If you prefer, I could instead replace this with a try/except like this, which does work:

```python
try:
    from pytorch_lightning.utilities.distributed import rank_zero_only
except ImportError:
    from pytorch_lightning.utilities.rank_zero import rank_zero_only
```

Sorry for the wall of text; this was just me trying to figure out a way of doing this.
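As a side note on why the rebinding attempts above can't work: assigning to a name bound by `import … as …` only rebinds that local name, while later `from module import …` statements still resolve through `sys.modules`. A minimal illustration, using stdlib modules as stand-ins for the Lightning modules:

```python
import sys
import json as distributed  # stand-in for pytorch_lightning.utilities.distributed
import os as rank_zero      # stand-in for pytorch_lightning.utilities.rank_zero

# Rebinding the local name does NOT touch the import system:
distributed = rank_zero

# sys.modules still maps "json" to the real json module, so a fresh
# `from json import ...` elsewhere is unaffected by the rebinding above.
print(sys.modules["json"].__name__)  # → json
print(distributed is rank_zero)      # → True (only the local name changed)
```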
Something like this: …
That'd totally work! I just skipped over it initially because I wasn't sure if you wanted to allow importing `sys` or not, since it's not already being imported. I'll adapt it to the current edits.
What I'm suggesting is to do this once, somewhere in code that's executed early (i.e. a function in …). You just have to be careful not to replace …
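The approach being suggested here can be sketched with a stand-in module (the `mypkg.*` names below are hypothetical; in this PR the real paths are `pytorch_lightning.utilities.distributed` and `pytorch_lightning.utilities.rank_zero`):

```python
import sys
import types

# Hypothetical stand-in for the replacement module
# (pytorch_lightning.utilities.rank_zero in the real codebase):
rank_zero = types.ModuleType("mypkg.rank_zero")
rank_zero.rank_zero_only = lambda fn: fn  # simplified stand-in decorator

# Do this ONCE, early at startup: alias the removed module path to the
# new module so legacy imports keep working without editing every file.
sys.modules["mypkg.distributed"] = rank_zero

# A legacy import statement now resolves through the alias unchanged:
from mypkg.distributed import rank_zero_only

print(rank_zero_only(print) is print)  # → True
```

Because `from X.Y import Z` checks `sys.modules` for the full dotted name first, the alias satisfies the old import without the old module ever existing on disk.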
That'd make more sense! I'll revert and work on implementing something like that.
Not sure if you want to keep the print statement in there or not, as it feels a bit unnecessary to me?
Forgot some things to revert
Fully reverts this time
Sorry, I forgot to revert a few lines back; should be good now.
Sorry, a bit of a mess; the import was actually definitely required, but I moved it further down into the `if` block.
Description
Changes `pytorch_lightning.utilities.distributed` to `pytorch_lightning.utilities.rank_zero`, as this is the new/proper place to get `rank_zero_only`, which no longer exists in `distributed`. This also (mostly) fixes the issue of the code failing to start up on a clean install for those who hit this error. This change has existed for quite a few versions of PyTorch Lightning, and definitely exists in the current version (1.9.4). Docs
This fixes many issues, such as #13785, #13464, #11642, #11501, #11458, and a few others.
Unfortunately, this doesn't fix every issue, as the same change is still needed in the Stability-AI/stablediffusion repository.
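For downstream code that must run against both old and new Lightning releases, the import swap described above can also be written as a generic fallback helper. This is a sketch, not part of the PR; `import_compat` is a hypothetical name, while the two module paths are the ones this PR touches:

```python
import importlib

def import_compat(attr, *module_paths):
    """Return `attr` from the first module in `module_paths` that can be
    imported and exposes it; raise ImportError if none do."""
    for path in module_paths:
        try:
            return getattr(importlib.import_module(path), attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{attr!r} not found in any of {module_paths}")

# Usage against this PR's modules would look like:
# rank_zero_only = import_compat(
#     "rank_zero_only",
#     "pytorch_lightning.utilities.rank_zero",    # new location
#     "pytorch_lightning.utilities.distributed",  # removed in recent releases
# )
```

This keeps a single import site working across versions, at the cost of hiding which location actually provided the symbol.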
Checklist: