
make __getattr__ on Module return object, not Any #257

Merged
merged 2 commits into main from pr/getattr-returns-object
Feb 6, 2024

Conversation


@catwell (Member) commented on Feb 6, 2024

PyTorch chose to make it `Any` because they expect their users' code to be "highly dynamic": pytorch/pytorch#104321. That is not the case for us: in Refiners, untyped code goes against one of our core principles.

Note that there is currently an open PR in PyTorch to return `Module | Tensor`, but in practice this is not always correct either: pytorch/pytorch#115074

I also moved the Residuals-related code from SD1 to latent_diffusion, since SDXL should not depend on SD1, and made Lora generic.
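To illustrate what this change means for callers, here is a minimal sketch (not the actual Refiners implementation) of a `Module` whose `__getattr__` returns `object`: dynamically resolved attributes no longer flow through as `Any`, so call sites must narrow the type explicitly, e.g. with `cast` or `isinstance`.

```python
from typing import cast

import torch
from torch import nn


class Module(nn.Module):
    # Sketch only: narrow the declared return type from `Any` (what
    # torch.nn.Module used at the time of this PR) to `object`.
    def __getattr__(self, name: str) -> object:
        return super().__getattr__(name)


m = Module()
m.register_buffer("scale", torch.ones(1))

# With `-> Any`, `m.scale.shape` would type-check even though the checker
# knows nothing about `scale`. With `-> object`, the access is rejected
# until the caller narrows the type explicitly:
scale = cast(torch.Tensor, m.scale)
print(scale.shape)  # torch.Size([1])
```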

@catwell catwell requested a review from deltheil February 6, 2024 10:13
@catwell merged commit 37425fb into main on Feb 6, 2024
1 check passed
@catwell deleted the pr/getattr-returns-object branch on February 9, 2024