make __getattr__ on Module return object, not Any #257
PyTorch chose to make it Any because they expect their users' code to be "highly dynamic": pytorch/pytorch#104321. That is not the case for us: in Refiners, having untyped code goes against one of our core principles.
Note that there is currently an open PR in PyTorch to return Module | Tensor instead, but in practice this is not always correct either: pytorch/pytorch#115074

I also moved Residuals-related code from SD1 to latent_diffusion, because SDXL should not depend on SD1, and made Lora generic.
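The effect of the typing change can be illustrated with a minimal, torch-free sketch. The Module class, its `_children` dict, and the attribute names below are hypothetical stand-ins for illustration, not Refiners' actual implementation:

```python
from typing import cast

class Module:
    """Torch-free stand-in for torch.nn.Module, illustrating the typing change."""

    def __init__(self) -> None:
        # Submodules live in a plain dict, similar in spirit to torch's _modules.
        self._children: dict[str, object] = {}

    def __getattr__(self, name: str) -> object:
        # Returning `object` (not `Any`) forces callers to narrow the type
        # before using the attribute, so the type checker stays useful.
        try:
            return self._children[name]
        except KeyError:
            raise AttributeError(name) from None

m = Module()
m._children["linear"] = "fake-submodule"

child = m.linear  # statically typed as `object`
# With a return type of `Any`, `child.upper()` would type-check silently even
# when wrong; with `object`, the caller must narrow explicitly first:
assert cast(str, child).upper() == "FAKE-SUBMODULE"
```

The trade-off is some extra `cast`/`isinstance` narrowing at call sites, in exchange for type errors surfacing where attributes are actually misused.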