Describe the bug
Currently, `set_adapters` with the `weights` argument is broken in diffusers. `set_adapters` does correctly call `module.scale_layer`, which overrides the scale value with the weighted scale. However, during inference that information is lost, because the original scale is re-computed here: https://github.com/huggingface/peft/blob/45565f4357e24177020b12f43373c962497d82a2/src/peft/tuners/lora/layer.py#L163

We plan to fix this on the PEFT side in huggingface/peft#1028, but it requires upstreaming the changes to diffusers.
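A minimal sketch of the failure mode (hypothetical class and method names, not the actual PEFT implementation): when the scaling factor is later re-derived from `lora_alpha / r`, any per-adapter weight folded in by `scale_layer` is silently discarded.

```python
# Sketch only: illustrates how re-deriving the LoRA scale from alpha/r
# loses a user-supplied adapter weight. Names are hypothetical.

class LoraLayerSketch:
    def __init__(self, lora_alpha: int = 8, r: int = 4):
        self.lora_alpha = lora_alpha
        self.r = r
        self.scaling = lora_alpha / r  # original scale, e.g. 2.0

    def scale_layer(self, weight: float) -> None:
        # What set_adapters effectively does: fold the user weight
        # into the current scaling factor.
        self.scaling *= weight

    def recompute_scaling(self) -> None:
        # The problematic step during inference: re-deriving the scale
        # from alpha/r overwrites (and thus loses) the user weight.
        self.scaling = self.lora_alpha / self.r


layer = LoraLayerSketch()
layer.scale_layer(0.5)        # user requested weight 0.5
print(layer.scaling)          # 1.0 (2.0 * 0.5) -- weight applied

layer.recompute_scaling()
print(layer.scaling)          # 2.0 -- weight of 0.5 has been lost
```

The fix tracked in huggingface/peft#1028 is to stop recomputing the scale from scratch so the weight set via `set_adapters` survives inference.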
Reproduction
Notebook shared in #5395
Logs
No response
System Info
xxx
Who can help?
@younesbelkada @sayakpaul @pacman100