
[PEFT] set_adapters with weights argument currently broken #5414

Closed
younesbelkada opened this issue Oct 16, 2023 · 0 comments · Fixed by huggingface/peft#1029 or #5417
Labels
bug Something isn't working

Comments

@younesbelkada
Contributor

Describe the bug

Currently, `set_adapters` with the `weights` argument is broken in diffusers. `set_adapters` does correctly call `module.scale_layer`, which overrides the scale value with the weighted scale. However, that information is lost during inference, because the original scale is re-computed here: https://github.com/huggingface/peft/blob/45565f4357e24177020b12f43373c962497d82a2/src/peft/tuners/lora/layer.py#L163
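A minimal, self-contained sketch of the failure mode (this is simplified illustration code, not the actual PEFT `LoraLayer`; the class and method names below mirror the real ones but the bodies are hypothetical):

```python
# Sketch of why the weight passed to set_adapters is silently lost:
# scale_layer multiplies the stored scaling, but the inference path
# re-derives scaling from lora_alpha / r, discarding the user weight.

class LoraLayerSketch:
    def __init__(self, lora_alpha: int = 8, r: int = 4):
        self.lora_alpha = lora_alpha
        self.r = r
        self.scaling = lora_alpha / r  # original scale, e.g. 2.0

    def scale_layer(self, weight: float) -> None:
        # Called (indirectly) by diffusers' set_adapters(..., weights=[...]).
        self.scaling *= weight

    def forward_scaling(self) -> float:
        # Buggy behavior: the original scale is re-computed at inference
        # time, overwriting whatever scale_layer just set.
        self.scaling = self.lora_alpha / self.r
        return self.scaling


layer = LoraLayerSketch()
layer.scale_layer(0.5)   # user requests half strength: scaling becomes 1.0
print(layer.scaling)             # 1.0 — the weighted scale is set correctly
print(layer.forward_scaling())   # 2.0 — but inference recomputes it; 0.5 is lost
```

The fix direction is to keep the user-provided weight out of the value that gets recomputed (or to stop recomputing it), so the weighted scale survives into the forward pass.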

We plan to fix this on the PEFT side in huggingface/peft#1028, but that requires upstreaming the corresponding changes in diffusers.

Reproduction

Notebook shared in #5395

Logs

No response

System Info

xxx

Who can help?

@younesbelkada @sayakpaul @pacman100
