Add DoRA (weight-decompose) support for LoRA/LoHa/LoKr #15160
Conversation
@KohakuBlueleaf I don't know if it happens for every LoRA, but with this PR I've constantly been getting warnings/errors like this.
Thanks for this info.
Also seeing it.
Until now I assumed this was just a warning, but any LoRA that shows this error is simply discarded and not used during generation (it doesn't get listed under "Lora hashes:" in the generation output).
Has anyone tested DoRA in this code? I got this error: `merged_scale1 / merged_scale1(dim=self.dora_mean_dim, keepdim=True) * self.dora_scale` raises `TypeError: 'Tensor' object is not callable`.
`merged_scale1 = updown + orig_weight`, so `merged_scale1` is a tensor, and a tensor is not callable.
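A minimal sketch of the likely fix, assuming the attribute name `dora_mean_dim` means a `.mean(...)` call was dropped from the quoted line; the tensor shapes and the standalone `dora_scale` below are illustrative assumptions, not this PR's actual code:

```python
import torch

# Hypothetical standalone reproduction of the failing line and its fix.
updown = torch.randn(4, 8)       # low-rank delta (assumed shape)
orig_weight = torch.randn(4, 8)  # original module weight (assumed shape)
dora_scale = torch.randn(1, 8)   # learned magnitude vector (assumed)
dora_mean_dim = 0

merged_scale1 = updown + orig_weight

# Buggy line from the traceback: it calls the tensor itself -> TypeError.
# merged_scale1 / merged_scale1(dim=dora_mean_dim, keepdim=True) * dora_scale

# Presumed intent: reduce along dora_mean_dim before dividing, e.g. with .mean():
dora_merged = merged_scale1 / merged_scale1.mean(dim=dora_mean_dim, keepdim=True) * dora_scale
final_updown = dora_merged - orig_weight
```

(Note that the DoRA paper normalizes by the column-wise weight norm rather than a mean; which reduction this PR intends is for the author to confirm.)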
Description
Implementation for inference with DoRA
The key name is based on the implementation in LyCORIS.
Since weight decomposition is a general idea that applies on top of any low-rank method, I implemented it in NetworkModule instead of in each algorithm's module.
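For readers unfamiliar with the idea, here is a rough sketch of what weight decomposition does on top of an arbitrary low-rank delta, written against the names quoted in the comments above (`updown`, `orig_weight`, `dora_scale`, `dora_mean_dim`); the method body is an illustrative reconstruction, not the exact code in this PR:

```python
import torch

class NetworkModule:
    """Skeleton for illustration; the real class has many more members."""

    def __init__(self, dora_scale: torch.Tensor, dora_mean_dim: int = 0):
        self.dora_scale = dora_scale        # learned magnitude, broadcastable to the weight
        self.dora_mean_dim = dora_mean_dim  # dimension reduced when measuring magnitude

    def apply_weight_decompose(self, updown: torch.Tensor, orig_weight: torch.Tensor) -> torch.Tensor:
        orig_weight = orig_weight.to(updown)
        # Merge the low-rank delta into the original weight.
        merged = updown + orig_weight
        # Divide out the current magnitude along dora_mean_dim and apply the
        # learned one. DoRA as published uses the column-wise L2 norm here.
        norm = merged.norm(dim=self.dora_mean_dim, keepdim=True)
        dora_merged = merged / norm * self.dora_scale
        # Hand back a delta, since the caller adds it onto orig_weight itself.
        return dora_merged - orig_weight
```

Because the decomposition only needs the merged weight and a learned magnitude, it is agnostic to how `updown` was produced, which is why a single implementation in NetworkModule can serve LoRA, LoHa, and LoKr alike.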
Here is a quick sanity check:
Checklist: