Add DoRA (weight-decompose) support for LoRA/LoHa/LoKr #15160
AUTOMATIC1111 merged 1 commit into dev from
Conversation
@KohakuBlueleaf I don't know if it happens for every LoRA, but with this PR I've constantly been getting warnings/errors like this.
Thx for this info
Also seeing it:
Until now I assumed this was just a warning, but any LoRA that shows this error is simply discarded and not used during generation (it doesn't get listed under "Lora hashes:" in the generation output).
Has anyone tested DoRA in this code? I got this error: `merged_scale1 / merged_scale1(dim=self.dora_mean_dim, keepdim=True) * self.dora_scale` raises `TypeError: 'Tensor' object is not callable`
As I said: `merged_scale1 = updown + orig_weight`, so `merged_scale1` is a tensor, and a tensor is not callable.
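The traceback above comes from calling the tensor itself where a reduction method was presumably intended (likely a missing `.mean` or similar; the exact fix is an assumption here). A minimal NumPy reproduction, using NumPy's `axis`/`keepdims` naming in place of PyTorch's `dim`/`keepdim`:

```python
import numpy as np

# Stand-in for merged_scale1 = updown + orig_weight
merged_scale1 = np.ones((4, 4))

# Buggy form from the traceback: calling the array/tensor itself.
try:
    merged_scale1(axis=0, keepdims=True)
except TypeError as e:
    print(e)  # NumPy's analogue of the reported "object is not callable"

# Likely intended form: call a reduction method on the tensor instead.
normed = merged_scale1 / merged_scale1.mean(axis=0, keepdims=True)
```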
Description
Implementation for inference with DoRA
The key name is based on the implementation in LyCORIS.
And since weight decomposition is a general idea that sits on top of any low-rank method, I implemented it in NetworkModule instead of in each algorithm's module.
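For context, the core weight-decompose idea from the DoRA paper can be sketched independently of any particular low-rank algorithm: the merged weight is split into a magnitude vector and a direction, and the low-rank delta only perturbs the direction. The sketch below uses NumPy and hypothetical shapes (`W0`, `A`, `B`, `m` are illustrative names, not this PR's variables), and normalizes per output row; the actual implementation's normalization dimension may differ:

```python
import numpy as np

out_dim, in_dim, rank = 4, 6, 2
rng = np.random.default_rng(0)

W0 = rng.standard_normal((out_dim, in_dim))    # frozen base weight
B = rng.standard_normal((out_dim, rank))       # low-rank "up" factor
A = rng.standard_normal((rank, in_dim))        # low-rank "down" factor
# Magnitude vector, initialized to the per-row norms of the base weight.
m = np.linalg.norm(W0, axis=1, keepdims=True)

# Direction: base weight plus the low-rank delta, renormalized.
V = W0 + B @ A
W_dora = m * V / np.linalg.norm(V, axis=1, keepdims=True)
```

Because weight decomposition only touches the already-merged delta, it can live in a shared NetworkModule rather than being re-implemented per algorithm (LoRA/LoHa/LoKr).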
And here is a quick sanity check:

Checklist: