
feat: Support LoKr/LoHa for SDXL and Anima#2275

Merged
kohya-ss merged 14 commits into sd3 from feat-support-lokr-loha on Feb 23, 2026
Conversation

@kohya-ss (Owner)

This pull request significantly improves LoRA (Low-Rank Adaptation) support in the codebase: it enhances the merging logic to handle more LoRA variants and reintroduces the LoRAModule and LoRAInfModule classes in lora_anima.py. The main focus is compatibility with the LoHa and LoKr LoRA types, along with a full, self-contained implementation of LoRA modules for both training and inference.

LoRA merging and compatibility improvements:

  • Enhanced the weight_hook_func in lora_utils.py to check for and merge LoHa and LoKr weights in addition to standard LoRA, increasing compatibility with more LoRA variants. [1] [2]
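The actual hook lives in library/lora_utils.py; as a hedged sketch of the idea (key names follow the common LyCORIS conventions, and merged_delta is a hypothetical helper, not the PR's function), detecting the variant by its state-dict keys and computing the weight delta might look like:

```python
import torch

def merged_delta(sd: dict, prefix: str, scale: float = 1.0):
    """Sketch: compute the weight delta for one module from a LoRA/LoHa/LoKr
    state dict, dispatching on key names (hypothetical helper, not the PR's
    weight_hook_func)."""
    if f"{prefix}.lora_down.weight" in sd:  # standard LoRA: up @ down
        down = sd[f"{prefix}.lora_down.weight"]
        up = sd[f"{prefix}.lora_up.weight"]
        return scale * (up @ down)
    if f"{prefix}.hada_w1_a" in sd:  # LoHa: (w1a @ w1b) * (w2a @ w2b)
        w1 = sd[f"{prefix}.hada_w1_a"] @ sd[f"{prefix}.hada_w1_b"]
        w2 = sd[f"{prefix}.hada_w2_a"] @ sd[f"{prefix}.hada_w2_b"]
        return scale * (w1 * w2)
    if f"{prefix}.lokr_w1" in sd:  # LoKr: kron(w1, w2)
        return scale * torch.kron(sd[f"{prefix}.lokr_w1"], sd[f"{prefix}.lokr_w2"])
    return None  # not a recognized additional-network key set
```

The delta would then be added to the base module's weight at load time, which is what lets all three variants share one merge path.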

LoRA module implementation:

  • Re-added and fully implemented LoRAModule and LoRAInfModule classes in lora_anima.py, providing a self-contained LoRA module for both training and inference, including dropout, rank dropout, module dropout, and merging logic for various layer types (Linear and Conv2d).
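The three dropout variants mentioned above differ in what they zero out. A minimal sketch for the Linear case (simplified, not the actual LoRAModule class) could look like this, assuming the usual LoRA convention of a zero-initialized up projection:

```python
import math
import torch
import torch.nn as nn

class MiniLoRA(nn.Module):
    """Minimal sketch of a LoRA wrapper around nn.Linear showing the three
    dropout variants (simplified; not the PR's LoRAModule)."""

    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 1.0,
                 dropout: float = 0.0, rank_dropout: float = 0.0,
                 module_dropout: float = 0.0):
        super().__init__()
        self.base = base
        self.lora_down = nn.Linear(base.in_features, rank, bias=False)
        self.lora_up = nn.Linear(rank, base.out_features, bias=False)
        nn.init.kaiming_uniform_(self.lora_down.weight, a=math.sqrt(5))
        nn.init.zeros_(self.lora_up.weight)  # delta starts at zero
        self.scale = alpha / rank
        self.rank = rank
        self.dropout = dropout
        self.rank_dropout = rank_dropout
        self.module_dropout = module_dropout

    def forward(self, x):
        out = self.base(x)
        if self.training and self.module_dropout > 0 and torch.rand(()) < self.module_dropout:
            return out  # module dropout: skip the whole LoRA branch
        h = self.lora_down(x)
        if self.training and self.dropout > 0:
            h = nn.functional.dropout(h, p=self.dropout)  # plain activation dropout
        if self.training and self.rank_dropout > 0:
            # rank dropout: zero entire rank channels, rescale the rest
            mask = (torch.rand(self.rank, device=h.device) > self.rank_dropout).float()
            h = h * mask / (1.0 - self.rank_dropout)
        return out + self.lora_up(h) * self.scale
```

In eval mode the wrapper is exactly the base layer until training moves lora_up away from zero; merging for inference amounts to adding `lora_up.weight @ lora_down.weight * scale` into the base weight.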

These changes make the codebase more flexible and robust when working with different LoRA networks, and provide a clear, maintainable implementation of LoRA modules.

kohya-ss and others added 10 commits February 15, 2026 21:50
- networks/network_base.py: shared AdditionalNetwork base class with architecture auto-detection (SDXL/Anima) and generic module injection
- networks/loha.py: LoHa (Low-rank Hadamard Product) module with HadaWeight custom autograd, training/inference classes, and factory functions
- networks/lokr.py: LoKr (Low-rank Kronecker Product) module with factorization, training/inference classes, and factory functions
- library/lora_utils.py: extend weight merge hook to detect and merge LoHa/LoKr weights alongside standard LoRA

Linear and Conv2d 1x1 layers only; Conv2d 3x3 (Tucker decomposition) support will be added separately.
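For the Linear case the two decompositions named in the commits can be sketched as follows (factor names are illustrative, not the modules' attribute names): LoHa rebuilds the delta as a Hadamard product of two low-rank products, and LoKr as a Kronecker product of two small factors.

```python
import torch

def loha_weight(w1a, w1b, w2a, w2b, scale: float = 1.0):
    # LoHa: element-wise (Hadamard) product of two rank-r products;
    # the result can have rank up to r * r at 2x the parameter cost of LoRA rank r
    return scale * (w1a @ w1b) * (w2a @ w2b)

def lokr_weight(w1, w2, scale: float = 1.0):
    # LoKr: Kronecker product; factor shapes (oa, ia) and (ob, ib)
    # rebuild an (oa*ob, ia*ib) weight, so the output dim must factorize
    return scale * torch.kron(w1, w2)

out_dim, in_dim, rank = 16, 12, 2
w = loha_weight(torch.randn(out_dim, rank), torch.randn(rank, in_dim),
                torch.randn(out_dim, rank), torch.randn(rank, in_dim))
assert w.shape == (out_dim, in_dim)

k = lokr_weight(torch.randn(4, 3), torch.randn(4, 4))  # kron: (4*4, 3*4)
assert k.shape == (out_dim, in_dim)
```

The factorization step mentioned for networks/lokr.py is choosing that (oa, ob) split of the output dimension (and likewise for the input), which controls the size trade-off between the two factors.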

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Added Tucker decomposition functionality to LoHa and LoKr modules.
- Implemented new methods for weight rebuilding using Tucker decomposition.
- Updated initialization and weight handling for Conv2d 3x3+ layers.
- Modified get_diff_weight methods to accommodate Tucker and non-Tucker modes.
- Enhanced network base to include unet_conv_target_modules for architecture detection.
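For 3x3 convolutions the weight has a spatial kernel, so a plain low-rank product no longer fits; the Tucker approach keeps a small core tensor carrying the kernel and two factor matrices for the channel dimensions. A hedged sketch of the rebuild (function and parameter names are hypothetical, not the PR's API):

```python
import torch

def rebuild_tucker(core, w_a, w_b):
    """Sketch: rebuild a Conv2d weight (out, in, kh, kw) from a Tucker core
    (r1, r2, kh, kw) and channel factors w_a (out, r1), w_b (in, r2).
    Illustrates the idea behind the PR's Tucker path for 3x3 convs."""
    # contract the core's two rank axes against the channel factors
    return torch.einsum("or,rskl,is->oikl", w_a, core, w_b)

out_ch, in_ch, rank = 8, 6, 2
core = torch.randn(rank, rank, 3, 3)
w_a = torch.randn(out_ch, rank)
w_b = torch.randn(in_ch, rank)
w = rebuild_tucker(core, w_a, w_b)
assert w.shape == (out_ch, in_ch, 3, 3)
```

Per kernel position (k, l) this is just `w_a @ core[:, :, k, l] @ w_b.T`, which is why the non-Tucker get_diff_weight path for Linear and 1x1 convs stays a special case of the same math.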
@kohya-ss kohya-ss marked this pull request as ready for review February 23, 2026 09:44
@kohya-ss kohya-ss requested a review from Copilot February 23, 2026 09:44
@kohya-ss kohya-ss changed the title from "Feat support lokr loha" to "feat: Support LoKr/LoHa for SDXL and Anima" Feb 23, 2026
Contributor

Copilot AI left a comment


Pull request overview

This PR expands the codebase’s LoRA ecosystem by adding LyCORIS-family variants (LoHa/LoKr), introducing a shared AdditionalNetwork base with architecture auto-detection, and extending safetensors loading to merge LoHa/LoKr weights in addition to standard LoRA.

Changes:

  • Added a shared networks/network_base.py with architecture detection and a generic AdditionalNetwork used by LyCORIS-style modules.
  • Introduced new networks/loha.py and networks/lokr.py implementations (train + inference + direct weight merge helpers).
  • Updated library/lora_utils.py to merge LoHa/LoKr weights during safetensors loading; added documentation for LoHa/LoKr usage.

Reviewed changes

Copilot reviewed 7 out of 8 changed files in this pull request and generated 6 comments.

Per-file summary:

  • networks/network_base.py: New shared base (AdditionalNetwork) + architecture detection + optimizer param grouping support for LoHa/LoKr-like modules.
  • networks/lora_anima.py: Reintroduces self-contained LoRAModule/LoRAInfModule in the Anima LoRA implementation (removes dependency on lora_flux).
  • networks/lokr.py: Adds LoKr module implementation (train/inference) and direct tensor-merge helper.
  • networks/loha.py: Adds LoHa module implementation (train/inference) and direct tensor-merge helper.
  • library/lora_utils.py: Extends safetensors load-time merge hook to support LoHa/LoKr in addition to standard LoRA.
  • docs/loha_lokr.md: New documentation describing LoHa/LoKr support, CLI usage, and options.
  • .gitignore: Ignores a references directory.
  • .ai/context/01-overview.md: Updates internal context docs to list additional supported model families.


kohya-ss and others added 4 commits February 23, 2026 21:07
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@kohya-ss kohya-ss merged commit 2217704 into sd3 Feb 23, 2026
4 of 5 checks passed
@kohya-ss kohya-ss deleted the feat-support-lokr-loha branch February 23, 2026 13:09