
feat(nodes): skip on duplicate loras instead of erroring #6685

Draft · wants to merge 1 commit into main

Conversation

psychedelicious
Collaborator

Summary

The LoRA and SDXL LoRA nodes would error when given a LoRA that had already been applied. To make the nodes more resilient, they now skip the duplicate and log a warning to the console instead.

Also added a warning for the LoRA Collection Loader nodes. These already skipped duplicates but didn't log a warning.
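
A minimal sketch of the change, assuming the node keeps a simple list of applied LoRAs; the `LoRAEntry` dataclass and `apply_lora` helper are illustrative stand-ins, not the actual invocation code:

```python
import logging
from dataclasses import dataclass

logger = logging.getLogger(__name__)

@dataclass
class LoRAEntry:
    key: str      # model identifier
    weight: float  # application weight

def apply_lora(loras: list[LoRAEntry], new: LoRAEntry) -> list[LoRAEntry]:
    """Append `new` unless a LoRA with the same key is already applied."""
    if any(existing.key == new.key for existing in loras):
        # Previously this case raised an error; now the duplicate is skipped
        # and a warning is logged to the console instead.
        logger.warning('LoRA "%s" is already applied, skipping.', new.key)
        return loras
    return [*loras, new]
```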

Related Issues / Discussions

https://discord.com/channels/1020123559063990373/1130288930319761428/1266235074647691335

QA Instructions

This workflow has 4 test flows, each with 3 LoRA loaders:

  • SDXL LoRA chain
  • SDXL LoRA collection
  • SD1.5 LoRA chain
  • SD1.5 LoRA collection

For each flow, set two of the loaders to the same LoRA and the third to a different one. The output for each flow should have only two LoRAs applied, and you should see a warning in the console for each skipped duplicate.

If you test this on main, the two chain flows are expected to error.

lora test(1).json

Merge Plan

n/a

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)

github-actions bot added the python (PRs that change python files) and invocations (PRs that change invocations) labels on Jul 26, 2024
@psychedelicious
Collaborator Author

Discussed with @StAlKeR7779 - in some cases, skipping duplicates may not be correct - for example, we may want to let the last instance of a LoRA be used, or sum the weights.

Maybe we add an enum field - on_conflict or duplicate_handling, something like that - to describe how to handle duplicate LoRAs. Valid options could be 'skip' | 'replace' | 'error' | 'sum_weights'... 'max' | 'min' | 'average'... Probably overkill.

Also, 'skip' and 'replace' are ambiguous for the collection loader, because we don't know the order in which the collect node does its collecting. It's not clear which of the duplicate LoRAs provided to the collect node would end up being used.
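
Purely as an illustration of that idea (nothing here is in the PR; the `DuplicateHandling` enum and the `on_conflict` field name are hypothetical):

```python
from enum import Enum

class DuplicateHandling(str, Enum):
    SKIP = "skip"                # keep the LoRA that was applied first
    REPLACE = "replace"          # the newly provided LoRA wins
    ERROR = "error"              # current behaviour on main
    SUM_WEIGHTS = "sum_weights"  # add the weights together

# The loader nodes would then grow a field along the lines of:
#   on_conflict: DuplicateHandling = DuplicateHandling.ERROR
```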

@RyanJDick
Collaborator

Of all the possible duplicate handling methods, error still makes the most sense to me as the default.

Maybe we start with min, max, error? That would handle most cases I can think of.

@psychedelicious
Collaborator Author

Some context from the problem report. They were using this node:

[screenshot of the node from the problem report]

It outputs unet and clip fields with the loras baked in.

@StAlKeR7779
Contributor

StAlKeR7779 commented Jul 28, 2024

I feel that most cases would be covered by:

  • error - error if the LoRA has already been applied
  • skip - keep the already applied weight
  • replace/override - use the new LoRA weight
  • sum/merge - sum the old and new weights

For example, the metadata node case should be covered by the replace/override option, as it allows the weight of a detail LoRA to be selected independently of the weight used for the original image generation.
For the default I feel it should be sum/merge, since that does exactly what applying a LoRA means - patch whatever is provided - or error, to notify the user about the double use of a LoRA.
As for skip - I'm not sure which cases it would fit.
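
To pin down what each of those options would mean for the applied weight, here is a hypothetical helper (the name and signature are illustrative, not from the PR):

```python
def resolve_duplicate(existing_weight: float, new_weight: float, mode: str) -> float:
    """Illustrative semantics for error / skip / replace / sum."""
    if mode == "error":
        raise ValueError("LoRA was already applied")
    if mode == "skip":
        return existing_weight               # keep the weight applied first
    if mode == "replace":
        return new_weight                    # the new weight overrides
    if mode == "sum":
        return existing_weight + new_weight  # both applications contribute
    raise ValueError(f"unknown duplicate handling mode: {mode}")
```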

psychedelicious marked this pull request as draft on August 10, 2024