Add masked loss #1207

Merged: kohya-ss merged 11 commits into dev from masked-loss on Mar 26, 2024
Conversation

kohya-ss (Owner)

Mask the loss value with the conditioning image.
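For context, the general idea can be sketched as follows. This is only an illustrative PyTorch sketch of what "masking the loss" means, not the code in this PR; the tensor names, the nearest-neighbor resize, and the normalization are assumptions. The per-element loss is multiplied by a mask derived from the conditioning image before reduction, so only masked regions contribute to the gradient.

import torch
import torch.nn.functional as F

def masked_mse_loss(model_pred: torch.Tensor,  # (B, C, H, W) predicted noise
                    target: torch.Tensor,      # (B, C, H, W) target noise
                    mask_image: torch.Tensor   # (B, 1, h, w) mask in [0, 1]
                    ) -> torch.Tensor:
    # Resize the mask to the prediction resolution (nearest-neighbor assumed).
    mask = F.interpolate(mask_image, size=model_pred.shape[-2:], mode="nearest")
    # Element-wise loss, weighted by the mask before any reduction.
    loss = F.mse_loss(model_pred.float(), target.float(), reduction="none") * mask
    # Average over masked elements only; the clamp guards against an all-zero mask.
    denom = (mask.sum() * model_pred.shape[1]).clamp(min=1.0)
    return loss.sum() / denom

In this sketch, normalizing by the number of masked elements rather than by all elements keeps the loss scale comparable when masks of very different sizes appear in a batch.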

@Trojaner commented Mar 24, 2024

snip (was related to deepspeed)

Trojaner mentioned this pull request Mar 24, 2024

@Trojaner

I cannot use reg images when conditioning_data_dir is defined, e.g.:

[[datasets]]
caption_extension = ".txt"
resolution = [512, 768]

  [[datasets.subsets]]
  image_dir = '/home/ml/checkpoints/regularization'
  is_reg = true
  class_tokens = "something"
  num_repeats = 1

  [[datasets.subsets]]
  image_dir = '/home/ml/checkpoints/sd15/img'
  conditioning_data_dir = '/home/ml/checkpoints/sd15/mask'
  num_repeats = 10

This fails with:

voluptuous.error.MultipleInvalid: extra keys not allowed @ data['datasets'][0]['subsets'][1]['conditioning_data_dir']

@kohya-ss (Owner, Author)

Unfortunately, ControlNetSubset (used for the masked loss) doesn't support the is_reg attribute. Would it be acceptable to set num_repeats for each subset (reg and train) so that the total number of reg images (num_repeats * number of reg images) matches the total number of training images? I think that is very similar to using is_reg for reg images.
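To make that arithmetic concrete, here is a small worked example in Python with made-up image counts (none of these numbers come from this thread):

# Hypothetical image counts, purely to illustrate the balancing described above.
num_train_images = 40         # files in the training subset's image_dir
train_num_repeats = 10        # num_repeats of the training subset
num_reg_images = 200          # files in the reg subset's image_dir

total_train_images = num_train_images * train_num_repeats       # 40 * 10 = 400
reg_num_repeats = max(1, total_train_images // num_reg_images)  # 400 // 200 = 2
# -> set num_repeats = 2 on the reg subset so reg and train totals match.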

kohya-ss merged commit 5a2afb3 into dev on Mar 26, 2024
2 checks passed
kohya-ss deleted the masked-loss branch on March 26, 2024 at 10:41