Can we have an example that works with SDXL? #23

Open
Baughn opened this issue Dec 30, 2023 · 14 comments

Baughn commented Dec 30, 2023

I still can't make it work, unfortunately. I keep getting the same error: "AssertionError: must specify y if and only if the model is class-conditional".

ssitu (Owner) commented Dec 30, 2023

Do you ever see the console saying “[FABRIC] Found c_adm with shape…” when you try it?

Baughn (Author) commented Dec 31, 2023

Not that I can see. Here's the stack trace:

Dec 31 12:54:27 saya steam-run[82688]: got prompt
Dec 31 12:54:34 saya steam-run[82688]: model_type EPS
Dec 31 12:54:34 saya steam-run[82688]: adm 2816
Dec 31 12:55:07 saya steam-run[82688]: Using xformers attention in VAE
Dec 31 12:55:07 saya steam-run[82688]: Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Dec 31 12:55:07 saya steam-run[82688]: Using xformers attention in VAE
Dec 31 12:55:18 saya steam-run[82688]: missing {'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_l.text_projection'}
Dec 31 12:55:18 saya steam-run[82688]: left over keys: dict_keys(['cond_stage_model.clip_l.transformer.text_model.embeddings.position_ids'])
Dec 31 12:55:18 saya steam-run[82688]: Requested to load SDXLClipModel
Dec 31 12:55:18 saya steam-run[82688]: Loading 1 new model
Dec 31 12:55:18 saya steam-run[82688]: Requested to load AutoencoderKL
Dec 31 12:55:18 saya steam-run[82688]: Loading 1 new model
Dec 31 12:55:19 saya steam-run[82688]: [FABRIC] 0 positive latents, 2 negative latents
Dec 31 12:55:19 saya steam-run[82688]: Requested to load SDXL
Dec 31 12:55:19 saya steam-run[82688]: Loading 1 new model
Dec 31 12:55:19 saya steam-run[82688]: [78B blob data]
Dec 31 12:55:19 saya steam-run[82688]: ERROR:root:!!! Exception during processing !!!
Dec 31 12:55:19 saya steam-run[82688]: ERROR:root:Traceback (most recent call last):
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 154, in recursive_execute
Dec 31 12:55:19 saya steam-run[82688]:     output_data, output_ui = get_output_data(obj, input_data_all)
Dec 31 12:55:19 saya steam-run[82688]:                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 84, in get_output_data
Dec 31 12:55:19 saya steam-run[82688]:     return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
Dec 31 12:55:19 saya steam-run[82688]:                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 77, in map_node_over_list
Dec 31 12:55:19 saya steam-run[82688]:     results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
Dec 31 12:55:19 saya steam-run[82688]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/nodes.py", line 1333, in sample
Dec 31 12:55:19 saya steam-run[82688]:     return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/nodes.py", line 1269, in common_ksampler
Dec 31 12:55:19 saya steam-run[82688]:     samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
Dec 31 12:55:19 saya steam-run[82688]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/sample.py", line 101, in sample
Dec 31 12:55:19 saya steam-run[82688]:     samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
Dec 31 12:55:19 saya steam-run[82688]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 716, in sample
Dec 31 12:55:19 saya steam-run[82688]:     return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 622, in sample
Dec 31 12:55:19 saya steam-run[82688]:     samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
Dec 31 12:55:19 saya steam-run[82688]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 561, in sample
Dec 31 12:55:19 saya steam-run[82688]:     samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
Dec 31 12:55:19 saya steam-run[82688]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
Dec 31 12:55:19 saya steam-run[82688]:     return func(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/k_diffusion/sampling.py", line 580, in sample_dpmpp_2m
Dec 31 12:55:19 saya steam-run[82688]:     denoised = model(x, sigmas[i] * s_in, **extra_args)
Dec 31 12:55:19 saya steam-run[82688]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return self._call_impl(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return forward_call(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 285, in forward
Dec 31 12:55:19 saya steam-run[82688]:     out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
Dec 31 12:55:19 saya steam-run[82688]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return self._call_impl(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return forward_call(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 275, in forward
Dec 31 12:55:19 saya steam-run[82688]:     return self.apply_model(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 272, in apply_model
Dec 31 12:55:19 saya steam-run[82688]:     out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
Dec 31 12:55:19 saya steam-run[82688]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 252, in sampling_function
Dec 31 12:55:19 saya steam-run[82688]:     cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond_, x, timestep, model_options)
Dec 31 12:55:19 saya steam-run[82688]:                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 224, in calc_cond_uncond_batch
Dec 31 12:55:19 saya steam-run[82688]:     output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
Dec 31 12:55:19 saya steam-run[82688]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/custom_nodes/ComfyUI_fabric/fabric/fabric.py", line 281, in unet_wrapper
Dec 31 12:55:19 saya steam-run[82688]:     _ = model_func(batch_latents, batch_ts, **c_null_dict)
Dec 31 12:55:19 saya steam-run[82688]:         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/model_base.py", line 85, in apply_model
Dec 31 12:55:19 saya steam-run[82688]:     model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
Dec 31 12:55:19 saya steam-run[82688]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return self._call_impl(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 12:55:19 saya steam-run[82688]:     return forward_call(*args, **kwargs)
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 840, in forward
Dec 31 12:55:19 saya steam-run[82688]:     assert (y is not None) == (
Dec 31 12:55:19 saya steam-run[82688]:            ^^^^^^^^^^^^^^^^^^^^
Dec 31 12:55:19 saya steam-run[82688]: AssertionError: must specify y if and only if the model is class-conditional
Dec 31 12:55:19 saya steam-run[82688]: Prompt executed in 52.32 seconds
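
(For context on the assertion: in ComfyUI terms the SDXL UNet is class-conditional, since it takes a pooled/ADM conditioning vector `y` in addition to the cross-attention context. The guard that fires at openaimodel.py:840 is essentially the check sketched below; the surrounding class is simplified and hypothetical, only the assertion text comes from the trace.)

```python
import torch

# Minimal sketch (not the actual ComfyUI code) of the guard that raises here.
# SDXL sets a non-None num_classes, so every UNet call must also pass `y`.
class TinyClassConditionalUNet:
    def __init__(self, num_classes="sequential"):  # SDXL-style ADM conditioning
        self.num_classes = num_classes

    def forward(self, x, timesteps, context=None, y=None):
        assert (y is not None) == (self.num_classes is not None), \
            "must specify y if and only if the model is class-conditional"
        return x  # the real model would run the UNet blocks here

model = TinyClassConditionalUNet()
x, t = torch.zeros(2, 4, 128, 128), torch.zeros(2)
model.forward(x, t, y=torch.zeros(2, 2816))  # ok: pooled conditioning supplied
# model.forward(x, t)  # raises the AssertionError seen in the trace above
```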

ssitu (Owner) commented Dec 31, 2023

Looks like something was renamed in comfy that I didn't notice. Hopefully it works for you after pulling my changes.
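
(A rough guess at what such a fix has to do, based only on the trace: `unet_wrapper` builds a `c_null_dict` for the feedback pass, and for SDXL that dict must still carry the pooled conditioning under whatever key comfy currently expects, or the class-conditional assertion above fires. Everything below except the `c_null_dict` idea and the `c_crossattn` key is an assumption, not the actual ComfyUI_fabric code.)

```python
import torch

def build_null_cond(c: dict, batch_size: int) -> dict:
    """Hypothetical sketch: blank the text conditioning for FABRIC's feedback
    pass but keep the pooled/ADM conditioning that SDXL requires, with every
    tensor repeated or trimmed to the feedback latent batch."""
    c_null = {}
    for key, value in c.items():
        if not torch.is_tensor(value):
            c_null[key] = value
            continue
        reps = -(-batch_size // value.shape[0])               # ceil division
        v = value.repeat(reps, *([1] * (value.dim() - 1)))[:batch_size]
        c_null[key] = torch.zeros_like(v) if key == "c_crossattn" else v
    return c_null
```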

Baughn (Author) commented Dec 31, 2023

Well, it produces a different error now...

Dec 31 18:46:05 saya steam-run[99083]: got prompt
Dec 31 18:46:05 saya steam-run[99083]: model_type EPS
Dec 31 18:46:05 saya steam-run[99083]: adm 2816
Dec 31 18:46:05 saya steam-run[99083]: Using xformers attention in VAE
Dec 31 18:46:05 saya steam-run[99083]: Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Dec 31 18:46:05 saya steam-run[99083]: Using xformers attention in VAE
Dec 31 18:46:06 saya steam-run[99083]: missing {'cond_stage_model.clip_l.text_projection', 'cond_stage_model.clip_l.logit_scale'}
Dec 31 18:46:06 saya steam-run[99083]: left over keys: dict_keys(['cond_stage_model.clip_l.transformer.text_model.embeddings.position_ids'])
Dec 31 18:46:06 saya steam-run[99083]: Requested to load SDXLClipModel
Dec 31 18:46:06 saya steam-run[99083]: Loading 1 new model
Dec 31 18:46:06 saya steam-run[99083]: Requested to load AutoencoderKL
Dec 31 18:46:06 saya steam-run[99083]: Loading 1 new model
Dec 31 18:46:06 saya steam-run[99083]: /home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/conv.py:456: UserWarning: Applied workaround for CuDNN issue, install nvrtc.so (Triggered internally at ../aten/src/ATen/native/cudnn/Conv_v8.cpp:80.)
Dec 31 18:46:06 saya steam-run[99083]:   return F.conv2d(input, weight, bias, self.stride,
Dec 31 18:46:06 saya steam-run[99083]: [FABRIC] 0 positive latents, 2 negative latents
Dec 31 18:46:06 saya steam-run[99083]: Requested to load SDXL
Dec 31 18:46:06 saya steam-run[99083]: Loading 1 new model
Dec 31 18:46:07 saya steam-run[99083]: [141B blob data]
Dec 31 18:46:07 saya steam-run[99083]: [FABRIC] Latents have different sizes (input: torch.Size([2, 4, 128, 128]), pos: torch.Size([0, 4, 128, 128]), neg: torch.Size([2, 4, 95, 65])). Resizing latents to the same size as input latent. It is recommended to resize the latents beforehand in pixel space or use a model to resize the latent.
Dec 31 18:46:07 saya steam-run[99083]:   warnings.warn(
Dec 31 18:46:07 saya steam-run[99083]: [FABRIC] Found c_adm with shape torch.Size([2, 2816]).
Dec 31 18:46:07 saya steam-run[99083]: [39B blob data]
Dec 31 18:46:07 saya steam-run[99083]: ERROR:root:!!! Exception during processing !!!
Dec 31 18:46:07 saya steam-run[99083]: ERROR:root:Traceback (most recent call last):
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 154, in recursive_execute
Dec 31 18:46:07 saya steam-run[99083]:     output_data, output_ui = get_output_data(obj, input_data_all)
Dec 31 18:46:07 saya steam-run[99083]:                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 84, in get_output_data
Dec 31 18:46:07 saya steam-run[99083]:     return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
Dec 31 18:46:07 saya steam-run[99083]:                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/execution.py", line 77, in map_node_over_list
Dec 31 18:46:07 saya steam-run[99083]:     results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
Dec 31 18:46:07 saya steam-run[99083]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/nodes.py", line 1333, in sample
Dec 31 18:46:07 saya steam-run[99083]:     return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/nodes.py", line 1269, in common_ksampler
Dec 31 18:46:07 saya steam-run[99083]:     samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
Dec 31 18:46:07 saya steam-run[99083]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/sample.py", line 101, in sample
Dec 31 18:46:07 saya steam-run[99083]:     samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
Dec 31 18:46:07 saya steam-run[99083]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 716, in sample
Dec 31 18:46:07 saya steam-run[99083]:     return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 622, in sample
Dec 31 18:46:07 saya steam-run[99083]:     samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
Dec 31 18:46:07 saya steam-run[99083]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 561, in sample
Dec 31 18:46:07 saya steam-run[99083]:     samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
Dec 31 18:46:07 saya steam-run[99083]:               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
Dec 31 18:46:07 saya steam-run[99083]:     return func(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/k_diffusion/sampling.py", line 580, in sample_dpmpp_2m
Dec 31 18:46:07 saya steam-run[99083]:     denoised = model(x, sigmas[i] * s_in, **extra_args)
Dec 31 18:46:07 saya steam-run[99083]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return self._call_impl(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return forward_call(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 285, in forward
Dec 31 18:46:07 saya steam-run[99083]:     out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
Dec 31 18:46:07 saya steam-run[99083]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return self._call_impl(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return forward_call(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 275, in forward
Dec 31 18:46:07 saya steam-run[99083]:     return self.apply_model(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 272, in apply_model
Dec 31 18:46:07 saya steam-run[99083]:     out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
Dec 31 18:46:07 saya steam-run[99083]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 252, in sampling_function
Dec 31 18:46:07 saya steam-run[99083]:     cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond_, x, timestep, model_options)
Dec 31 18:46:07 saya steam-run[99083]:                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/samplers.py", line 224, in calc_cond_uncond_batch
Dec 31 18:46:07 saya steam-run[99083]:     output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
Dec 31 18:46:07 saya steam-run[99083]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/custom_nodes/ComfyUI_fabric/fabric/fabric.py", line 291, in unet_wrapper
Dec 31 18:46:07 saya steam-run[99083]:     out = model_func(input, sigma, **c)
Dec 31 18:46:07 saya steam-run[99083]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/model_base.py", line 85, in apply_model
Dec 31 18:46:07 saya steam-run[99083]:     model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
Dec 31 18:46:07 saya steam-run[99083]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return self._call_impl(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return forward_call(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 854, in forward
Dec 31 18:46:07 saya steam-run[99083]:     h = forward_timestep_embed(module, h, emb, context, transformer_options, time_context=time_context, num_video_frames=num_video_frames, image_only_indicator=image_only_indicator)
Dec 31 18:46:07 saya steam-run[99083]:         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/diffusionmodules/openaimodel.py", line 46, in forward_timestep_embed
Dec 31 18:46:07 saya steam-run[99083]:     x = layer(x, context, transformer_options)
Dec 31 18:46:07 saya steam-run[99083]:         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return self._call_impl(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return forward_call(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/attention.py", line 604, in forward
Dec 31 18:46:07 saya steam-run[99083]:     x = block(x, context=context[i], transformer_options=transformer_options)
Dec 31 18:46:07 saya steam-run[99083]:         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return self._call_impl(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
Dec 31 18:46:07 saya steam-run[99083]:     return forward_call(*args, **kwargs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/attention.py", line 431, in forward
Dec 31 18:46:07 saya steam-run[99083]:     return checkpoint(self._forward, (x, context, transformer_options), self.parameters(), self.checkpoint)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/diffusionmodules/util.py", line 189, in checkpoint
Dec 31 18:46:07 saya steam-run[99083]:     return func(*inputs)
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/comfy/ldm/modules/attention.py", line 470, in _forward
Dec 31 18:46:07 saya steam-run[99083]:     n, context_attn1, value_attn1 = p(n, context_attn1, value_attn1, extra_options)
Dec 31 18:46:07 saya steam-run[99083]:                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]:   File "/home/svein/AI/image-generation/ComfyUI/custom_nodes/ComfyUI_fabric/fabric/fabric.py", line 140, in modified_attn1
Dec 31 18:46:07 saya steam-run[99083]:     assert neg_hs.shape[0] == num_neg, f"neg_hs batch size ({neg_hs.shape[0]}) != number of neg_latents ({num_neg})"
Dec 31 18:46:07 saya steam-run[99083]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 31 18:46:07 saya steam-run[99083]: AssertionError: neg_hs batch size (4) != number of neg_latents (2)
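
(One hedged reading of the 4-vs-2 mismatch: ComfyUI batches the cond and uncond passes together, so the hidden states the patched attention sees are `len(cond_or_uncond)` times the number of feedback latents, while the cached negative count is not scaled. That is only an inference from the numbers in the message, not a confirmed diagnosis; the sketch just shows the arithmetic.)

```python
num_neg_latents = 2        # negative feedback latents given to FABRIC
cond_or_uncond = [0, 1]    # ComfyUI runs cond and uncond stacked in one batch

# Hidden states arriving at the attn1 patch during sampling:
neg_hs_batch = num_neg_latents * len(cond_or_uncond)   # 2 * 2 = 4

# The patch then compares against the raw latent count and trips:
print(neg_hs_batch, "!=", num_neg_latents)   # 4 != 2, as in the assertion above
```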

ssitu (Owner) commented Dec 31, 2023

Could I see a workflow that this happens with?

Baughn (Author) commented Dec 31, 2023

fabric-workflow.json

Here you go.

ssitu (Owner) commented Dec 31, 2023

Does this happen if you use an sd 1.5 model?

Baughn (Author) commented Dec 31, 2023

No, those work fine.
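
(That split, SD 1.5 fine and SDXL broken, is consistent with the conditioning difference: an SD 1.5 UNet is not class-conditional, while the SDXL UNet takes an extra 2816-dimensional ADM vector, the `adm 2816` line in the logs above, built from the pooled CLIP-G embedding plus size/crop embeddings. A small sketch of that arithmetic, as generally documented for SDXL:)

```python
# SDXL's extra "y" conditioning width, matching the "adm 2816" log line:
pooled_clip_g = 1280      # pooled text embedding from the CLIP-G encoder
size_embeds = 6 * 256     # original size, crop coords, target size (2 ints each)
adm_in_channels = pooled_clip_g + size_embeds
print(adm_in_channels)    # 2816 -- SD 1.5 UNets have no such input at all
```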

ngrunwald commented

I have the same error with SDXL, I'm afraid.

Trung0246 commented Feb 11, 2024

I got another stack trace when attempting to run the demo workflow with SDXL (the one with the direct KSampler):

Requested to load AutoencoderKL
Loading 1 new model
Requested to load SDXLClipModel
Loading 1 new model
[FABRIC] 1 positive latents, 1 negative latents
Requested to load SDXL
Loading 1 new model
  0%|          | 0/10 [00:00<?, ?it/s][FABRIC] Found c_adm with shape torch.Size([4, 2816]).
  0%|          | 0/10 [00:00<?, ?it/s]
2024-02-10 18:54:01,321 - root - ERROR - !!! Exception during processing !!!
2024-02-10 18:54:01,333 - root - ERROR - Traceback (most recent call last):
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\execution.py", line 545, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\execution.py", line 309, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI-0246\utils.py", line 375, in new_func
    res_value = old_func(*final_args, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\execution.py", line 242, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 181, in sample
    return KSamplerFABRICAdv().sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_fabric\nodes.py", line 138, in sample
    return fabric_sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 52, in fabric_sample
    samples = KSamplerAdvanced().sample(model_patched, add_noise, noise_seed, steps, cfg, sampler_name, scheduler, positive,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\nodes.py", line 1389, in sample
    return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\nodes.py", line 1325, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)  # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 241, in motion_sample
    return orig_comfy_sample(model, noise, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\sample.py", line 100, in sample
    samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 130, in KSampler_sample
    return _KSampler_sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 716, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\__init__.py", line 149, in sample
    return _sample(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 622, in sample
    samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 561, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\k_diffusion\sampling.py", line 580, in sample_dpmpp_2m
    denoised = model(x, sigmas[i] * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 285, in forward
    out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 275, in forward
    return self.apply_model(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1030, in apply_model
    out = super().apply_model(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 272, in apply_model
    out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 252, in sampling_function
    cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond_, x, timestep, model_options)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\samplers.py", line 224, in calc_cond_uncond_batch
    output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\ComfyUI_fabric\fabric\fabric.py", line 281, in unet_wrapper
    _ = model_func(batch_latents, batch_ts, **c_null_dict)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\comfy\model_base.py", line 86, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Program Files (Sole)\ComfyUI\ComfyUI\custom_nodes\FreeU_Advanced\nodes.py", line 170, in __temp__forward
    assert y.shape[0] == x.shape[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
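
(Worth noting: this last trace dies inside FreeU_Advanced's patched UNet forward rather than in FABRIC itself; the patch simply asserts that the class-conditioning batch matches the latent batch, and FABRIC's feedback pass apparently hands it mismatched batch sizes. The sketch below is an assumption about that interaction, with hypothetical shapes; only the assertion itself appears in the trace.)

```python
import torch

def patched_forward(x, timesteps, y):
    # Sketch of the guard in the patched forward; the real patch also applies
    # FreeU scaling, which is omitted here.
    assert y.shape[0] == x.shape[0]
    return x

# Hypothetical repro of the mismatch: the feedback pass enlarges the latent
# batch (input + reference latents) without repeating `y` to match.
x = torch.zeros(6, 4, 128, 128)   # assumed: sampled latents + feedback latents
y = torch.zeros(4, 2816)          # the c_adm/y tensor stays at its original batch
# patched_forward(x, torch.zeros(6), y)   # -> AssertionError, as in the trace
```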

ssitu (Owner) commented Feb 21, 2024

Yeah, I'm not sure what the problem is, and it is difficult to figure it out when I can't run SDXL myself.

wibur0620 commented

It seems to support sd1.5.

wibur0620 commented

It doesn't seem to support SDXL.

revolvedai commented

> Yeah, I'm not sure what the problem is, and it is difficult to figure it out when I can't run SDXL myself.

Can I donate some cloud time for you to test and develop on SDXL? Please reach out to ed@rundiffusion.com and we would be happy to support you with some server time. The SD Ultimate Upscale and FABRIC nodes are wonderful tools!
