Those messages are superfluous: they are informational notices and warnings that some IPEX optimizations did not take effect for this model. You should still get a coherent output, so they are nothing to worry about.
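For context, here is a minimal sketch of where those lines come from. They are emitted by `ipex.optimize()`, which ComfyUI appears to apply to each model it loads onto the XPU device; that call attempts Conv+BatchNorm / Linear+BatchNorm folding and `concat_linear` fusion, and on a transformer-style model with no BatchNorm layers those passes find nothing to fuse, so they log INFO/WARNING messages like the ones above. The snippet below is only an illustration under those assumptions: `TinyBlock` is a stand-in model, not ComfyUI's actual Lumina2 module, and it assumes `intel_extension_for_pytorch` is installed (ideally with an XPU device available).

```python
import torch
import intel_extension_for_pytorch as ipex  # assumption: IPEX is installed

# Stand-in transformer-style block: LayerNorm + Linear, no BatchNorm anywhere.
class TinyBlock(torch.nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.norm = torch.nn.LayerNorm(dim)
        self.proj = torch.nn.Linear(dim, dim)

    def forward(self, x):
        return self.proj(self.norm(x))

# Fall back to CPU if no XPU device is present (ipex.optimize works on both).
device = "xpu" if torch.xpu.is_available() else "cpu"
model = TinyBlock().to(device=device, dtype=torch.bfloat16).eval()

# ipex.optimize() tries Conv+BatchNorm / Linear+BatchNorm folding and
# concat_linear fusion. There is nothing to fold here, so it may print
# INFO/WARNING lines similar to the ComfyUI log, then return the model anyway.
model = ipex.optimize(model, dtype=torch.bfloat16)

# Inference still works; the warnings have no effect on the output.
x = torch.randn(1, 16, 64, device=device, dtype=torch.bfloat16)
with torch.no_grad():
    print(model(x).shape)  # torch.Size([1, 16, 64])
```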
Describe the issue
I used ComfyUI 0.3.14 to generate an image with the Lumina 2.0 model and got the following messages in the log:
Starting server
To see the GUI go to: http://127.0.0.1:8188
FETCH DATA from: C:\Comfyui0.3\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
got prompt
model weight dtype torch.bfloat16, manual cast: None
model_type FLOW
Using pytorch attention in VAE
Using pytorch attention in VAE
VAE load device: xpu:0, offload device: cpu, dtype: torch.bfloat16
Requested to load LuminaTEModel_
loaded completely 9.5367431640625e+25 4986.46142578125 True
2025-02-08 23:28:30,319 - _logger.py - IPEX - INFO - Currently split master weight for xpu only support sgd
2025-02-08 23:28:30,329 - _logger.py - IPEX - INFO - Conv BatchNorm folding failed during the optimize process.
2025-02-08 23:28:30,336 - _logger.py - IPEX - INFO - Linear BatchNorm folding failed during the optimize process.
2025-02-08 23:28:30,336 - _logger.py - IPEX - WARNING - [NotSupported]failed to apply concat_linear on unet, please report bugs
CLIP/text encoder model load device: xpu:0, offload device: cpu, current: xpu:0, dtype: torch.float16
2025-02-08 23:28:41,599 - _logger.py - IPEX - INFO - Currently split master weight for xpu only support sgd
2025-02-08 23:28:41,607 - _logger.py - IPEX - INFO - Conv BatchNorm folding failed during the optimize process.
2025-02-08 23:28:41,614 - _logger.py - IPEX - INFO - Linear BatchNorm folding failed during the optimize process.
2025-02-08 23:28:41,614 - _logger.py - IPEX - WARNING - [NotSupported]failed to apply concat_linear on unet, please report bugs
2025-02-08 23:28:42,841 - _logger.py - IPEX - INFO - Currently split master weight for xpu only support sgd
2025-02-08 23:28:42,848 - _logger.py - IPEX - INFO - Conv BatchNorm folding failed during the optimize process.
2025-02-08 23:28:42,855 - _logger.py - IPEX - INFO - Linear BatchNorm folding failed during the optimize process.
2025-02-08 23:28:42,856 - _logger.py - IPEX - WARNING - [NotSupported]failed to apply concat_linear on unet, please report bugs
Requested to load Lumina2
loaded completely 9524.21650390625 4977.7440185546875 True
2025-02-08 23:28:44,421 - _logger.py - IPEX - INFO - Currently split master weight for xpu only support sgd
2025-02-08 23:28:44,427 - _logger.py - IPEX - INFO - Conv BatchNorm folding failed during the optimize process.
2025-02-08 23:28:44,433 - _logger.py - IPEX - INFO - Linear BatchNorm folding failed during the optimize process.
2025-02-08 23:28:44,434 - _logger.py - IPEX - WARNING - [NotSupported]failed to apply concat_linear on unet, please report bugs
100%|██████████████████████████████████████████████████████████████████████████████████| 25/25 [06:08<00:00, 14.72s/it]
Requested to load AutoencodingEngine
loaded completely 2385.8601562500003 159.87335777282715 True
2025-02-08 23:34:52,809 - _logger.py - IPEX - INFO - Currently split master weight for xpu only support sgd
2025-02-08 23:34:52,825 - _logger.py - IPEX - INFO - Conv BatchNorm folding failed during the optimize process.
2025-02-08 23:34:52,832 - _logger.py - IPEX - INFO - Linear BatchNorm folding failed during the optimize process.
2025-02-08 23:34:52,833 - _logger.py - IPEX - WARNING - [NotSupported]failed to apply concat_linear on unet, please report bugs
Prompt executed in 385.98 seconds
Lumina 2.0 is here:
https://github.com/Alpha-VLLM/Lumina-Image-2.0