An error occurred while loading 'pulid_flux_v0.9.0.safetensors' using the 'Load PuLID Model' node. The error log is below; can you provide any suggestions?
Workflow: error.json

Logs:
got prompt
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: /home/vipuser/ComfyUI/models/insightface/models/antelopev2/1k3d68.onnx landmark_3d_68 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: /home/vipuser/ComfyUI/models/insightface/models/antelopev2/2d106det.onnx landmark_2d_106 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: /home/vipuser/ComfyUI/models/insightface/models/antelopev2/genderage.onnx genderage ['None', 3, 96, 96] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: /home/vipuser/ComfyUI/models/insightface/models/antelopev2/glintr100.onnx recognition ['None', 3, 112, 112] 127.5 127.5
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: /home/vipuser/ComfyUI/models/insightface/models/antelopev2/scrfd_10g_bnkps.onnx detection [1, 3, '?', '?'] 127.5 128.0
set det-size: (640, 640)
!!! Exception during processing !!! Error(s) in loading state_dict for IDEncoder:
Missing key(s) in state_dict: "body.0.weight", "body.0.bias", "body.1.weight", "body.1.bias", "body.3.weight", "body.3.bias", "body.4.weight", "body.4.bias", "body.6.weight", "body.6.bias", "mapping_0.0.weight", "mapping_0.0.bias", "mapping_0.1.weight", "mapping_0.1.bias", "mapping_0.3.weight", "mapping_0.3.bias", "mapping_0.4.weight", "mapping_0.4.bias", "mapping_0.6.weight", "mapping_0.6.bias", "mapping_patch_0.0.weight", "mapping_patch_0.0.bias", "mapping_patch_0.1.weight", "mapping_patch_0.1.bias", "mapping_patch_0.3.weight", "mapping_patch_0.3.bias", "mapping_patch_0.4.weight", "mapping_patch_0.4.bias", "mapping_patch_0.6.weight", "mapping_patch_0.6.bias", "mapping_1.0.weight", "mapping_1.0.bias", "mapping_1.1.weight", "mapping_1.1.bias", "mapping_1.3.weight", "mapping_1.3.bias", "mapping_1.4.weight", "mapping_1.4.bias", "mapping_1.6.weight", "mapping_1.6.bias", "mapping_patch_1.0.weight", "mapping_patch_1.0.bias", "mapping_patch_1.1.weight", "mapping_patch_1.1.bias", "mapping_patch_1.3.weight", "mapping_patch_1.3.bias", "mapping_patch_1.4.weight", "mapping_patch_1.4.bias", "mapping_patch_1.6.weight", "mapping_patch_1.6.bias", "mapping_2.0.weight", "mapping_2.0.bias", "mapping_2.1.weight", "mapping_2.1.bias", "mapping_2.3.weight", "mapping_2.3.bias", "mapping_2.4.weight", "mapping_2.4.bias", "mapping_2.6.weight", "mapping_2.6.bias", "mapping_patch_2.0.weight", "mapping_patch_2.0.bias", "mapping_patch_2.1.weight", "mapping_patch_2.1.bias", "mapping_patch_2.3.weight", "mapping_patch_2.3.bias", "mapping_patch_2.4.weight", "mapping_patch_2.4.bias", "mapping_patch_2.6.weight", "mapping_patch_2.6.bias", "mapping_3.0.weight", "mapping_3.0.bias", "mapping_3.1.weight", "mapping_3.1.bias", "mapping_3.3.weight", "mapping_3.3.bias", "mapping_3.4.weight", "mapping_3.4.bias", "mapping_3.6.weight", "mapping_3.6.bias", "mapping_patch_3.0.weight", "mapping_patch_3.0.bias", "mapping_patch_3.1.weight", "mapping_patch_3.1.bias", "mapping_patch_3.3.weight", "mapping_patch_3.3.bias", "mapping_patch_3.4.weight", "mapping_patch_3.4.bias", "mapping_patch_3.6.weight", "mapping_patch_3.6.bias", "mapping_4.0.weight", "mapping_4.0.bias", "mapping_4.1.weight", "mapping_4.1.bias", "mapping_4.3.weight", "mapping_4.3.bias", "mapping_4.4.weight", "mapping_4.4.bias", "mapping_4.6.weight", "mapping_4.6.bias", "mapping_patch_4.0.weight", "mapping_patch_4.0.bias", "mapping_patch_4.1.weight", "mapping_patch_4.1.bias", "mapping_patch_4.3.weight", "mapping_patch_4.3.bias", "mapping_patch_4.4.weight", "mapping_patch_4.4.bias", "mapping_patch_4.6.weight", "mapping_patch_4.6.bias".
Traceback (most recent call last):
File "/home/vipuser/ComfyUI/execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "/home/vipuser/ComfyUI/execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
File "/home/vipuser/ComfyUI/execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "/home/vipuser/ComfyUI/execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
File "/home/vipuser/ComfyUI/custom_nodes/PuLID_ComfyUI/pulid.py", line 222, in load_model
model = PulidModel(model)
File "/home/vipuser/ComfyUI/custom_nodes/PuLID_ComfyUI/pulid.py", line 33, in init
self.image_proj_model.load_state_dict(model["image_proj"])
File "/home/vipuser/anaconda3/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2215, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for IDEncoder:
Missing key(s) in state_dict: "body.0.weight", "body.0.bias", "body.1.weight", "body.1.bias", "body.3.weight", "body.3.bias", "body.4.weight", "body.4.bias", "body.6.weight", "body.6.bias", "mapping_0.0.weight", "mapping_0.0.bias", "mapping_0.1.weight", "mapping_0.1.bias", "mapping_0.3.weight", "mapping_0.3.bias", "mapping_0.4.weight", "mapping_0.4.bias", "mapping_0.6.weight", "mapping_0.6.bias", "mapping_patch_0.0.weight", "mapping_patch_0.0.bias", "mapping_patch_0.1.weight", "mapping_patch_0.1.bias", "mapping_patch_0.3.weight", "mapping_patch_0.3.bias", "mapping_patch_0.4.weight", "mapping_patch_0.4.bias", "mapping_patch_0.6.weight", "mapping_patch_0.6.bias", "mapping_1.0.weight", "mapping_1.0.bias", "mapping_1.1.weight", "mapping_1.1.bias", "mapping_1.3.weight", "mapping_1.3.bias", "mapping_1.4.weight", "mapping_1.4.bias", "mapping_1.6.weight", "mapping_1.6.bias", "mapping_patch_1.0.weight", "mapping_patch_1.0.bias", "mapping_patch_1.1.weight", "mapping_patch_1.1.bias", "mapping_patch_1.3.weight", "mapping_patch_1.3.bias", "mapping_patch_1.4.weight", "mapping_patch_1.4.bias", "mapping_patch_1.6.weight", "mapping_patch_1.6.bias", "mapping_2.0.weight", "mapping_2.0.bias", "mapping_2.1.weight", "mapping_2.1.bias", "mapping_2.3.weight", "mapping_2.3.bias", "mapping_2.4.weight", "mapping_2.4.bias", "mapping_2.6.weight", "mapping_2.6.bias", "mapping_patch_2.0.weight", "mapping_patch_2.0.bias", "mapping_patch_2.1.weight", "mapping_patch_2.1.bias", "mapping_patch_2.3.weight", "mapping_patch_2.3.bias", "mapping_patch_2.4.weight", "mapping_patch_2.4.bias", "mapping_patch_2.6.weight", "mapping_patch_2.6.bias", "mapping_3.0.weight", "mapping_3.0.bias", "mapping_3.1.weight", "mapping_3.1.bias", "mapping_3.3.weight", "mapping_3.3.bias", "mapping_3.4.weight", "mapping_3.4.bias", "mapping_3.6.weight", "mapping_3.6.bias", "mapping_patch_3.0.weight", "mapping_patch_3.0.bias", "mapping_patch_3.1.weight", "mapping_patch_3.1.bias", "mapping_patch_3.3.weight", "mapping_patch_3.3.bias", "mapping_patch_3.4.weight", "mapping_patch_3.4.bias", "mapping_patch_3.6.weight", "mapping_patch_3.6.bias", "mapping_4.0.weight", "mapping_4.0.bias", "mapping_4.1.weight", "mapping_4.1.bias", "mapping_4.3.weight", "mapping_4.3.bias", "mapping_4.4.weight", "mapping_4.4.bias", "mapping_4.6.weight", "mapping_4.6.bias", "mapping_patch_4.0.weight", "mapping_patch_4.0.bias", "mapping_patch_4.1.weight", "mapping_patch_4.1.bias", "mapping_patch_4.3.weight", "mapping_patch_4.3.bias", "mapping_patch_4.4.weight", "mapping_patch_4.4.bias", "mapping_patch_4.6.weight", "mapping_patch_4.6.bias".
Prompt executed in 4.24 seconds
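For reference, a quick way to narrow this down is to list the tensor names actually stored in the checkpoint and compare them with the "Missing key(s)" that IDEncoder.load_state_dict reports above. This is a minimal diagnostic sketch, not part of the node; the path below is an assumption (adjust it to wherever pulid_flux_v0.9.0.safetensors is placed in your ComfyUI models folder):

```python
from safetensors import safe_open

# Assumed location of the checkpoint; change this to match your setup.
ckpt_path = "/home/vipuser/ComfyUI/models/pulid/pulid_flux_v0.9.0.safetensors"

# Open the file without loading tensor data and print every stored key,
# so they can be compared against the keys the node's IDEncoder expects.
with safe_open(ckpt_path, framework="pt", device="cpu") as f:
    for name in sorted(f.keys()):
        print(name)
```

If the printed names look nothing like the "body.*" / "mapping_*" keys listed in the error, that would suggest the checkpoint was built for a different PuLID variant than the one this loader node expects.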