
24G generates 512*512 out of memory #16

Open
DamienCz opened this issue Nov 4, 2024 · 3 comments

Comments


DamienCz commented Nov 4, 2024

I tried three different ComfyUI installations and got the same result. I even downloaded a brand-new ComfyUI and tried it, but it still ran out of memory. I really don't know why...

ComfyUI Error Report

Error Details

  • Node Type: OmniGenNode
  • Exception Type: torch.cuda.OutOfMemoryError
  • Exception Message: Allocation on device

Stack Trace

  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\__init__.py", line 108, in gen
    output = pipe(

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\pipeline.py", line 280, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\scheduler.py", line 163, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\model.py", line 387, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 790, in forward
    attn_outputs, self_attn_weights, present_key_value = self.self_attn(

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 463, in forward
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(value_states.dtype)

  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\functional.py", line 1887, in softmax
    ret = input.softmax(dim, dtype=dtype)
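The trace ends inside transformers' eager Phi-3 attention, which materializes the full attention-weights tensor of shape [batch, heads, seq, seq] in float32 before the softmax. A back-of-envelope sketch (not from the report; head count and sequence lengths below are illustrative assumptions) shows why this allocation alone can exhaust a 24 GB card once image tokens push the sequence length up:

```python
# Rough sketch, not OmniGen's actual accounting: size of the attention-weights
# tensor that eager attention allocates per layer before the softmax.
def eager_attn_weights_bytes(batch: int, heads: int, seq_len: int,
                             bytes_per_elem: int = 4) -> int:
    """Bytes for one layer's [batch, heads, seq, seq] float32 attention matrix."""
    return batch * heads * seq_len * seq_len * bytes_per_elem

GIB = 2**30
# Assuming 32 attention heads (Phi-3-mini) and a range of plausible
# image+text token counts:
for seq in (4096, 8192, 16384):
    size = eager_attn_weights_bytes(batch=1, heads=32, seq_len=seq)
    print(f"seq={seq:6d}: {size / GIB:5.1f} GiB")
```

At 16384 tokens a single layer's attention matrix would already be 32 GiB, more than the whole card; the actual token count depends on how OmniGen tokenizes the two input images, but the quadratic growth is the point.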

System Information

  • ComfyUI Version: v0.1.3-22-gd4aeefc2
  • Arguments: E:\cf-test\ComfyUI-aki-v1.4\main.py --auto-launch --preview-method auto --disable-cuda-malloc
  • OS: nt
  • Python Version: 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
  • Embedded Python: false
  • PyTorch Version: 2.3.1+cu121

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 12429030210
    • Torch VRAM Total: 13891534848
    • Torch VRAM Free: 2314466114
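The device report lists raw byte counts, which are hard to read at a glance. A minimal sketch converting them (values copied from the report above; the helper name is my own):

```python
# Interpret the raw byte counts from the ComfyUI device report.
def bytes_to_gib(n: int) -> float:
    """Convert a byte count to GiB (2**30 bytes)."""
    return n / 2**30

vram_total = 25756696576   # reported "VRAM Total"
vram_free = 12429030210    # reported "VRAM Free"
print(f"total {bytes_to_gib(vram_total):.1f} GiB, free {bytes_to_gib(vram_free):.1f} GiB")
```

So only about 11.6 of the ~24 GiB were free when the report was taken; at runtime, `torch.cuda.mem_get_info()` returns the same (free, total) pair in bytes.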

Logs

2024-11-05 03:25:44,255 - root - INFO - Total VRAM 24564 MB, total RAM 130885 MB
2024-11-05 03:25:44,256 - root - INFO - pytorch version: 2.3.1+cu121
2024-11-05 03:25:45,034 - root - INFO - xformers version: 0.0.27
2024-11-05 03:25:45,034 - root - INFO - Set vram state to: NORMAL_VRAM
2024-11-05 03:25:45,034 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2024-11-05 03:25:45,156 - root - INFO - Using xformers cross attention
2024-11-05 03:25:45,912 - root - INFO - [Prompt Server] web root: E:\cf-test\ComfyUI-aki-v1.4\web
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path checkpoints E:\webui\sd-webui-aki-v4.8\models/Stable-diffusion
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path configs E:\webui\sd-webui-aki-v4.8\models/Stable-diffusion
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path vae E:\webui\sd-webui-aki-v4.8\models/VAE
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path loras E:\webui\sd-webui-aki-v4.8\models/Lora
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path loras E:\webui\sd-webui-aki-v4.8\models/LyCORIS
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path upscale_models E:\webui\sd-webui-aki-v4.8\models/ESRGAN
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path upscale_models E:\webui\sd-webui-aki-v4.8\models/RealESRGAN
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path upscale_models E:\webui\sd-webui-aki-v4.8\models/SwinIR
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path embeddings E:\webui\sd-webui-aki-v4.8\embeddings
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path hypernetworks E:\webui\sd-webui-aki-v4.8\models/hypernetworks
2024-11-05 03:25:45,913 - root - INFO - Adding extra search path controlnet E:\extensions\sd-webui-controlnet\models
2024-11-05 03:25:47,858 - root - INFO - Total VRAM 24564 MB, total RAM 130885 MB
2024-11-05 03:25:47,858 - root - INFO - pytorch version: 2.3.1+cu121
2024-11-05 03:25:47,858 - root - INFO - xformers version: 0.0.27
2024-11-05 03:25:47,858 - root - INFO - Set vram state to: NORMAL_VRAM
2024-11-05 03:25:47,858 - root - INFO - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2024-11-05 03:25:50,594 - root - INFO - 
Import times for custom nodes:
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\websocket_image_save.py
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\AIGODLIKE-ComfyUI-Translation
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ControlNet-LLLite-ComfyUI
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\FreeU_Advanced
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_TiledKSampler
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\stability-ComfyUI-nodes
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_experiments
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-WD14-Tagger
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\PowerNoiseSuite
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\images-grid-comfy-plugin
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Custom-Scripts
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_UltimateSDUpscale
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\Derfuu_ComfyUI_ModdedNodes
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\efficiency-nodes-comfyui
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_IPAdapter_plus
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\comfyui-workspace-manager
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-AnimateDiff-Evolved
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\StyleShot-ComfyUI
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\comfyui_controlnet_aux
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\rgthree-comfy
2024-11-05 03:25:50,594 - root - INFO -    0.0 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Inspire-Pack
2024-11-05 03:25:50,594 - root - INFO -    0.1 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Crystools
2024-11-05 03:25:50,594 - root - INFO -    0.1 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Advanced-ControlNet
2024-11-05 03:25:50,594 - root - INFO -    0.2 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_FizzNodes
2024-11-05 03:25:50,594 - root - INFO -    0.2 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI
2024-11-05 03:25:50,594 - root - INFO -    0.2 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Marigold
2024-11-05 03:25:50,595 - root - INFO -    0.3 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Manager
2024-11-05 03:25:50,595 - root - INFO -    0.7 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI-Impact-Pack
2024-11-05 03:25:50,595 - root - INFO -    2.2 seconds: E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2024-11-05 03:25:50,595 - root - INFO - 
2024-11-05 03:25:50,603 - root - INFO - Starting server

2024-11-05 03:25:50,603 - root - INFO - To see the GUI go to: http://127.0.0.1:8188
2024-11-05 03:26:29,516 - root - INFO - got prompt
2024-11-05 03:26:58,267 - root - ERROR - !!! Exception during processing !!! Allocation on device 
2024-11-05 03:26:58,269 - root - ERROR - Traceback (most recent call last):
  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 317, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 192, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "E:\cf-test\ComfyUI-aki-v1.4\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\__init__.py", line 108, in gen
    output = pipe(
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\pipeline.py", line 280, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\scheduler.py", line 163, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\model.py", line 387, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\custom_nodes\OmniGen-ComfyUI\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 790, in forward
    attn_outputs, self_attn_weights, present_key_value = self.self_attn(
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 463, in forward
    attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(value_states.dtype)
  File "E:\cf-test\ComfyUI-aki-v1.4\python\lib\site-packages\torch\nn\functional.py", line 1887, in softmax
    ret = input.softmax(dim, dtype=dtype)
torch.cuda.OutOfMemoryError: Allocation on device 

2024-11-05 03:26:58,270 - root - ERROR - Got an OOM, unloading all loaded models.
2024-11-05 03:26:58,653 - root - INFO - Prompt executed in 29.13 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":9,"last_link_id":12,"nodes":[{"id":3,"type":"PreviewImage","pos":{"0":1021,"1":106,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":210,"1":246},"flags":{},"order":5,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":12,"label":"图像"}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"}},{"id":6,"type":"LoadImage","pos":{"0":-220,"1":93,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":315,"1":314},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[8],"shape":3,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["f61528b2b5d93e18608ba66204c89a5.jpg","image"]},{"id":7,"type":"LoadImage","pos":{"0":142,"1":98,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":315,"1":314},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9],"shape":3,"label":"图像"},{"name":"MASK","type":"MASK","links":null,"shape":3,"label":"遮罩"}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["99128c67186e49879105b10f6a43631.jpg","image"]},{"id":5,"type":"TextNode","pos":{"0":16,"1":-179,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":400,"1":200},"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"TEXT","type":"TEXT","links":[10],"slot_index":0,"shape":3,"label":"TEXT"}],"properties":{"Node name for S&R":"TextNode"},"widgets_values":["the girl in image_1 Grabbing the breasts of the girl in image_2",true]},{"id":8,"type":"OmniGenNode","pos":{"0":542,"1":267,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":315,"1":354},"flags":{},"order":4,"mode":0,"inputs":[{"name":"prompt_text","type":"TEXT","link":10,"label":"prompt_text"},{"name":"latent","type":"LATENT","link":11,"label":"latent"},{"name":"image_1","type":"IMAGE","link":8,"label":"image_1"},{"name":"image_2","type":"IMAGE","link":9,"label":"image_2"},{"name":"image_3","type":"IMAGE","link":null,"label":"image_3"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[12],"slot_index":0,"shape":3,"label":"IMAGE"}],"properties":{"Node name for S&R":"OmniGenNode"},"widgets_values":[30,2.5,1.6,1024,false,true,false,false,1994,"randomize"]},{"id":9,"type":"EmptyLatentImage","pos":{"0":82,"1":475,"2":0,"3":0,"4":0,"5":0,"6":0,"7":0,"8":0,"9":0},"size":{"0":315,"1":106},"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","type":"LATENT","links":[11],"shape":3,"label":"Latent"}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[768,1024,1]}],"links":[[8,6,0,8,2,"IMAGE"],[9,7,0,8,3,"IMAGE"],[10,5,0,8,0,"TEXT"],[11,9,0,8,1,"LATENT"],[12,8,0,3,0,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":1,"offset":[834.003325613525,438.1449306111023]}},"version":0.4}
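Worth noting: despite the "512*512" in the title, the attached workflow's EmptyLatentImage requests a 768×1024 canvas. A small sketch of how to pull widget values out of a ComfyUI workflow export (the excerpt below is trimmed from the workflow above to just the relevant node):

```python
import json

# Trimmed excerpt of the attached workflow, keeping only the latent-size node.
workflow = json.loads("""
{"nodes": [
  {"id": 9, "type": "EmptyLatentImage", "widgets_values": [768, 1024, 1]}
]}
""")

# ComfyUI exports store node parameters positionally in "widgets_values";
# for EmptyLatentImage the order is width, height, batch size.
for node in workflow["nodes"]:
    if node["type"] == "EmptyLatentImage":
        w, h, batch = node["widgets_values"]
        print(f"requested latent canvas: {w}x{h}, batch {batch}")
```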

Additional Context

(Please add any additional context or steps to reproduce the error here)

AIFSH (Owner) commented Nov 4, 2024

Some other nodes are eating your GPU VRAM; delete them.
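For reference (not advice given in this thread, and not specific to OmniGen), two common PyTorch-side OOM mitigations are a less fragmentation-prone allocator configuration and releasing cached blocks between runs; a minimal sketch:

```python
import os

# Must be set before torch initializes CUDA; expandable segments make the
# caching allocator less prone to fragmentation-induced OOMs (PyTorch >= 2.1).
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "expandable_segments:True")

try:
    import torch
    if torch.cuda.is_available():
        # Return cached-but-unused blocks to the driver between generations.
        torch.cuda.empty_cache()
except ImportError:
    pass  # torch not installed; the env var alone is harmless
```

Neither helps if a single allocation (like the attention matrix above) genuinely exceeds the card's capacity, but they can rescue runs that fail from fragmentation or leftover caches.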

DamienCz (Author) commented Nov 5, 2024

Some other nodes are eating your GPU VRAM; delete them.

Still got the error. I even downloaded ComfyUI again and deleted every node except this one and the text prompt node, but it still ran out of memory.

DamienCz (Author) commented Nov 5, 2024

[Screenshot attached: 微信截图_20241105093631.png; a second screenshot (微信截图_20241105093700.png) did not finish uploading]
