HunyuanVideo gets stuck when loading video encoder #68
How much memory do you have? It looks like not enough. If you update the nodes, there should be a quantization option for the text encoder; it requires bitsandbytes to be installed, but it reduces the memory use of the text encoder by a lot.

I have 8 GB of VRAM. In which part of the text encoder node is that option? Thanks

Then the nodes are not up to date.
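The memory argument above can be made concrete with a back-of-the-envelope estimate. The llava-llama-3-8b text encoder has roughly 8 billion parameters (an approximation used here for illustration), so its weights alone dominate memory; quantizing from fp16 to 4-bit cuts that footprint by about 4x, which is why the 12-minute shard-loading time in the log below (a typical symptom of swapping on an under-provisioned machine) goes away with quantization enabled. A minimal sketch of the arithmetic:

```python
def weight_memory_gib(n_params: float, bits_per_param: int) -> float:
    """Rough estimate of weight memory: params * bits/8 bytes, converted to GiB.

    Ignores activations, KV cache, and framework overhead, so real usage
    is somewhat higher than this lower bound.
    """
    return n_params * bits_per_param / 8 / 2**30

# ~8e9 parameters is an assumed, approximate size for the llava-llama-3-8b
# text encoder used by HunyuanVideo.
fp16_gib = weight_memory_gib(8e9, 16)  # ~14.9 GiB: far beyond an 8 GB GPU
nf4_gib = weight_memory_gib(8e9, 4)    # ~3.7 GiB: fits alongside other models

print(f"fp16: {fp16_gib:.1f} GiB, 4-bit: {nf4_gib:.1f} GiB")
```

In the nodes this corresponds to enabling the bitsandbytes quantization option on the text encoder loader rather than writing any code yourself; the function above only shows why the option matters on an 8 GB card.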
Starting server
To see the GUI go to: http://127.0.0.1:8188
FETCH DATA from: C:\Users\giorg\Documents\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
got prompt
The config attributes {'mid_block_causal_attn': True} were passed to AutoencoderKLCausal3D, but are not expected and will be ignored. Please verify your config.json configuration file.
encoded latents shape torch.Size([1, 16, 11, 96, 96])
Loading text encoder model (clipL) from: C:\Users\giorg\Documents\ComfyUI_windows_portable\ComfyUI\models\clip\clip-vit-large-patch14
Text encoder to dtype: torch.float16
Loading tokenizer (clipL) from: C:\Users\giorg\Documents\ComfyUI_windows_portable\ComfyUI\models\clip\clip-vit-large-patch14
C:\Users\giorg\Documents\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be deprecated in transformers v4.45, and will then be set to `False` by default. For more details check this issue: huggingface/transformers#31884
  warnings.warn(
Loading text encoder model (llm) from: C:\Users\giorg\Documents\ComfyUI_windows_portable\ComfyUI\models\LLM\llava-llama-3-8b-text-encoder-tokenizer
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████| 4/4 [12:41<00:00, 190.36s/it]