Model Class: LlavaLlamaForCausalLM
Traceback (most recent call last):
File "/data/hongbo.xu/Datasets/MC-ERU/llava-next/LLaVA-NeXT/inference.py", line 30, in <module>
prompt_question = conv.get_prompt()
File "/data/hongbo.xu/Datasets/MC-ERU/llava-next/LLaVA-NeXT/llava/conversation.py", line 99, in get_prompt
raise ValueError("Llama 3 tokenizer is not available. Make sure you have the necessary permissions.")
ValueError: Llama 3 tokenizer is not available. Make sure you have the necessary permissions.
conv_llava_llama_3 = Conversation(
system="You are a helpful language and vision assistant. " "You are able to understand the visual content that the user provides, " "and assist the user with a variety of tasks using natural language.",
roles=("user", "assistant"),
version="llama_v3",
messages=[],
offset=0,
sep="<|eot_id|>",
sep_style=SeparatorStyle.LLAMA_3,
# tokenizer_id="meta-llama/Meta-Llama-3-8B-Instruct", # Change to your local path.
# tokenizer=safe_load_tokenizer("meta-llama/Meta-Llama-3-8B-Instruct"), # Change to your local path.
stop_token_ids=[128009],
)
https://github.com/LLaVA-VL/LLaVA-NeXT/blob/main/docs/LLaVA-NeXT.md
I am running the inference code from the page above and hit the error shown. Do I also need to set a path for Llama 3? So far I have only set the path for "mm_vision_tower" in the model weights config file.
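For context, here is a minimal sketch of why get_prompt() can end up raising this error. The names `loader` and `get_prompt_guard` are hypothetical stand-ins for illustration, not the repo's actual code: the idea is that a safe_load_tokenizer-style helper swallows the load failure (e.g. for the gated meta-llama/Meta-Llama-3-8B-Instruct repo, or a wrong local path) and returns None, and the None tokenizer only surfaces later as the ValueError seen in the traceback.

```python
def safe_load_tokenizer(path, loader):
    """Try to load a tokenizer; return None instead of raising.

    `loader` is a hypothetical stand-in for something like
    transformers.AutoTokenizer.from_pretrained, which fails for gated
    repos when you lack access, or when the local path is wrong.
    """
    try:
        return loader(path)
    except Exception:
        # Failure is swallowed here; the caller gets None.
        return None


def get_prompt_guard(tokenizer):
    # Mirrors the check in llava/conversation.py that produced the
    # traceback above: a None tokenizer raises only at prompt time.
    if tokenizer is None:
        raise ValueError(
            "Llama 3 tokenizer is not available. "
            "Make sure you have the necessary permissions."
        )
    return "ok"
```

If that diagnosis applies here, the fix is not in the model weights config but in the tokenizer setup: either request access to the gated Llama 3 repo on Hugging Face and authenticate (e.g. via `huggingface-cli login`), or uncomment the tokenizer_id / tokenizer lines in the Conversation template and point them at a local copy of the tokenizer, as the comments in conversation.py suggest.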