Warning: You are using a model of type llava to instantiate a model of type llava_qwen. This is not supported for all configurations of models and can yield errors.
#350
I encountered "You are using a model of type llava to instantiate a model of type llava_qwen. This is not supported for all configurations of models and can yield errors." when running inference with llava-onevision. Additionally, when running inference with "llava-v1.6-vicuna-7b", I get a similar warning: "You are using a model of type llava to instantiate a model of type llava_llama. This is not supported for all configurations of models and can yield errors."
I wonder whether this affects model inference.
This happens when you load a checkpoint saved with one model type into a model class with a different type (e.g. MLLM.from_pretrained(LLM)): the `model_type` stored in the checkpoint's config.json doesn't match the loading class, so transformers emits the warning.
I'd expect it mainly during training rather than inference (unless you use the model_base option).
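For context, the warning is triggered by a `model_type` comparison in the config loader. Below is a minimal, self-contained sketch of that check (class and method names are illustrative, not the exact transformers internals):

```python
import warnings


class BaseConfig:
    """Hypothetical stand-in for a pretrained-config base class."""
    model_type = ""

    @classmethod
    def from_dict(cls, config_dict):
        # The checkpoint's config.json carries its own "model_type".
        # If it differs from the class doing the loading, warn but
        # proceed anyway -- the load itself is not aborted.
        saved_type = config_dict.get("model_type")
        if saved_type and saved_type != cls.model_type:
            warnings.warn(
                f"You are using a model of type {saved_type} to instantiate "
                f"a model of type {cls.model_type}. This is not supported for "
                "all configurations of models and can yield errors."
            )
        return cls()


class LlavaQwenConfig(BaseConfig):
    """Hypothetical config class for the llava_qwen model type."""
    model_type = "llava_qwen"


# A llava-onevision checkpoint whose config.json says "llava" triggers
# the warning when loaded through the llava_qwen config class:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    LlavaQwenConfig.from_dict({"model_type": "llava"})
    print(len(caught))  # one mismatch warning was emitted
```

Since it is only a warning, inference usually still works when the architectures are compatible, but it flags a genuine mismatch between the saved config and the loading class.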