
Direct inference with the llava-v1.5-7B model fails with AttributeError: 'NoneType' object has no attribute 'image_seqlen' #5611

Closed
1 task done
Y-PanC opened this issue Oct 5, 2024 · 7 comments
Labels
solved This problem has been already solved

Comments


Y-PanC commented Oct 5, 2024

Reminder

  • I have read the README and searched the existing issues.

System Info

Hello! I downloaded the llava-v1.5-7B model and ran inference directly through the API, and it raises AttributeError: 'NoneType' object has no attribute 'image_seqlen'.
This problem has been troubling me for a week, and I would appreciate your guidance.
The details are as follows:
Traceback (most recent call last):
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in call
return await self.app(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/applications.py", line 113, in call
await self.middleware_stack(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/middleware/errors.py", line 187, in call
raise exc
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/middleware/errors.py", line 165, in call
await self.app(scope, receive, _send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in call
await self.app(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 62, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/routing.py", line 715, in call
await self.middleware_stack(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/routing.py", line 735, in app
await route.handle(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/starlette/routing.py", line 73, in app
response = await f(request)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/fastapi/routing.py", line 301, in app
raw_response = await run_endpoint_function(
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
return await dependant.call(**values)
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/api/app.py", line 111, in create_chat_completion
return await create_chat_completion_response(request, chat_model)
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/api/chat.py", line 147, in create_chat_completion_response
responses = await chat_model.achat(
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/chat/chat_model.py", line 91, in achat
return await self.engine.chat(messages, system, tools, image, video, **input_kwargs)
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/chat/hf_engine.py", line 292, in chat
return await loop.run_in_executor(pool, self._chat, *input_args)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/concurrent/futures/thread.py", line 52, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/ubuntu/miniconda3/envs/panc_math_vscode/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/chat/hf_engine.py", line 189, in _chat
gen_kwargs, prompt_length = HuggingfaceEngine._process_args(
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/chat/hf_engine.py", line 97, in _process_args
messages = template.mm_plugin.process_messages(
File "/home/ubuntu/pqj/math/LLaMA-Factory/src/llamafactory/data/mm_plugin.py", line 237, in process_messages
image_seqlen = getattr(processor, "image_seqlen")
AttributeError: 'NoneType' object has no attribute 'image_seqlen'
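
A minimal diagnostic sketch (not part of the original report), assuming the cause is that no multimodal processor could be built for this checkpoint: if transformers cannot produce a processor for the model directory, LLaMA-Factory ends up with processor=None, and getattr(None, "image_seqlen") raises exactly this AttributeError. The paths come from the reproduction below; the expected outcomes are assumptions.

# Diagnostic sketch: check whether transformers can build a processor for each path.
# Assumption: the local original-format llava-v1.5-7b weights ship no processor config,
# while the hf-format repo (llava-hf/llava-1.5-7b-hf) does.
from transformers import AutoProcessor

for path in ("/mnt/ssd2/models/llava-v1.5-7b", "llava-hf/llava-1.5-7b-hf"):
    try:
        proc = AutoProcessor.from_pretrained(path)
        print(path, "->", type(proc).__name__)
    except Exception as err:
        print(path, "-> no usable processor:", err)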

Reproduction

My API launch command is as follows:
API_PORT=8003 CUDA_VISIBLE_DEVICES=5 llamafactory-cli api /home/ubuntu/pqj/math/LLaMA-Factory/examples/inference/llava-v1.5-7b_api.yaml

The llava-v1.5-7b_api.yaml is as follows:
model_name_or_path: /mnt/ssd2/models/llava-v1.5-7b
template: llava

The .py client script is as follows:
from openai import OpenAI

import os
import base64
import json
import re
import shutil
from time import sleep

def encode_image(image_path):
    if image_path.startswith("http"):
        return image_path
    with open(image_path, "rb") as image_file:
        base64_image = base64.b64encode(image_file.read()).decode('utf-8')
    # print(base64_image)
    return f"data:image/jpeg;base64,{base64_image}"

client = OpenAI(api_key="0", base_url="http://0.0.0.0:8003/v1")
content = [{"type": "text", "text": "如图, 在四边形 $A B C D$ 中, 设 $\overrightarrow{A B}=\boldsymbol{a}, \overrightarrow{A D}=\boldsymbol{b}, \overrightarrow{B C}=\boldsymbol{c}$, 则 $\overrightarrow{D C}$ 等于\n\nA. $\boldsymbol{a}-\boldsymbol{b}+\boldsymbol{c}$\nB. $\boldsymbol{b}-(\boldsymbol{a}+\boldsymbol{c})$\nC. $\boldsymbol{a}+\boldsymbol{b}+\boldsymbol{c}$\nD. $\boldsymbol{b}-\boldsymbol{a}+\boldsymbol{c}$"}]

image_path = '/home/ubuntu/pqj/math/data/Test_Images/9017.jpg'
content.append({
    "type": "image_url",
    "image_url": {
        "url": encode_image(image_path)
    }
})

messages = [{"role": "user", "content": content}]

print(messages)

result = client.chat.completions.create(messages=messages, model="/mnt/ssd2/models/llava-v1.5-7b")
print(result.choices[0].message)

Expected behavior

No response

Others

No response

@github-actions github-actions bot added the pending This problem is yet to be addressed label Oct 5, 2024
@Y-PanC Y-PanC changed the title from "Direct inference with the llava-v1.5-7B model" to "Direct inference with the llava-v1.5-7B model fails with AttributeError: 'NoneType' object has no attribute 'image_seqlen'" Oct 5, 2024
hiyouga (Owner) commented Oct 6, 2024

@hiyouga hiyouga added solved This problem has been already solved and removed pending This problem is yet to be addressed labels Oct 6, 2024
@hiyouga hiyouga closed this as completed Oct 6, 2024
hiyouga added a commit that referenced this issue Oct 6, 2024
hiyouga added a commit that referenced this issue Oct 6, 2024
Bella0818 commented

Did you solve it?

66RomanReigns commented

Download https://huggingface.co/llava-hf/llava-1.5-7b-hf

I run into the same problem when training Yi-VL-6B. How should I solve it? I would appreciate your guidance, thanks!
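
For reference, a sketch (not from the thread) of switching to the suggested hf-format checkpoint for the llava-1.5 case: download llava-hf/llava-1.5-7b-hf, confirm a multimodal processor loads from it, then point model_name_or_path in llava-v1.5-7b_api.yaml at that directory and relaunch the API. The cache location chosen by snapshot_download and the LlavaProcessor class name are assumptions.

# Sketch: fetch the hf-format checkpoint (it ships the processor config the
# original-format weights lack) and confirm a processor can be built from it.
from huggingface_hub import snapshot_download
from transformers import AutoProcessor

local_dir = snapshot_download("llava-hf/llava-1.5-7b-hf")
processor = AutoProcessor.from_pretrained(local_dir)
print(local_dir)
print(type(processor).__name__)  # expected: LlavaProcessor

# Then set model_name_or_path in llava-v1.5-7b_api.yaml to local_dir (keeping template: llava)
# and relaunch: API_PORT=8003 llamafactory-cli api examples/inference/llava-v1.5-7b_api.yaml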

66RomanReigns commented

Download https://huggingface.co/llava-hf/llava-1.5-7b-hf

I run into the same problem when training Yi-VL-6B. How should I solve it? I would appreciate your guidance, thanks!

rank0: Traceback (most recent call last):
rank0: File "/root/miniconda3/envs/llama/lib/python3.11/site-packages/multiprocess/pool.py", line 125, in worker
rank0: result = (True, func(*args, **kwds))
rank0: ^^^^^^^^^^^^^^^^^^^
rank0: File "/root/miniconda3/envs/llama/lib/python3.11/site-packages/datasets/utils/py_utils.py", line 678, in _write_generator_to_queue
rank0: for i, result in enumerate(func(**kwargs)):
rank0: File "/root/miniconda3/envs/llama/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 3458, in _map_single
rank0: batch = apply_function_on_filtered_inputs(
rank0: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
rank0: File "/root/miniconda3/envs/llama/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 3320, in apply_function_on_filtered_inputs
rank0: processed_inputs = function(*fn_args, *additional_args, **fn_kwargs)
rank0: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
rank0: File "/root/autodl-tmp/LLaMA-Factory-main/src/llamafactory/data/processors/supervised.py", line 107, in preprocess_supervised_dataset
rank0: input_ids, labels = _encode_supervised_example(
rank0: ^^^^^^^^^^^^^^^^^^^^^^^^^^^
rank0: File "/root/autodl-tmp/LLaMA-Factory-main/src/llamafactory/data/processors/supervised.py", line 48, in _encode_supervised_example
rank0: messages = template.mm_plugin.process_messages(prompt + response, images, videos, processor)
rank0: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
rank0: File "/root/autodl-tmp/LLaMA-Factory-main/src/llamafactory/data/mm_plugin.py", line 255, in process_messages
rank0: image_seqlen = getattr(processor, "image_seqlen")
rank0: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
rank0: AttributeError: 'NoneType' object has no attribute 'image_seqlen'
rank0: """

rank0: The above exception was the direct cause of the following exception:


TankNee commented Nov 29, 2024

Download https://huggingface.co/llava-hf/llava-1.5-7b-hf

I run into the same problem when training Yi-VL-6B. How should I solve it? I would appreciate your guidance, thanks!

same question

hiyouga (Owner) commented Nov 29, 2024


TankNee commented Nov 30, 2024
