Traceback (most recent call last):
File "/mnt/c/Users/luyuh/PycharmProjects/LLMTest/OS-Atlas/Atlas4B.py", line 97, in <module>
response, history = model.chat(tokenizer, pixel_values, question, generation_config, history=None, return_history=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/OS-Copilot/OS-Atlas-Base-4B/67bccd1cb9605c0d514adbbbc1b512d93a09df08/modeling_internvl_chat.py", line 263, in chat
generation_output = self.generate(
^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/OS-Copilot/OS-Atlas-Base-4B/67bccd1cb9605c0d514adbbbc1b512d93a09df08/modeling_internvl_chat.py", line 314, in generate
outputs = self.language_model.generate(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/transformers/generation/utils.py", line 2215, in generate
result = self._sample(
^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/transformers/generation/utils.py", line 3206, in _sample
outputs = self(**model_inputs, return_dict=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/OS-Copilot/OS-Atlas-Base-4B/67bccd1cb9605c0d514adbbbc1b512d93a09df08/modeling_phi3.py", line 1281, in forward
outputs = self.model(
^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/anaconda3/envs/LLMTest/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ubuntu/.cache/huggingface/modules/transformers_modules/OS-Copilot/OS-Atlas-Base-4B/67bccd1cb9605c0d514adbbbc1b512d93a09df08/modeling_phi3.py", line 1110, in forward
position_ids = position_ids.view(-1, seq_length).long()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: shape '[-1, 0]' is invalid for input of size 881
class Phi3ForCausalLM(Phi3PreTrainedModel):
    def prepare_inputs_for_generation(
        ......
        # if `inputs_embeds` are passed, we only want to use them in the 1st generation step
        # original code:
        # if inputs_embeds is not None and past_key_values is None:
        # changed to:
        if inputs_embeds is not None and input_ids.shape[1] == 0:
            model_inputs = {'inputs_embeds': inputs_embeds}
        else:
            model_inputs = {'input_ids': input_ids}
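As a standalone sketch (outside the model code), the changed guard can be checked in isolation. The shapes below are assumptions for illustration: the InternVL wrapper passes `inputs_embeds` into `generate()`, so on the first step `input_ids` arrives as an empty `(batch, 0)` tensor.

```python
import torch

# Hypothetical first-step inputs: inputs_embeds carries the multimodal
# prompt, while input_ids is an empty (batch, 0) tensor.
input_ids = torch.empty((1, 0), dtype=torch.long)
inputs_embeds = torch.zeros((1, 881, 16))  # 881 matches the traceback; 16 is an arbitrary hidden size

# The changed condition from the patch above: select the embeddings path
# whenever input_ids is zero-length, instead of testing past_key_values.
if inputs_embeds is not None and input_ids.shape[1] == 0:
    model_inputs = {'inputs_embeds': inputs_embeds}
else:
    model_inputs = {'input_ids': input_ids}

print(list(model_inputs))  # ['inputs_embeds']
```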
Running the 4B example code from the README fails with the error above because input_ids is a zero-length vector.
Looking at the code downloaded from Hugging Face, it appears that an inputs_embeds argument is passed into Phi3ForCausalLM's generate method, which falls outside what the model's methods expect, so it cannot be handled correctly.
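The failing reshape can be reproduced in isolation: with a zero-length input_ids, seq_length becomes 0 inside modeling_phi3.py, and view(-1, 0) cannot hold the 881 position ids. A minimal sketch (the tensor size is taken from the traceback):

```python
import torch

# With seq_length == 0, view(-1, 0) has no valid shape for 881 elements,
# which is exactly the RuntimeError reported in the traceback.
position_ids = torch.arange(881)
try:
    position_ids.view(-1, 0).long()
except RuntimeError as e:
    print(f"RuntimeError: {e}")
```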