gemini-1.0-pro-001 raises ValueError: Content roles do not match: model != (#3507)
Comments
Sorry, this seems to be fixed in the latest version of the package, so I will close this.

@yifanmai google-cloud-aiplatform==1.45.0

Same here

I faced the same issue too. Here's the full traceback:

@bhavan-kaya

Please see also https://issuetracker.google.com/issues/331677495 - you can comment there.

Can you try with the following versions? google-cloud-aiplatform==1.46.0
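If pinning helps, a requirements fragment for the suggested version (only google-cloud-aiplatform==1.46.0 is named in this thread; any other pins would be guesses):

```
google-cloud-aiplatform==1.46.0
```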
Still the same.

```
Traceback (most recent call last):
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\one.py", line 451, in <module>
    generate_streaming_mistral_response(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\one.py", line 60, in generate_streaming_mistral_response
    for chunk in chain_with_summarization.stream(user_input, {"configurable": {"session_id": conversation_id}}):
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2822, in stream
    yield from self.transform(iter([input]), config, **kwargs)
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2809, in transform
    yield from self._transform_stream_with_config(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2773, in _transform
    for output in final_pipeline:
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 4669, in transform
    yield from self.bound.transform(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 4669, in transform
    yield from self.bound.transform(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2809, in transform
    yield from self._transform_stream_with_config(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2773, in _transform
    for output in final_pipeline:
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 4669, in transform
    yield from self.bound.transform(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2809, in transform
    yield from self._transform_stream_with_config(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 2773, in _transform
    for output in final_pipeline:
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\output_parsers\transform.py", line 50, in transform
    yield from self._transform_stream_with_config(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 1880, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\output_parsers\transform.py", line 29, in _transform
    for chunk in input:
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\runnables\base.py", line 1300, in transform
    yield from self.stream(final, config, **kwargs)
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\language_models\chat_models.py", line 241, in stream
    raise e
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_core\language_models\chat_models.py", line 223, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\langchain_google_vertexai\chat_models.py", line 527, in _stream
    for response in responses:
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\vertexai\generative_models\_generative_models.py", line 968, in _send_message_streaming
    _append_response(full_response, chunk)
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\vertexai\generative_models\_generative_models.py", line 1877, in _append_response
    _append_gapic_response(
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\vertexai\generative_models\_generative_models.py", line 1899, in _append_gapic_response
    _append_gapic_candidate(base_response.candidates[idx], candidate)
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\vertexai\generative_models\_generative_models.py", line 1922, in _append_gapic_candidate
    _append_gapic_content(base_candidate.content, new_candidate.content)
  File "C:\Users\ASUS\PycharmProjects\pythonProject18\venv\lib\site-packages\vertexai\generative_models\_generative_models.py", line 1942, in _append_gapic_content
    raise ValueError(
ValueError: Content roles do not match: model != 
```
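The traceback shows the error is raised while the SDK aggregates streamed chunks: each incoming chunk's content role is compared against the accumulated response's role, and a chunk arriving with an empty role fails the equality check against "model". A minimal sketch of that failure mode, using simplified stand-in types rather than the library's actual classes:

```python
# Simplified stand-in for the SDK's Content; not the real class.
class Content:
    def __init__(self, role, text):
        self.role = role
        self.text = text

def append_content(base, new):
    # Mirrors the strict check seen in _append_gapic_content: any role
    # difference, including empty string vs "model", raises ValueError.
    if base.role != new.role:
        raise ValueError(
            f"Content roles do not match: {base.role} != {new.role}"
        )
    base.text += new.text

full = Content("model", "Hello")
chunk = Content("", ", world")  # a chunk can arrive with an empty role

try:
    append_content(full, chunk)
except ValueError as e:
    print(e)  # Content roles do not match: model != 
```

This reproduces the empty right-hand side of the reported message ("model != " with nothing after the `!=`), which is why the error looks cryptic.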
Have this issue as well with both
Having this issue as well; my traceback additionally includes `The above exception was the direct cause of the following exception:` before a second `Traceback (most recent call last):`.
Environment details

google-cloud-aiplatform version: 1.38.1

Steps to reproduce

Send the prompt listed below to gemini-1.0-pro-001.

Code example
Stack trace
Expected Behavior
The chunks returned are as follows:
Instead of getting an error, I would expect these two chunks to be successfully merged into the following. Alternatively, I would expect the error message to be less cryptic.
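The expected merge behavior described above can be sketched as a role check that tolerates an empty role on either side and inherits the non-empty one. This is the reporter's expectation, not the library's current behavior, and the function below is illustrative only:

```python
# Tolerant role merge: an empty role inherits the other side's role,
# so a trailing chunk with role "" merges cleanly into a "model" response.
def merge_roles(base_role: str, new_role: str) -> str:
    if not base_role:
        return new_role
    if not new_role:
        return base_role
    if base_role != new_role:
        # Only a genuine conflict (e.g. "model" vs "user") should raise.
        raise ValueError(
            f"Content roles do not match: {base_role} != {new_role}"
        )
    return base_role

print(merge_roles("model", ""))   # model
print(merge_roles("", "model"))   # model
```

With a check like this, the two chunks in the report would merge successfully, and the error would only fire on a real role conflict, making the message meaningful.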
Edit: Changed the example prompt to a shorter one.