[Bug][langchain_mistralai] KeyError: 'choices' in streaming response when using Vertex AI Model Garden Mistral integration #27886
Labels
🤖:bug
Related to a bug, vulnerability, unexpected error with an existing feature
Checked other resources
Example Code
The following code:
Error Message and Stack Trace (if applicable)
Printed chunks from the _stream() and _astream() methods in langchain_mistralai/chat_models.py:
```
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': 'Hello'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '!'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' How'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' can'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' I'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' assist'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' today'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' Let'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' me'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' know'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' if'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' have'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' any'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' questions'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' or'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' topics'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': "'"}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': 'd'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' like'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' to'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' discuss'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '.'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' '}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '😊'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ''}, 'finish_reason': 'stop', 'logprobs': None}], 'usage': {'prompt_tokens': 5, 'total_tokens': 33, 'completion_tokens': 28}}
{'usage': {'output_tokens': 0}}
```
Error message
```
  File ".../assistant-chatbot.py", line 110, in
    llm.invoke("hello")
  File ".../langchain_core/language_models/chat_models.py", line 286, in invoke
    self.generate_prompt(
  File ".../langchain_core/language_models/chat_models.py", line 786, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../langchain_core/language_models/chat_models.py", line 643, in generate
    raise e
  File ".../langchain_core/language_models/chat_models.py", line 633, in generate
    self._generate_with_cache(
  File ".../langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File ".../langchain_mistralai/chat_models.py", line 533, in _generate
    return generate_from_stream(stream_iter)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../langchain_core/language_models/chat_models.py", line 90, in generate_from_stream
    generation += list(stream)
                  ^^^^^^^^^^^^
  File ".../langchain_mistralai/chat_models.py", line 591, in _stream
    if len(chunk["choices"]) == 0:
           ~~~~~^^^^^^^^^^^
KeyError: 'choices'
```
Description
I encountered a bug while using the Mistral AI model through Google Cloud's Vertex AI Model Garden integration with LangChain. The issue occurs in the streaming implementation when handling chunks from the model's response.
Current Behavior
When streaming responses from the Mistral model through Vertex AI, the code fails with a KeyError when trying to access the "choices" key. Looking at the actual stream of chunks, most chunks have the expected structure, with a 'choices' list containing a delta, for example:

{'id': '...', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': 'Hello'}, 'finish_reason': None, 'logprobs': None}]}

However, the last chunk in the stream has a different structure and carries only usage statistics:

{'usage': {'output_tokens': 0}}

This last chunk causes the KeyError because it does not contain the 'choices' key that the code expects. The current implementation of the _stream and _astream methods accesses chunk["choices"] directly, assuming the key always exists, so the code crashes when it processes the final usage-statistics chunk.
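The failure mode can be reproduced without any network calls, using chunk dictionaries shaped like the ones printed above (values abbreviated; this is an illustrative sketch, not the library's actual loop):

```python
# Chunks shaped like the Vertex AI Model Garden stream shown above:
chunks = [
    {"object": "chat.completion.chunk",
     "choices": [{"index": 0, "delta": {"content": "Hello"},
                  "finish_reason": None, "logprobs": None}]},
    {"usage": {"output_tokens": 0}},  # final chunk: no 'choices' key
]

caught = None
for chunk in chunks:
    try:
        # Same direct access as the failing check in _stream:
        if len(chunk["choices"]) == 0:
            continue
    except KeyError as err:
        caught = err
        break

print(repr(caught))  # → KeyError('choices')
```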
Proposed Solution
The issue can be fixed by safely accessing the "choices" key with dict.get() and a default empty list, i.e. chunk.get("choices", []) instead of chunk["choices"]. This change would skip the final usage-statistics chunk (which has no 'choices') instead of raising, while leaving the handling of normal content chunks unchanged.
Impact
This issue affects users running the Mistral model through Vertex AI's Model Garden with streaming enabled. The bug prevents streaming responses from completing, even though the model itself works correctly and generates proper output.
Additional Context
The full stream of chunks shows the model working correctly and generating proper responses; the failure lies purely in how the LangChain integration code handles the final usage-statistics chunk.
System Info
System Information
Package Information