Avoid side effects if chat message contains images #158
The Issue
Generating chat responses with images has a rather nasty side effect: if a message contains images, they will be overwritten in place by their Base64 encodings. If a user passed an image path, it is undesirable to have that path replaced by an extremely long Base64 string.
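For concreteness, here is a minimal reproduction sketch (the model name and image path are placeholders; any multimodal model should show the same behavior):

```python
import ollama

# Placeholder model and image path for illustration.
messages = [{'role': 'user', 'content': 'Describe this image.', 'images': ['./photo.png']}]

ollama.chat(model='llava', messages=messages)

# Side effect: the caller's own dict was mutated in place; the short
# path has been replaced by a multi-kilobyte Base64 string.
print(messages[0]['images'][0][:40])
```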
The Cause
Most Python users will pass `messages` as a sequence of dictionaries. Dictionaries are inherently mutable objects that are passed by reference, which is why the in-place conversion at
ollama-python/ollama/_client.py
Lines 174 to 175 in cb81f52
also affects the user's list/tuple of messages.
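For illustration, here is a self-contained sketch of the mechanism; the function and values are made up, and no Ollama code is involved:

```python
def encode_in_place(messages):
    # Stand-in for the in-place conversion in _client.py.
    for message in messages:
        message['images'] = ['<very long Base64 string>']

user_messages = [{'role': 'user', 'content': 'hi', 'images': ['./photo.png']}]
encode_in_place(user_messages)

# The function received references to the caller's dict objects,
# so the caller's data changed as well:
print(user_messages[0]['images'])  # ['<very long Base64 string>']
```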
The Solution
Obviously this can be avoided on the user's end by simply making a deep copy themselves before passing `messages`. But because this is a rather obscure issue that is never mentioned in the documentation, I recommend simply removing the possibility of this side effect. There are multiple possible solutions, but because the performance cost of making a deep copy is insignificant compared to that of LLM inference, I went with the easiest fix: making a deep copy of `messages` every time `ollama.chat` is called. I am looking forward to hearing the opinions of the Ollama maintainers.
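A sketch of what this fix could look like (the function body and helper logic here are illustrative, not the actual `_client.py` code):

```python
import base64
from copy import deepcopy
from pathlib import Path


def chat(model, messages=None, **kwargs):
    # Proposed fix (sketch): deep-copy first, so the in-place Base64
    # conversion below can never leak back into the caller's dicts.
    messages = deepcopy(messages) if messages is not None else []
    for message in messages:
        if images := message.get('images'):
            message['images'] = [
                base64.b64encode(Path(image).read_bytes()).decode()
                for image in images
            ]
    # ...the rest of the request logic operates on the private copy...
```

A deep copy also covers nested structures such as the `images` lists themselves, which a shallow copy of the outer list would still share with the caller.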