Fix bug in building messages with async models
Co-authored-by: Sukhbinder Singh <sukhbindersingh@gmail.com>
sukhbinder and Sukhbinder Singh authored Dec 20, 2024
1 parent e5d981b commit 39f73a3
Showing 2 changed files with 4 additions and 2 deletions.
4 changes: 3 additions & 1 deletion llm_ollama.py
@@ -166,7 +166,9 @@ def build_messages(self, prompt, conversation):
                     for attachment in prev_response.attachments
                 ]

-            messages.append({"role": "assistant", "content": prev_response.text()})
+            messages.append(
+                {"role": "assistant", "content": prev_response.text_or_raise()}
+            )
         if prompt.system and prompt.system != current_system:
             messages.append({"role": "system", "content": prompt.system})
         messages.append({"role": "user", "content": prompt.prompt})
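The fix matters because, for async models, a response's `text()` accessor is an awaitable rather than a plain string, so calling it from the synchronous `build_messages` path would put a coroutine object into the message dict instead of the assistant's text. A sync accessor like `text_or_raise()` returns the completed text directly (or raises if the response is not ready). A minimal sketch of the distinction, using a hypothetical `AsyncResponse` class rather than the actual `llm` library types:

```python
import asyncio

class AsyncResponse:
    """Illustrative stand-in for an async model response (not the real llm API)."""

    def __init__(self, chunks):
        self._chunks = chunks
        self._done = True  # assume the response has already completed

    async def text(self):
        # Async accessor: must be awaited; calling it from sync code
        # returns a coroutine object, not a string.
        return "".join(self._chunks)

    def text_or_raise(self):
        # Sync accessor: returns the text if the response has completed,
        # otherwise raises instead of silently misbehaving.
        if not self._done:
            raise ValueError("response not yet complete")
        return "".join(self._chunks)

response = AsyncResponse(["Hello", ", world"])

# Calling text() without await yields a coroutine, which is what the
# pre-fix code would have stuffed into the "content" field.
assert asyncio.iscoroutine(response.text())

# The fixed code builds the message from the sync accessor instead.
message = {"role": "assistant", "content": response.text_or_raise()}
print(message["content"])  # -> Hello, world
```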
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "llm-ollama"
-version = "0.8.0"
+version = "0.8.1"
 description = "LLM plugin providing access to local Ollama models"
 readme = "README.md"
 authors = [{ name = "Sergey Alexandrov" }]
