This project is awesome! I'm trying to use it, and I attempted to simply let it make up a story.
I'm running the local LLM deepseek-v2 via Ollama, but I encountered the following error:
Error
an error was encountered while running the model: unexpected EOF (status code: -1)
Traceback (most recent call last):
File "/a0/agent.py", line 287, in monologue
agent_response = await self.call_chat_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
>>> 1 stack lines skipped <<<
File "/opt/venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3387, in _atransform
async for output in final_pipeline:
File "/opt/venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1473, in atransform
async for output in self.astream(final, config, **kwargs):
File "/opt/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 509, in astream
async for chunk in self._astream(
File "/opt/venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 755, in _astream
async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
File "/opt/venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 575, in _acreate_chat_stream
async for part in await self._async_client.chat(**chat_params):
File "/opt/venv/lib/python3.11/site-packages/ollama/_client.py", line 677, in inner
raise ResponseError(err)
ollama._types.ResponseError: an error was encountered while running the model: unexpected EOF (status code: -1)
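
To help narrow down whether the "unexpected EOF" comes from the Ollama server itself rather than from the agent or langchain_ollama, here is a minimal sketch that calls the same async streaming chat endpoint directly. The model name deepseek-v2 matches the setup above; the host URL and the prompt are just assumptions, not the actual configuration.

```python
# Minimal repro sketch: stream a chat completion from Ollama directly,
# bypassing the agent and langchain_ollama.
import asyncio

from ollama import AsyncClient, ResponseError


async def main() -> None:
    # Assumed default local Ollama endpoint; adjust to the actual host if needed.
    client = AsyncClient(host="http://localhost:11434")
    try:
        # Same call path the traceback ends in: AsyncClient.chat with stream=True
        # returns an async iterator of response chunks.
        async for part in await client.chat(
            model="deepseek-v2",
            messages=[{"role": "user", "content": "Make up a short story."}],
            stream=True,
        ):
            print(part["message"]["content"], end="", flush=True)
        print()
    except ResponseError as err:
        # If this standalone call also fails with "unexpected EOF", the problem
        # is on the Ollama server / model-runner side rather than in the agent.
        print(f"\nOllama error: {err}")


asyncio.run(main())
```

If this standalone call reproduces the same error, the issue is between Ollama and the deepseek-v2 model runner rather than anything in this project's code.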
Relevant information:
Ubuntu 20.04
Docker info:
Client: Docker Engine - Community
 Version:           28.0.1
 API version:       1.47 (downgraded from 1.48)
 Go version:        go1.23.6
 Git commit:        068a01e
 Built:             Wed Feb 26 10:41:20 2025
 OS/Arch:           linux/amd64
 Context:           desktop-linux

Server: Docker Desktop 4.38.0 (181591)
 Engine:
  Version:          27.5.1
  API version:      1.47 (minimum version 1.24)
  Go version:       go1.22.11
  Git commit:       4c9b3b0
  Built:            Wed Jan 22 13:41:17 2025
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.7.25
  GitCommit:        bcc810d6b9066471b0b6fa75f557a15a1cbf31bb
 runc:
  Version:          1.1.12
  GitCommit:        v1.1.12-0-g51d5e946
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0