
ollama._types.ResponseError: an error was encountered while running the model: unexpected EOF (status code: -1) #342

Open
Alen-Zeng opened this issue Mar 7, 2025 · 0 comments


Alen-Zeng commented Mar 7, 2025

This project is awesome! I'm trying it out; I simply asked it to make up a story.

(screenshot attached)

I ran a local LLM deepseek-v2 using Ollama, but I encountered the following error:

Error:
an error was encountered while running the model: unexpected EOF (status code: -1)
Traceback (most recent call last):
  File "/a0/agent.py", line 287, in monologue
    agent_response = await self.call_chat_model(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^

>>>  1 stack lines skipped <<<

  File "/opt/venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3387, in _atransform
    async for output in final_pipeline:
  File "/opt/venv/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1473, in atransform
    async for output in self.astream(final, config, **kwargs):
  File "/opt/venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 509, in astream
    async for chunk in self._astream(
  File "/opt/venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 755, in _astream
    async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
  File "/opt/venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 575, in _acreate_chat_stream
    async for part in await self._async_client.chat(**chat_params):
  File "/opt/venv/lib/python3.11/site-packages/ollama/_client.py", line 677, in inner
    raise ResponseError(err)
ollama._types.ResponseError: an error was encountered while running the model: unexpected EOF (status code: -1)
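The `ResponseError` at the bottom of the traceback just surfaces the Ollama server closing the response stream mid-generation ("unexpected EOF"). While the root cause is investigated (often resource pressure on the model host), one workaround for transient drops is retrying the chat call. A minimal sketch of that idea, where `retry` and the `flaky_chat` stand-in are hypothetical helpers, not part of this project or the ollama client:

```python
import time

def retry(fn, attempts=3, delay=0.0, retry_on=(Exception,)):
    """Call fn(); re-run it up to `attempts` times if it raises `retry_on`.

    In practice fn would wrap the ollama chat call and retry_on would be
    (ollama.ResponseError,); a generic stand-in is used here.
    """
    last_err = None
    for _ in range(attempts):
        try:
            return fn()
        except retry_on as err:
            last_err = err
            time.sleep(delay)
    raise last_err

# Stand-in for a model call that drops the stream twice, then succeeds.
calls = {"n": 0}
def flaky_chat():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("unexpected EOF")  # stand-in for ResponseError
    return "story text"

print(retry(flaky_chat, attempts=3))  # prints: story text
```

Note this only papers over intermittent failures; if the server kills the stream on every request (e.g. the model does not fit in memory), the retry will exhaust its attempts and re-raise the last error.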

Relevant information:

  • Ubuntu 20.04
  • Docker info:
Client: Docker Engine - Community
 Version:           28.0.1
 API version:       1.47 (downgraded from 1.48)
 Go version:        go1.23.6
 Git commit:        068a01e
 Built:             Wed Feb 26 10:41:20 2025
 OS/Arch:           linux/amd64
 Context:           desktop-linux

Server: Docker Desktop 4.38.0 (181591)
 Engine:
  Version:          27.5.1
  API version:      1.47 (minimum version 1.24)
  Go version:       go1.22.11
  Git commit:       4c9b3b0
  Built:            Wed Jan 22 13:41:17 2025
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.7.25
  GitCommit:        bcc810d6b9066471b0b6fa75f557a15a1cbf31bb
 runc:
  Version:          1.1.12
  GitCommit:        v1.1.12-0-g51d5e946
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0