Reopening Issue #6589
This issue was closed as stale in #6589, but the problem still exists and is being actively reported by multiple users in the most recent comments on that thread (posted after closure).
Problem Summary
Roo Code cannot connect to Ollama servers on non-localhost addresses (e.g., http://10.3.4.5:11434 or other LAN IPs). Connections to localhost work fine, and other AI tools can reach the same Ollama servers, so the problem is specific to Roo Code.
Evidence from Original Thread
From the comments AFTER the issue was closed:
- john970: "I also have this issue. I'd bet this happens to anyone who is trying to use ollama on their LAN and not localhost on either macOS or Linux."
- oliver124810: Described it as a UI issue that takes trial and error to get working
- kneczaj: Posted a screenshot (2 days ago) showing Ollama is accessible, but Roo Code can't connect
The original reporter confirmed the issue still exists on macOS even after the refactor to use the official Ollama SDK.
Reproduction Steps
- Set up an Ollama server on a non-localhost address (e.g., a LAN IP like http://192.168.1.100:11434)
- Verify Ollama is accessible via curl or other AI tools (see the curl example below)
- In Roo Code, select Ollama provider
- Enter the external URL
- Attempt to select a model
Expected: Roo Code connects and lists the available models
Actual: An error stating the model is not available
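For step 2, a minimal reachability check from the machine running Roo Code (this assumes the example LAN IP above; `/api/tags` is Ollama's model-listing endpoint):

```sh
# Should return a JSON list of installed models if the server is reachable
# (replace 192.168.1.100 with your Ollama host's LAN IP)
curl http://192.168.1.100:11434/api/tags
```

If this returns the model list but Roo Code still fails against the same URL, the server side is fine and the problem is in Roo Code's connection handling.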
Workarounds Found
Users have found these workarounds (which shouldn't be necessary):
- SSH tunnel: `ssh -L 11434:localhost:11434 user@remote_server_ip` (see the sketch after this list)
- Using a VPN/tunnel from a different network
- Using ngrok
- Switching to the OpenAI Compatible provider with `/v1` appended to the URL
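A sketch of the command-line workarounds, assuming the same example addresses as above (`user@remote_server_ip` is a placeholder for the actual server):

```sh
# SSH tunnel: forward local port 11434 to the remote Ollama server,
# so Roo Code can be pointed at http://localhost:11434 as if Ollama were local
ssh -L 11434:localhost:11434 user@remote_server_ip

# OpenAI Compatible provider: in Roo Code, select "OpenAI Compatible"
# and use the Ollama base URL with /v1 appended, e.g.:
#   http://192.168.1.100:11434/v1
```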
Request
This issue will be persistently reopened until either:
- The bug is properly fixed, or
- A clear technical reason is given for why external Ollama servers cannot or will not be supported
Closing as "stale" while users are actively reporting the issue is not acceptable. If this is a won't-fix, please state it clearly in the documentation as a known limitation.
Environment
- Affects: macOS (confirmed), Linux (likely)
- Ollama: Latest versions
- Other tools: Can connect to same Ollama servers without issue
Reference: #6589