
Roo Code won't connect to external Ollama server (Reopening #6589) #11466

Description

@VincentJGeisler

Reopening Issue #6589

This issue was closed as stale in #6589; however, the problem still exists and is being actively reported by multiple users in the most recent comments on that thread (after closure).

Problem Summary

Roo Code cannot connect to Ollama servers hosted on non-localhost addresses (e.g., http://10.3.4.5:11434 or other LAN IPs). Connections to localhost work fine, and other AI tools can connect to the same remote Ollama servers (see the check below), which indicates the problem is specific to Roo Code.
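
For illustration, this is the kind of check that succeeds from other clients on the same network (the address below is an example LAN IP):

    curl http://192.168.1.100:11434/api/tags

If this returns the installed models as JSON, the Ollama server itself is reachable and the failure is on the Roo Code side.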

Evidence from Original Thread

From the comments AFTER the issue was closed:

  • john970: "I also have this issue. I'd bet this happens to anyone who is trying to use ollama on their LAN and not localhost on either macOS or Linux."
  • oliver124810: Described it as a UI issue that takes trial and error to get working
  • kneczaj: Posted a screenshot showing Ollama is accessible but Roo Code can't connect (2 days ago)

The original reporter confirmed the issue still exists on macOS even after the refactor to use the official Ollama SDK.

Reproduction Steps

  1. Set up an Ollama server on a non-localhost address (e.g., a LAN IP like http://192.168.1.100:11434; see the sketch after these steps)
  2. Verify Ollama is accessible via curl or other AI tools
  3. In Roo Code, select Ollama provider
  4. Enter the external URL
  5. Attempt to select a model

Expected: Roo Code connects and lists the available models
Actual: An error stating the model is not available
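
As a sketch of steps 1–2 (the addresses are examples; exact setup varies by install, but Ollama's bind address is normally controlled through the OLLAMA_HOST environment variable):

    # on the server: listen on all interfaces instead of 127.0.0.1 only
    OLLAMA_HOST=0.0.0.0:11434 ollama serve

    # on the server: confirm something is listening on a non-loopback address
    lsof -iTCP:11434 -sTCP:LISTEN

This rules out a simple server-side misconfiguration before pointing at the client.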

Workarounds Found

Users have found the following workarounds, none of which should be necessary (a sketch of two of them follows the list):

  • SSH tunnel: ssh -L 11434:localhost:11434 user@remote_server_ip
  • Using a VPN/tunnel from a different network
  • Using ngrok
  • Switching to the OpenAI Compatible provider with /v1 appended to the URL
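
Rough sketches of the tunnel and OpenAI-compatible workarounds (the hostname and LAN IP are examples):

    # forward the remote Ollama port to localhost, then point Roo Code at http://localhost:11434
    ssh -N -L 11434:localhost:11434 user@remote_server_ip

    # or hit Ollama's OpenAI-compatible endpoint directly; this should list models if it is reachable
    curl http://192.168.1.100:11434/v1/models

None of this should be required if the Ollama provider handled non-localhost URLs correctly.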

Request

This issue will continue to be reopened until either:

  1. The bug is properly fixed, or
  2. A sound technical reason is given for why external Ollama servers cannot or will not be supported

Closing as "stale" when users are actively reporting the issue is not acceptable. If this is a won't-fix, please document it clearly in the docs as a known limitation.

Environment

  • Affects: macOS (confirmed), Linux (likely)
  • Ollama: Latest versions
  • Other tools: Can connect to same Ollama servers without issue

Reference: #6589
