
Roo code won't connect to external Ollama server #6589

@VincentJGeisler

Description


App Version

3.25.6

API Provider

Ollama

Model Used

qwen3-code-32b:latest

Roo Code Task Links (Optional)

Roo Code won't connect to the server (non-localhost); it complains that the model isn't available.

I confirmed this is not an Ollama problem by connecting with a different chatbot, which had zero problems with Ollama or the model itself.

"The model ID (Qwen3-Coder-30b:Latest) you provided is not available. Please choose a different model."

Is this maybe an issue where localhost has been hardcoded, rather than honoring the external IP address entered in the field?


🔁 Steps to Reproduce

Open a new profile
Select Ollama
Enter an external URL, e.g. http://10.3.4.5:11434
Attempt to type in an existing model name

Observe the failure
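One way to narrow this down is to query the server's documented `/api/tags` endpoint from the same machine Roo Code runs on: it confirms the external host is reachable and shows the exact model IDs Ollama advertises (the names are matched exactly, including case). This is a diagnostic sketch; the IP is the example address from the steps above, and `list_remote_models` is a hypothetical helper, not part of Roo Code.

```python
import json
import urllib.request

# Example external server address from the steps above; adjust to your host.
OLLAMA_URL = "http://10.3.4.5:11434"


def model_names(tags_response: dict) -> list[str]:
    """Extract the model IDs from an Ollama /api/tags response payload."""
    return [m["name"] for m in tags_response.get("models", [])]


def list_remote_models(base_url: str) -> list[str]:
    """Fetch the tag list from the server and return its model IDs."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))


if __name__ == "__main__":
    # Compare this list against the ID Roo Code rejects; a mismatch in
    # spelling or case would explain the "model not available" error.
    print(list_remote_models(OLLAMA_URL))
```

If this call times out or is refused from the Roo Code machine, the problem is network reachability rather than the extension's URL handling.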

💥 Outcome Summary

Expected to connect to the model; got an error instead.

📄 Relevant Logs or Errors (Optional)

The model ID (Qwen3-Coder-30b:Latest) you provided is not available. Please choose a different model

Metadata


Assignees

No one assigned

    Labels

    Issue - Needs Info: Missing details or unclear. Waiting on author to provide more context.
    bug: Something isn't working

    Type

    No type

    Projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
