Conversation

@Qard Qard commented Jan 10, 2026

What does this PR do?

Attempts to auto-discover models from OpenAI-compatible providers so that local LLM providers like Ollama and LM Studio can show their model list without each model having to be added manually. This does not auto-discover the provider itself, nor does it modify model entries that already exist, so it should be non-destructive to any already-provided model data.
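
A minimal sketch of how this kind of discovery could work, assuming the provider exposes the standard OpenAI-compatible /v1/models endpoint (as Ollama and LM Studio do); the function and type names below are illustrative, not the PR's actual code:

    // Hypothetical sketch, not the PR's implementation.
    interface ModelEntry {
      id: string;
      name?: string;
    }

    // Query an OpenAI-compatible provider for its model list.
    async function discoverModels(baseURL: string, apiKey?: string): Promise<ModelEntry[]> {
      const res = await fetch(`${baseURL.replace(/\/$/, "")}/v1/models`, {
        headers: apiKey ? { Authorization: `Bearer ${apiKey}` } : {},
      });
      if (!res.ok) return []; // provider may not support /v1/models; skip discovery
      const body = (await res.json()) as { data?: { id: string }[] };
      return (body.data ?? []).map((m) => ({ id: m.id }));
    }

    // Merge discovered models into existing config without overwriting
    // entries the user has already defined manually.
    function mergeModels(
      existing: Record<string, ModelEntry>,
      discovered: ModelEntry[],
    ): Record<string, ModelEntry> {
      const merged = { ...existing };
      for (const model of discovered) {
        if (!(model.id in merged)) merged[model.id] = model;
      }
      return merged;
    }

The key property is the merge step: discovered models are only added when no entry with the same id exists, which is what makes the change non-destructive to manually configured models.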

How did you verify your code works?

I ran this locally against my Ollama instance and it showed all my local models; switching to models with auto-generated configs worked just fine with this change.

@github-actions
Contributor

Hey! Your PR title "core: auto-discover models from OpenAI-compatible local providers" doesn't follow conventional commit format.

Please update it to start with one of:

  • feat: or feat(scope): new feature
  • fix: or fix(scope): bug fix
  • docs: or docs(scope): documentation changes
  • chore: or chore(scope): maintenance tasks
  • refactor: or refactor(scope): code refactoring
  • test: or test(scope): adding or updating tests

Where scope is the package name (e.g., app, desktop, opencode).

See CONTRIBUTING.md for details.

@github-actions
Contributor

The following comment was made by an LLM, it may be inaccurate:

Based on my search, I found two potentially related PRs:

  1. PR #3427: "Fetch from model list from /v1/models/ from OpenAI Compatible APIs"

  2. PR #3726: "Adding the auto-detection of ollama local with a variable for baseURL"

These PRs may have overlapping functionality or could provide context on previous attempts to solve similar auto-discovery issues.

@Qard Qard closed this Jan 10, 2026