Description
Is it a request payload issue?
- [ ] Yes, this is a request payload issue. I am using a client/cURL to send a request payload, but I received an unexpected error.
- [x] No, it's another issue.
Describe the bug
When adding an OpenAI-compatible provider, CLIProxyAPIPlus appears to hardcode OpenAI’s model-discovery endpoint as GET /v1/models and /v1/chat/completions.
This breaks providers that expose an OpenAI-like API but use a different base path/versioning. In my case, Z.ai uses a v4 API and lists models at:
https://api.z.ai/api/coding/paas/v4/models
However, the UI tries to call:
https://api.z.ai/api/coding/paas/v4/v1/models
This results in 401 Authorization Token Missing in the UI, while calling Z.ai’s /models endpoint directly with the same token works.
CLI Type
openai-compatibility
Model Name
glm-4.5 (also tested: glm-4.5-air)
LLM Client
CLIProxyAPIPlus Web UI ("AI Providers" -> "Add OpenAI Compatible Provider")
Request Information
Working (Z.ai native models endpoint):

```shell
curl -H "Authorization: Bearer $ZAI_API_KEY" \
  "https://api.z.ai/api/coding/paas/v4/models"
```

Failing (what CLIProxyAPIPlus tries to call when picking models):

```
GET https://api.z.ai/api/coding/paas/v4/v1/models
-> 401 Authorization Token Missing
```
Expected behavior
- Allow configuring the models listing endpoint (or disable auto-discovery).
- Alternatively, treat the provider Base URL as fully authoritative (do not append /v1/models unconditionally), or provide a way to override the appended path.
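To illustrate the requested behavior, here is a minimal shell sketch of the URL-building logic. This is not CLIProxyAPIPlus code; the function name and the override parameter are hypothetical, showing how an optional models-path override would keep the base URL authoritative while preserving the current `/v1/models` default:

```shell
# Hypothetical sketch: build the model-listing URL from the provider base
# URL, appending the OpenAI default "/v1/models" only when no explicit
# override is configured.
models_url() {
  base="${1%/}"    # provider base URL, trailing slash stripped
  override="$2"    # optional models path override, e.g. "/models"
  if [ -n "$override" ]; then
    printf '%s%s\n' "$base" "$override"
  else
    printf '%s/v1/models\n' "$base"   # current hardcoded default
  fi
}

# With an override, Z.ai's v4 endpoint is reached correctly:
models_url "https://api.z.ai/api/coding/paas/v4" "/models"
# prints https://api.z.ai/api/coding/paas/v4/models

# Without one, today's behavior appends /v1/models:
models_url "https://api.z.ai/api/coding/paas/v4" ""
# prints https://api.z.ai/api/coding/paas/v4/v1/models
```

An override field (or simply skipping discovery when it fails) would unblock providers like Z.ai without changing behavior for standard OpenAI-compatible endpoints.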
OS Type
- OS: Debian 13 (Docker)
- Version: v6.6.108-0-plus, Commit: d611134, BuiltAt: 2026-01-15T05:31:56Z
Additional context
Running the container image:
docker.io/eceasy/cli-proxy-api-plus:latest