Description
ClawRouter continues consuming credits despite primary model set to Codex
Problem Description
After configuring openai-codex/gpt-5.3-codex as the primary model in the configuration file, ClawRouter continues to intercept and consume credits for every request, completely ignoring the model setting.
Documented Credit Loss
Time | Remaining Credit | Loss
-- | -- | --
11:42 | $5.45 | -
11:56 | $4.02 | -$1.43
12:01 | $3.78 | -$0.24
12:07 | $3.19 | -$0.59
Total | | -$2.26+

Current Configuration (agents.defaults.model)
```json
{
  "primary": "codex",
  "fallbacks": ["auto", "openrouter/auto", "gemini-2.5-flash"]
}
```
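For reference, the object above is the value of `agents.defaults.model`; a minimal sketch of how it nests inside the config file (the surrounding structure is assumed from the dotted path, not verified against the actual schema):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "codex",
        "fallbacks": ["auto", "openrouter/auto", "gemini-2.5-flash"]
      }
    }
  }
}
```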
Expected Behavior
- Requests should use OpenAI Codex (OAuth authentication, included in subscription)
- No ClawRouter credit consumption for normal chat requests
Actual Behavior
- `[ClawRouter] Received model: "auto"` in logs despite `primary: "codex"`
- Usage of paid models (`xai/grok-4-fast-reasoning`, `moonshot/kimi-k2.5`)
- Cost per call: ~$0.0033
- Current session shows `model: "auto"` instead of `"codex"`
Relevant Logs
```
[ClawRouter] Received model: "auto" -> normalized: "auto", isAuto: true
[ClawRouter] Tools detected (23), forcing agentic mode
[ClawRouter] Trying model 1/2: moonshot/kimi-k2.5
[ClawRouter] Success with model: moonshot/kimi-k2.5
[plugins] [REASONING] moonshot/kimi-k2.5 $0.0033 (saved 99%)
```
Steps to Reproduce
- Configure `primary: "codex"` in `agents.defaults.model`
- Restart OpenClaw (SIGUSR1)
- Verify `session_status` shows `model: "openai-codex/gpt-5.3-codex"`
- Execute any request
- Observe logs still show `[ClawRouter] Received model: "auto"`
Attempted Solutions
- Changed `primary` from `blockrun/auto` to `codex`
- Removed `auto` from fallbacks (resulting config sketched below)
- Set `session_status` with `model: "codex"` for the current session
- Restarted OpenClaw multiple times
- Completely disabling the ClawRouter plugin: not attempted, for fear of losing functionality
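For clarity, after the first two changes above the model block should look roughly like this (whether `openrouter/auto` was also removed isn't recorded, so treat the exact fallback list as an assumption):

```json
{
  "primary": "codex",
  "fallbacks": ["openrouter/auto", "gemini-2.5-flash"]
}
```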
Environment
- OpenClaw Version: 2026.2.9 (33c75cb)
- OS: Linux 6.12.63+deb13-amd64 (x64)
- Node: v22.22.0
Additional Context
The issue appears to be that even when the session status shows the correct model (openai-codex/gpt-5.3-codex), the underlying tool calls or agent processing still routes through ClawRouter with model: "auto". This suggests the model override is not being properly propagated to all subsystems, particularly when tools are involved ("Tools detected (23), forcing agentic mode").
Possible Root Causes
- Tool calls force agentic mode, which overrides the user model preference
- Session model override doesn't propagate to the ClawRouter plugin
- `auto` alias in fallbacks causes ClawRouter to activate despite primary being set
- Configuration patch applies to defaults but active sessions retain a cached model
Request
Please investigate why ClawRouter continues to intercept requests when a specific non-ClawRouter primary model is configured. The expected behavior is that an explicit model selection bypasses ClawRouter entirely unless routing is explicitly requested via `/model auto`.