Closed
Labels
Enhancement (New feature or request) · Issue - In Progress (Someone is actively working on this. Should link to a PR soon.) · feature request (Feature request, not a bug)
Description
Problem (one or two sentences)
The Roo Code extension currently lacks support for the zai-org/GLM-4.6-FP8 (200k context window) and meituan-longcat/LongCat-Flash-Thinking-FP8 (128k context window) models on the Chutes AI provider. These models perform on par with, or very close to, SOTA models while being much faster and cheaper.
Context (who is affected and when)
This affects all Roo Code users who need efficient models for complex reasoning, coding, and creative tasks. Neither model is currently selectable in Roo Code via the Chutes AI provider.
Desired behavior (conceptual, not technical)
The user should be able to see and select the zai-org/GLM-4.6-FP8 and meituan-longcat/LongCat-Flash-Thinking-FP8 models inside Roo Code, specifically under the Chutes AI provider.
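As a rough illustration of the request, the sketch below shows the kind of model metadata entries a provider integration typically needs. This is a hypothetical shape, not Roo Code's actual schema: the `ChutesModelInfo` interface and its field names are assumptions for illustration only; the model IDs and context-window sizes come from this issue.

```typescript
// Hypothetical sketch only — not Roo Code's real provider schema.
// Shows the two requested Chutes AI models with the context windows
// stated in this issue (200k and 128k tokens).
interface ChutesModelInfo {
  id: string;            // provider-side model identifier
  contextWindow: number; // maximum context size in tokens
  reasoning: boolean;    // whether the model is a reasoning/thinking model
}

const requestedChutesModels: ChutesModelInfo[] = [
  {
    id: "zai-org/GLM-4.6-FP8",
    contextWindow: 200_000,
    reasoning: true,
  },
  {
    id: "meituan-longcat/LongCat-Flash-Thinking-FP8",
    contextWindow: 128_000,
    reasoning: true,
  },
];
```

In practice the fix would likely be adding entries like these to wherever the Chutes AI provider's model list is defined, so the models appear in the extension's model picker.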
Constraints / preferences (optional)
No response
Request checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear context and impact
Roo Code Task Links (optional)
No response
Acceptance criteria (optional)
No response
Proposed approach (optional)
No response
Trade-offs / risks (optional)
No response
Status: Done