GLM 4.5 OR Chutes AI error 400 #6806

@markp018

Description

App Version

3.25.8

API Provider

OpenRouter

Model Used

Z.ai/GLM 4.5

Roo Code Task Links (Optional)

No response

🔁 Steps to Reproduce

Just ask any question; the error appears on every request.

💥 Outcome Summary

Error:

{
  "object": "error",
  "message": "Requested token count exceeds the model's maximum context length of 98304 tokens. You requested a total of 115623 tokens: 10527 tokens from the input messages and 105096 tokens for the completion. Please reduce the number of tokens in the input messages or the completion to fit within the limit.",
  "type": "BadRequestError",
  "param": null,
  "code": 400
}

When I switch back to v3.25.7, the problem goes away.
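For reference, the arithmetic behind the 400 error can be sketched as a simple clamp: the completion budget must never exceed the model's context window minus the input tokens. This is a hypothetical illustration using the numbers from the error message above, not Roo Code's actual implementation; the function name and constant are assumptions.

```python
# Hypothetical sketch: clamp the requested completion budget so that
# input tokens + completion tokens never exceed the context window.
# The limit and token counts below come from the error in this issue.

CONTEXT_LIMIT = 98_304  # model's maximum context length, per the error


def clamp_max_tokens(input_tokens: int, requested_completion: int) -> int:
    """Return a completion budget that fits within the context window."""
    available = CONTEXT_LIMIT - input_tokens
    return max(0, min(requested_completion, available))


# Values from the 400 error: 10,527 input tokens, 105,096 requested
# completion tokens (10,527 + 105,096 = 115,623 > 98,304).
print(clamp_max_tokens(10_527, 105_096))  # 87,777 would fit; 105,096 did not
```

A client that applied this clamp before sending the request would have stayed under the 98,304-token limit instead of receiving the BadRequestError.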

📄 Relevant Logs or Errors (Optional)

Metadata

Assignees

No one assigned

    Labels

    Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.), bug (Something isn't working)

    Type

    No type

    Projects

    Status

    Done

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
