
I can not configure the limit for litellm #2787

@m-da-costa

Description

Sorry if this is a basic question, but I cannot configure the context window (limits) for my litellm instance. See the example below:
config.json

    "litellm": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://my.domain.com/v1"
      },
      "models": {
        "cerebras/qwen-3-coder-480b": {
          "tools": true,
          "reasoning": true,
          "context_window": 120000,
          "limit": {
            "context": 120000,
            "output": 64000
          }
        }
      }
    }
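Whatever the correct schema keys turn out to be, the file must first parse as strict JSON (no trailing commas, double-quoted keys). A quick sanity check using only the keys from the fragment above — the wrapping object is added here just so the fragment parses on its own:

```python
import json

# The provider fragment from this issue, wrapped in braces so it is a
# complete JSON document. Strict JSON forbids trailing commas, so none
# appear here.
config_text = """
{
  "litellm": {
    "npm": "@ai-sdk/openai-compatible",
    "options": {
      "baseURL": "https://my.domain.com/v1"
    },
    "models": {
      "cerebras/qwen-3-coder-480b": {
        "tools": true,
        "reasoning": true,
        "context_window": 120000,
        "limit": {
          "context": 120000,
          "output": 64000
        }
      }
    }
  }
}
"""

# json.loads raises json.JSONDecodeError on any syntax error,
# e.g. a trailing comma after the model object.
config = json.loads(config_text)

limit = config["litellm"]["models"]["cerebras/qwen-3-coder-480b"]["limit"]
print(limit["context"], limit["output"])  # 120000 64000
```

This only verifies the syntax; whether `context_window` or `limit.context` is the key the tool actually reads is the open question of this issue.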