Description
When using the Anthropic provider near the upper limit of the available context length, we can end up in scenarios where the input tokens plus the `max_tokens` value exceed the maximum supported context length for a given model.
To fix this, we need to take the input length into account and set an appropriate `max_tokens` value before sending requests to the Anthropic API.
Example error
400 Bad Request
{
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "input length and `max_tokens` exceed context limit: 178959 + 64000 > 200000, decrease input length or `max_tokens` and try again"
  },
  "request_id": "req_011CUB2XUHWkFnQzxcKBbjpF"
}
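
The fix described above could be sketched roughly as follows. This is illustrative only: the `CONTEXT_LIMITS` table and `clamp_max_tokens` helper are assumed names, not actual provider code, and the 200,000-token limit is taken from the error message above.

```python
# Illustrative sketch (hypothetical names): clamp max_tokens so that
# input tokens + output tokens stay within the model's context window.

CONTEXT_LIMITS = {
    # Assumed limit, taken from the "> 200000" in the error message above.
    "claude-model": 200_000,
}


def clamp_max_tokens(model: str, input_tokens: int,
                     requested_max_tokens: int, minimum: int = 1) -> int:
    """Return a max_tokens value that fits the model's context limit,
    raising if even `minimum` output tokens would not fit."""
    limit = CONTEXT_LIMITS[model]
    available = limit - input_tokens
    if available < minimum:
        raise ValueError(
            f"input ({input_tokens} tokens) leaves no room for output "
            f"within the {limit}-token context limit"
        )
    return min(requested_max_tokens, available)


# Using the numbers from the error above: 178959 input + 64000 requested.
print(clamp_max_tokens("claude-model", 178_959, 64_000))  # → 21041
```

With this clamp applied before the request, the example above would send `max_tokens=21041` instead of 64000, keeping the total at exactly the 200,000-token limit rather than triggering a 400 response.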