
LiteLLM provider using OpenAI model name defaults #4221

@myaple

Description

Describe the bug

The LiteLLM provider uses crates/goose/src/providers/formats/openai.rs as its create_request (https://github.com/block/goose/blob/main/crates/goose/src/providers/litellm.rs#L170), but this create_request function performs very OpenAI-specific processing, for example here: https://github.com/block/goose/blob/main/crates/goose/src/providers/formats/openai.rs#L580

This means that if my LiteLLM proxy serves a model whose name starts with "o", such as "open-mistral-small-3.1", goose will send a reasoning_effort parameter regardless of whether the model is actually a reasoning model.
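
To make the failure mode concrete, here is a minimal sketch of the kind of prefix check described above. It is a paraphrase for illustration only, not the actual code at openai.rs#L580, and the function names are invented:

```rust
use serde_json::{json, Value};

// Illustrative paraphrase of the behavior described above, not the
// actual goose code: a broad "o" prefix check treats any model whose
// name starts with "o" as an OpenAI reasoning model.
fn is_reasoning_model(model_name: &str) -> bool {
    model_name.starts_with('o')
}

fn build_payload(model_name: &str) -> Value {
    let mut payload = json!({ "model": model_name });
    if is_reasoning_model(model_name) {
        // "open-mistral-small-3.1" also starts with "o", so the
        // reasoning_effort parameter gets attached for it as well.
        payload["reasoning_effort"] = json!("medium");
    }
    payload
}
```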

To Reproduce
Steps to reproduce the behavior:

  1. Stand up LiteLLM with a model whose name starts with "o"
  2. Send a request through goose
  3. Observe that goose sends a reasoning_effort parameter regardless of the model

I've written a quick test for this behavior that fails on the current main branch: https://github.com/myaple/goose/blob/myaple/litellm-reasoning-bug/crates/goose/src/providers/litellm.rs#L339
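
For reference, here is a minimal analogue of that test, written against the illustrative build_payload sketch above rather than the provider's real create_request, so the names are assumptions and not goose's API:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    // Mirrors the idea of the linked test: building a request for a
    // non-reasoning model must not attach reasoning_effort. With the
    // "o" prefix check sketched above, this assertion fails.
    #[test]
    fn non_reasoning_model_omits_reasoning_effort() {
        let payload = build_payload("open-mistral-small-3.1");
        assert!(
            payload.get("reasoning_effort").is_none(),
            "reasoning_effort should not be sent for open-mistral-small-3.1"
        );
    }
}
```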

Expected behavior
Reasoning parameters should only be sent to models that support reasoning. Additionally, there should be an environment variable to override the reasoning parameters for OpenAI models or disable them completely.
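
As a rough sketch of what that could look like (every name here, including the environment variable, is invented for illustration and is not existing goose configuration):

```rust
use std::env;

// Hypothetical guard, not existing goose code: attach a reasoning
// effort only for known OpenAI reasoning models, and let an
// (invented) environment variable override or disable it entirely.
fn reasoning_effort_for(model_name: &str) -> Option<String> {
    match env::var("GOOSE_OPENAI_REASONING_EFFORT").ok().as_deref() {
        Some("disabled") => return None,
        Some(effort) => return Some(effort.to_string()),
        None => {}
    }
    // Restrict the default to an explicit allowlist of o-series
    // models instead of matching any name that starts with "o".
    let reasoning_models = ["o1", "o3", "o4-mini"];
    reasoning_models
        .iter()
        .any(|m| model_name == *m || model_name.starts_with(&format!("{m}-")))
        .then(|| "medium".to_string())
}
```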

Please provide the following information:

  • OS & Arch: Ubuntu 24
  • Interface: CLI
  • Version: 1.5.0
  • Extensions enabled: n/a
  • Provider & Model: LiteLLM, open-mistral-small-3.1

Labels

p2 (Priority 2 - Medium)
