
RFC: Always set temperature to default? #1913

@neubig

Description


Currently we set temperature to 0.0 for most models:

```python
def get_default_temperature(model: str) -> float:
    ...
```
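For context, a minimal sketch of what such a helper could look like — only the 0.0 default is stated above; the reasoning-model special case and the model list are assumptions for illustration, not the actual implementation:

```python
# Hypothetical sketch; only the 0.0 default comes from the issue text.
REASONING_MODELS = ("o1", "o3", "gpt-5")  # assumed list, for illustration

def get_default_temperature(model: str) -> float:
    # Some reasoning models only accept their provider default
    # (typically 1.0) and reject an explicit temperature=0.0.
    if any(name in model for name in REASONING_MODELS):
        return 1.0
    # Deterministic default used for most models.
    return 0.0
```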

This gets overridden by litellm in some cases, such as for gpt-5: https://github.com/BerriAI/litellm/blob/d2e113111f226f1e4d1c671b196beaf7248ea4e6/litellm/llms/openai/chat/gpt_5_transformation.py
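To make the interaction concrete, here is a hedged illustration of where the override bites. `litellm.drop_params` is a real litellm setting, but whether an unsupported temperature is silently dropped or surfaces as a provider error is version- and model-dependent:

```python
import litellm

# With drop_params enabled, litellm can strip parameters a model does
# not support instead of forwarding them to the provider.
litellm.drop_params = True

resp = litellm.completion(
    model="gpt-5",
    messages=[{"role": "user", "content": "hello"}],
    temperature=0.0,  # gpt-5 only accepts its default; litellm may drop or rewrite this
)
```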

And sometimes we get errors because the temperature is set to an invalid value for the model.

I wonder if we should just always use the default temperature for all models?
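As I read it, the proposal amounts to never sending `temperature` unless the user explicitly sets one, so each provider falls back to its own valid default. A minimal sketch of that behavior (`temperature_override` and `build_completion_kwargs` are hypothetical names, not from the codebase):

```python
from typing import Any

def build_completion_kwargs(
    model: str,
    messages: list[dict[str, str]],
    temperature_override: float | None = None,  # hypothetical opt-in knob
) -> dict[str, Any]:
    kwargs: dict[str, Any] = {"model": model, "messages": messages}
    # Only send temperature when the user explicitly set one; otherwise
    # omit it entirely and let the provider apply its own default.
    if temperature_override is not None:
        kwargs["temperature"] = temperature_override
    return kwargs
```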


Labels

enhancement (New feature or request), llm (About LLMs)
