Currently we set temperature to 0.0 for most models:

```python
def get_default_temperature(model: str) -> float:
```
This gets overridden by litellm in some cases, such as for gpt-5: https://github.com/BerriAI/litellm/blob/d2e113111f226f1e4d1c671b196beaf7248ea4e6/litellm/llms/openai/chat/gpt_5_transformation.py
As a result, we sometimes get errors when the temperature is set to a value the model does not accept.
I wonder if we should just always use the provider's default temperature for all models?