Describe the bug
When a session ends with an error, goose still exits with status 0 instead of a non-zero exit code:
```
❯ GOOSE_TEMPERATURE="0.3" goose
starting session | provider: databricks model: goose-gpt-5
logging to /Users/atish/.local/share/goose/sessions/20250910_154423.jsonl
working directory: /Users/atish/Development/make-repos-ai-ready
Goose is running! Enter your instructions, or try asking what goose can do.
( O)> hey
Session ended with error: Request failed: Request failed with status: 400 Bad Request. Message: {"external_model_provider":"openai","external_model_error":{"error":{"message":"Unsupported value: 'temperature' does not support 0.300...896 with this model. Only the default (1) value is supported.","type":"invalid_request_error","param":"temperature","code":"unsupported_value"}}}
❯ echo $status
0
```
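The prompt above is fish, whose `$status` variable is the equivalent of `$?` in POSIX shells; either way, a failed command is expected to leave a non-zero value there. A minimal sketch of that convention:

```shell
# `$?` (POSIX) / `$status` (fish) hold the previous command's exit code.
# `false` is a standard utility that always fails with exit code 1.
false
echo "exit code: $?"   # prints "exit code: 1"
```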
To Reproduce
Steps to reproduce the behavior:
- Set GOOSE_TEMPERATURE="0.3" with a model that only supports the default temperature (here, goose-gpt-5 via the databricks provider)
- Start a goose session and send any message
- The session ends with a 400 Bad Request error from the provider
- Run echo $status (or echo $? in a POSIX shell): the exit code is 0
Expected behavior
goose should exit with a non-zero status when a session ends with an error, so scripts and CI can detect the failure. The main issue is the silent failure combined with the wrong exit code.
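For illustration, a minimal shell sketch of the expected propagation. `simulate_session` is a hypothetical stand-in for a goose session that ends with an error; goose itself is not invoked:

```shell
#!/bin/sh
# Sketch of the expected behavior: a CLI that fails should return a
# non-zero exit status so callers can detect it. `simulate_session`
# is a hypothetical stand-in, not part of goose.
simulate_session() {
    echo "Session ended with error: Request failed with status: 400 Bad Request" >&2
    return 1
}

if simulate_session; then
    echo "session ok"
else
    # With correct propagation, $? is non-zero here (today goose reports 0).
    echo "session failed, exit code: $?"
fi
```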
Please provide the following information:
- OS & Arch: macOS (per the /Users/... session log path)
- Interface: CLI
- Version: [e.g. v1.0.2]
- Extensions enabled: [e.g. Computer Controller, Figma]
- Provider & Model: databricks - goose-gpt-5
Additional context
cc @atishpatel