Error thrown when trying to invoke the new OpenAI o1 model series: BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
@pollackscience Unfortunately, I'm not able to reproduce this, as I don't have the tier-5 usage account required to use the latest OpenAI o1 models.
I've searched through our codebase and don't see any mention of us setting a default value for the temperature parameter, so I'm not sure why temperature is being set to 0.7 in your environment. This could be an upstream API issue given that it is a very recent release.
For now, can you try using the --model-parameters argument to explicitly set the temperature to 1?
@dlqqq Explicitly setting the temperature to 1 does work! Thank you for your quick response. One item to note:
the input must be {"temperature":1} (no space between ":" and "1"), or else it throws a json.loads error.
Not sure if you want to mark this issue as closed, but it's a simple and effective workaround. Thanks!
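For reference, the working invocation looks roughly like the following (a sketch based on the comments above; it assumes the same o1-preview model ID and writes the JSON without spaces so it reaches json.loads intact):

%%ai openai-chat:o1-preview --model-parameters {"temperature":1}
What is 5+8?

If AiMagics.default_language_model is already set as in the Reproduce section, the model ID can presumably be omitted from the magic line.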
Description
Error thrown when trying to invoke the new OpenAI o1 model series:
BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
Reproduce
%config AiMagics.default_language_model = "openai-chat:o1-preview"
%%ai
What is 5+8?
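For context, the same 400 error can be reproduced directly against the OpenAI API, outside of Jupyter AI; here is a minimal sketch (assumes the openai Python package and an OPENAI_API_KEY in the environment):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# o1-preview rejects any temperature other than the default of 1,
# so this call raises BadRequestError (HTTP 400, code 'unsupported_value')
client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": "What is 5+8?"}],
    temperature=0.7,
)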
Expected behavior
Ideally, this model could be run without any additional modification to configs or core package code.
Context