ChatOpenAI with o3-mini fails when pulling a prompt with a model from LangSmith using pull
#7648
Comments
Hey @nikhilshinday! I'm here to help you with any bugs, questions, or contributions you have in mind. Let me know how I can assist you!

The error you're encountering is due to the `max_tokens` parameter, which o3-mini does not accept. To resolve this issue, you should modify the code to use `max_completion_tokens` instead.

To continue talking to Dosu, mention @dosu.
@dosu there is no way to specify max_completion_tokens in the LC hub interface; the only variable you can specify is `max_tokens`.
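For background, the split the thread is describing can be sketched with a small, self-contained helper. This is illustrative only: the helper name and the prefix list are assumptions for the sketch, not an official LangChain or OpenAI API.

```typescript
// Sketch: OpenAI's reasoning models (o1, o3-mini, ...) reject the legacy
// `max_tokens` field and require `max_completion_tokens` instead.
// The prefix list below is an assumption for illustration, not an
// authoritative list of affected models.
const REASONING_MODEL_PREFIXES = ["o1", "o3"];

type TokenLimitKwargs = {
  max_tokens?: number;
  max_completion_tokens?: number;
};

function buildTokenLimitKwargs(model: string, limit: number): TokenLimitKwargs {
  const isReasoningModel = REASONING_MODEL_PREFIXES.some(
    (p) => model === p || model.startsWith(`${p}-`)
  );
  // Reasoning models get `max_completion_tokens`; older chat models
  // keep the legacy `max_tokens` field.
  return isReasoningModel
    ? { max_completion_tokens: limit }
    : { max_tokens: limit };
}

console.log(buildTokenLimitKwargs("o3-mini", 100));
// -> { max_completion_tokens: 100 }
console.log(buildTokenLimitKwargs("gpt-4o", 100));
// -> { max_tokens: 100 }
```

A prompt saved in the hub against an older chat model bakes in the `max_tokens` branch, which is what o3-mini then rejects.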
To configure the LC hub interface to use `max_completion_tokens`, here's an example of how you can instantiate the ChatCerebras model:

import { ChatCerebras } from '@langchain/cerebras';

const llm = new ChatCerebras({
  model: "o3-mini",
  maxCompletionTokens: 100, // Set your desired max completion tokens here
  temperature: 0.7,
  // other params...
});

This will configure the model to use `max_completion_tokens` when generating completions.
Oof, thanks for flagging - will coordinate with @madams0013 and fix.
@jacoblee93 and @madams0013, just following up here - assuming that this won't be updated in the SDK/platform for a while, is there a workaround for the meantime?
Hey sorry for losing track of this, will dig in today!
Just pushed
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
The ChatOpenAI model does not support o3-mini when pulled from the LangChain hub because the model spec sends parameters (such as `max_tokens`) that are not supported by the stated model.
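Until the platform fix lands, one interim approach is to rewrite the pulled model kwargs before they reach the model. This is a sketch only: `sanitizeModelKwargs` is a hypothetical helper, not part of the LangChain SDK, and the kwargs shape is an assumption.

```typescript
// Hypothetical helper (not part of the LangChain SDK): remap the legacy
// `max_tokens` field, which o-series models reject, to
// `max_completion_tokens` before the kwargs are handed to the model.
function sanitizeModelKwargs(
  kwargs: Record<string, unknown>
): Record<string, unknown> {
  const { max_tokens, ...rest } = kwargs;
  if (typeof max_tokens === "number") {
    // Remap the unsupported field to the one o-series models accept.
    return { ...rest, max_completion_tokens: max_tokens };
  }
  return rest;
}

console.log(sanitizeModelKwargs({ max_tokens: 256, temperature: 1 }));
// -> { temperature: 1, max_completion_tokens: 256 }
```

The idea is to apply this to the kwargs of the pulled prompt's model before invoking it, leaving all other saved settings untouched.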
System Info