fix(community): AILanguageModel interface #7025
Conversation
prompt(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): Promise<string>;
promptStreaming(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): ReadableStream;
countPromptTokens(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): Promise<number>;
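For context, here is a hedged TypeScript sketch of the interface shape under discussion, following the method signatures quoted above from the prompt-api spec. The type aliases and the stub implementation (including its whitespace-based token count) are illustrative assumptions for this sketch, not the actual browser implementation.

```typescript
// Sketch of the AILanguageModel surface; exact spec shapes may change.
type AILanguageModelPromptInput = string;

interface AILanguageModelPromptOptions {
  signal?: AbortSignal;
}

interface AILanguageModel {
  prompt(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): Promise<string>;
  promptStreaming(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): ReadableStream;
  countPromptTokens(input: AILanguageModelPromptInput, options?: AILanguageModelPromptOptions): Promise<number>;
}

// Minimal stub conforming to the interface, for illustration only.
const stub: AILanguageModel = {
  async prompt(input) {
    return `echo: ${input}`;
  },
  promptStreaming(input) {
    // Emit the whole input as a single chunk, then close.
    return new ReadableStream({
      start(controller) {
        controller.enqueue(input);
        controller.close();
      },
    });
  },
  async countPromptTokens(input) {
    // Crude whitespace tokenization, purely as a placeholder.
    return input.trim().split(/\s+/).length;
  },
};
```

A conforming stub like this is also how downstream code can be unit-tested without a browser that ships the API.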
I have updated the interface to the latest definition; we may implement these if it makes sense.
Given this will likely change again, is it worth adding all of this if we're not using it? I would prefer to keep it minimal; the types will likely come from elsewhere eventually.
Eh, it's experimental I suppose, so it's fine.
Thank you!
Co-authored-by: jacoblee93 <jacoblee93@gmail.com>
Using latest interface definitions from prompt-api spec.