
feat (provider/togetherai): Add TogetherAI provider. #3781

Draft · wants to merge 13 commits into base: main

Conversation

@shaper (Contributor) commented Nov 19, 2024

Built on top of a new openai-compatible package for better code sharing as we add more top-level providers that follow this pattern.

Comment on lines 18 to 19
// TODO(shaper): This is really model-specific, move to config or elsewhere?
defaultObjectGenerationMode?: 'json' | 'tool' | undefined;
@lgrammel (Collaborator) commented Nov 20, 2024

Can you make it a constructor parameter (in the options) for the model, defined in the providers (which would have that knowledge)?

@shaper (Contributor, Author)

By options, do you mean config? E.g. the OpenAICompatibleChatLanguageModel constructor signature is (modelId, settings, config).

I'll move it there and see how to expose it to the concrete provider implementation, which today just passes a settings object (where I had it when you commented); we could also add an options or model-config argument.
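The move being discussed could look roughly like the hedged sketch below. The class and field names (OpenAICompatibleChatLanguageModel, OpenAICompatibleChatConfig, defaultObjectGenerationMode) follow the PR, but the exact shapes and the example model id are assumptions, not the PR's final code.

```typescript
// Hedged sketch: moving defaultObjectGenerationMode out of per-call settings
// and into the config argument, so each concrete provider supplies it with
// its model knowledge. Shapes here are assumptions.
type ObjectGenerationMode = 'json' | 'tool' | undefined;

interface OpenAICompatibleChatConfig {
  provider: string;
  defaultObjectGenerationMode?: ObjectGenerationMode;
}

class OpenAICompatibleChatLanguageModel {
  constructor(
    readonly modelId: string,
    readonly settings: Record<string, unknown>,
    private readonly config: OpenAICompatibleChatConfig,
  ) {}

  // The model exposes whatever mode its provider configured for it.
  get defaultObjectGenerationMode(): ObjectGenerationMode {
    return this.config.defaultObjectGenerationMode;
  }
}

// A TogetherAI-style provider (hypothetical model id) sets the mode it
// knows its models support:
const model = new OpenAICompatibleChatLanguageModel(
  'meta-llama/Llama-3-70b-chat-hf',
  {},
  { provider: 'togetherai.chat', defaultObjectGenerationMode: 'tool' },
);
```

This keeps the openai-compatible layer model-agnostic while each provider package encodes what its models actually support.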

Comment on lines 148 to 151
// TODO(shaper): Review vs. OpenAI impl here.
throw new UnsupportedFunctionalityError({
functionality: 'object-json mode',
});
@lgrammel (Collaborator)

Copy what OpenAI has, including the schemas.

@shaper (Contributor, Author)

Done. I omitted structuredOutputs since, AFAIK, it's only supported by OpenAI today.
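For contrast, the two request shapes involved could be sketched as below: plain JSON mode sends response_format { type: 'json_object' }, while OpenAI's structured outputs additionally send a JSON schema with strict: true. The response_format field names follow OpenAI's chat API; the helper function itself is hypothetical, not code from the PR.

```typescript
// Hypothetical helper contrasting plain JSON mode (kept in the
// openai-compatible layer) with OpenAI-only structured outputs (omitted).
type ResponseFormat =
  | { type: 'json_object' }
  | {
      type: 'json_schema';
      json_schema: { name: string; schema: object; strict: boolean };
    };

function responseFormatFor(
  structuredOutputs: boolean,
  schema?: object,
): ResponseFormat {
  if (structuredOutputs && schema != null) {
    // OpenAI-specific: schema-constrained generation.
    return {
      type: 'json_schema',
      json_schema: { name: 'response', schema, strict: true },
    };
  }
  // Widely supported JSON mode; this is what the compatible layer keeps.
  return { type: 'json_object' };
}
```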

constructor(
modelId: OpenAICompatibleChatModelId,
settings: OpenAICompatibleChatSettings,
config: OpenAICompatibleChatConfig,
@lgrammel (Collaborator)

Include the default object generation mode in the config.

@shaper (Contributor, Author) commented Nov 21, 2024

Done. I'd like to define a type for the values 'json' | 'tool' | undefined. It appears the first reference is in LanguageModelV1; is it appropriate to add something there?

Comment on lines 4 to 25
// TODO(shaper): Reconcile this with openai-error.ts. We derived from `xai`.

export const openAICompatibleErrorDataSchema = z.object({
code: z.string(),
error: z.string(),
});

export type OpenAICompatibleErrorData = z.infer<
typeof openAICompatibleErrorDataSchema
>;

export const openAICompatibleFailedResponseHandler =
createJsonErrorResponseHandler({
errorSchema: openAICompatibleErrorDataSchema,
errorToMessage: data => data.error,
});
@lgrammel (Collaborator)

Make this provider-specific; different providers have different error structures. Have a default that matches OpenAI.

@shaper (Contributor, Author)

I can see a couple of different ways to do this. The error data schema is further embedded in the schemas used by the chat/completion model classes, so making only the failed response handler configurable isn't sufficient for the goal.

I have in mind to break this PR into two so that we can start landing parts and work more incrementally:

  1. initial openai-compatible package work
  2. togetherai package atop it

I expect we can land (1) tomorrow after I iterate on your next feedback. We can then put a focused change together to make the error schema/response handler configurable, and iterate on (2) until it supports the model/feature combos we feel are good enough for a first ship target.
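The configurability being planned could look roughly like this hedged sketch. Only errorToMessage appears in the PR's code; the other names are hypothetical, and zod parsing is replaced with plain casts to keep the sketch self-contained.

```typescript
// Hedged sketch: a per-provider error handler config with a default that
// matches OpenAI's { error: { message } } envelope. Names other than
// errorToMessage are assumptions.
interface ErrorHandlerConfig<T> {
  parseError: (body: unknown) => T;
  errorToMessage: (data: T) => string;
}

// Default matching OpenAI's error envelope.
const openAIDefault: ErrorHandlerConfig<{ error: { message: string } }> = {
  parseError: body => body as { error: { message: string } },
  errorToMessage: data => data.error.message,
};

// A provider with a flat { code, error } shape (as in the xai-derived
// schema above) overrides both pieces:
const flatProvider: ErrorHandlerConfig<{ code: string; error: string }> = {
  parseError: body => body as { code: string; error: string },
  errorToMessage: data => data.error,
};

// Shared failure path: parse with the provider's schema, then extract
// the message with the provider's extractor.
function messageFor<T>(cfg: ErrorHandlerConfig<T>, body: unknown): string {
  return cfg.errorToMessage(cfg.parseError(body));
}
```

Bundling parse and extract together lets the chat/completion model classes take one handler object per provider instead of hard-coding the error schema.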

New dependencies detected (Socket for GitHub):

Package: npm/@ai-sdk/provider-utils@2.0.0 · New capabilities: environment, network · Transitives: +2 · Size: 4.41 MB · Publisher: vercel-release-bot
