Add together.ai endpoint support #570
Comments
IMO it's unclear if this should be put inside of
Maybe the correct way to do this is to refactor
I think the
I am pretty sure the Together REST API is OpenAI-compatible. I'm currently using it with Hugging Face chat-ui.
It needs to be OAI REST compatible with function calling support. Technically, we already support the OAI-compatible Together API once #619 is merged, and it can be toggled on for Together (or on by default). If Together supported the deprecated functions style instead of the tools style in the full OAI spec, we would already support Together even without #619. Last time I checked, a few weeks ago, not all of their models had function calling support, and the performance was worse than using raw completions plus our own function calling via prompt formatting and return parsing (this may have changed since). However, it's probably worth supporting both anyway and letting the user choose.
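For reference, a minimal sketch of what a tools-style request against an OpenAI-compatible endpoint looks like. This is illustrative only, not the project's actual integration; the base URL, model id, and the `send_message` tool are assumptions for the example.

```python
# Sketch: tools-style function calling via an OpenAI-compatible endpoint.
# The base_url and model below are assumed values, not verified defaults.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.together.xyz/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_TOGETHER_API_KEY",
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "send_message",  # hypothetical tool for illustration
            "description": "Send a message to the user.",
            "parameters": {
                "type": "object",
                "properties": {"message": {"type": "string"}},
                "required": ["message"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed model id
    messages=[{"role": "user", "content": "Say hello."}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
```

Whether a given Together-hosted model actually honors the `tools` field is exactly the open question above, so this path may need to be gated per model.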
Should be very similar to vLLM support - https://docs.together.ai/docs/inference-rest
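For the vLLM-like path, a rough sketch of the "raw completions + own function calling via prompt formatting" approach, assuming Together's inference REST endpoint accepts a model/prompt payload as described in the linked docs. The endpoint path, payload fields, model id, and prompt template are assumptions for illustration.

```python
# Sketch: raw completion request where the client does its own prompt
# formatting and return parsing, similar to the existing vLLM backend.
import os
import requests

resp = requests.post(
    "https://api.together.xyz/v1/completions",  # assumed endpoint path
    headers={"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"},
    json={
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed model id
        "prompt": "<s>[INST] Say hello. [/INST]",  # client-side prompt formatting
        "max_tokens": 128,
        "stop": ["</s>"],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```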