
Add together.ai endpoint support #570

Closed
sarahwooders opened this issue Dec 4, 2023 · 5 comments
Comments

@sarahwooders
Collaborator

Should be very similar to vLLM support - https://docs.together.ai/docs/inference-rest
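To make the comparison with the existing vLLM-style integration concrete, here is a minimal sketch of what a raw Together `/inference` request could look like. The endpoint URL and payload field names are assumptions modeled on the linked docs, not verified against the current API:

```python
import os

# Assumed endpoint; see https://docs.together.ai/docs/inference-rest
TOGETHER_INFERENCE_URL = "https://api.together.xyz/inference"

def build_inference_request(model: str, prompt: str, max_tokens: int = 512):
    """Build the URL, headers, and JSON payload for a Together /inference call.

    Field names ("model", "prompt", "max_tokens") are assumptions based on
    the docs linked above.
    """
    headers = {
        "Authorization": f"Bearer {os.environ.get('TOGETHER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
    }
    return TOGETHER_INFERENCE_URL, headers, payload

# To actually send it (requires a key and network access):
# import requests
# url, headers, payload = build_inference_request(
#     "togethercomputer/llama-2-70b-chat", "Hello")
# resp = requests.post(url, headers=headers, json=payload).json()
```

Like the vLLM integration, this is just an authenticated POST with a custom JSON body rather than an OpenAI-shaped request.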

@cpacker
Collaborator

cpacker commented Dec 9, 2023

IMO it's unclear whether this should go inside `local_llm` or `openai_tools.py`:

  • `openai_tools.py`:
    • This is where all the "paid hosted" endpoints are currently configured
    • AFAIK together.ai has no self-host option, so it fits better alongside OpenAI + Azure
    • You need a Together API key, so it's naturally just another credential in the credentials file
  • `local_llm`:
    • Together is not OpenAI compatible, and the custom REST integrations we already have (e.g. vLLM) look similar to what we'd add for Together's /inference endpoint

Maybe the correct way to do this is to refactor `local_llm` to be called `llm_backends`, and move openai and azure into separate files inside of it (though this seems like a lot of refactoring just to add one API connector).

@sarahwooders
Collaborator Author

I think the local_llm approach you have in #596 makes more sense given it's not OpenAI compatible (is there some other REST API standard...?)

@gururise

gururise commented Jan 5, 2024

I am pretty sure the Together REST API is OpenAI compatible. I'm currently using it on Hugging Face chat-ui.

@cpacker
Collaborator

cpacker commented Jan 5, 2024

> I am pretty sure the Together REST API is OpenAI compatible. I'm currently using it on Hugging Face chat-ui.

It needs to be OpenAI-REST compatible *with function calling support*. Technically, once #619 is merged we already support an OpenAI-compatible Together API, and it can be toggled on for Together (or on by default). If Together supported the deprecated `functions` style instead of the `tools` style from the full OpenAI spec, we would already support Together even without #619. Last time I checked, a few weeks ago, not all of their models had function calling support, and the performance was worse than using raw completions plus our own function calling via prompt formatting / return parsing (this may have changed since). However, it's probably worth supporting both anyway and letting the user choose.
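The distinction above is between two request shapes in the OpenAI chat completions spec. A sketch of both, with an illustrative model name and a hypothetical `send_message` function schema (neither taken from the codebase):

```python
# A JSON-schema description of one callable function (hypothetical example).
function_schema = {
    "name": "send_message",
    "description": "Send a message to the user.",
    "parameters": {
        "type": "object",
        "properties": {"message": {"type": "string"}},
        "required": ["message"],
    },
}

# Deprecated `functions` style: schemas passed directly in a top-level list.
legacy_request = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # illustrative
    "messages": [{"role": "user", "content": "hi"}],
    "functions": [function_schema],
}

# Newer `tools` style from the full OpenAI spec: each schema is wrapped in a
# {"type": "function", "function": ...} object, with an optional tool_choice.
tools_request = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # illustrative
    "messages": [{"role": "user", "content": "hi"}],
    "tools": [{"type": "function", "function": function_schema}],
    "tool_choice": "auto",
}
```

A provider that only speaks the newer `tools` shape needs the #619-style support; one that still accepts the legacy `functions` shape would work with the pre-existing code path.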

@mattzh72
Collaborator

#2045

Labels
None yet
Projects
None yet
4 participants