
Add support for Azure, OpenAI, Palm, Anthropic, Cohere, Replicate Models - using litellm #7

Closed
ishaan-jaff wants to merge 1 commit

Conversation

ishaan-jaff

This PR adds support for models from all the providers mentioned above using litellm: https://github.com/BerriAI/litellm

TL;DR: langstream gets access to Azure, OpenAI, Palm, Anthropic, Cohere, and Replicate models through a single `completion()` interface.

Here's a sample of how it's used:

```python
import os

from litellm import completion

# Set provider API keys as environment variables.
# They can also be set in a .env file; see .env.example.
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# OpenAI call
response = completion(model="gpt-3.5-turbo", messages=messages)

# Cohere call
response = completion(model="command-nightly", messages=messages)

# Anthropic call
response = completion(model="claude-instant-1", messages=messages)
```

@ishaan-jaff
Author

addressing #5

@ishaan-jaff
Author

@rogeriochaves can you please take a look at this PR? If this initial commit looks good, I'm happy to add more docs/tests too 😊

@rogeriochaves
Owner

hey! Sorry for the delay, gotta fix my notifications. Great work! Thanks for the contribution, will test it more carefully and merge in a bit. Would you mind writing docs for it as well?

@rogeriochaves
Owner

rogeriochaves commented Aug 23, 2023

I got an error when running the integration tests:

TypeError: completion() got an unexpected keyword argument 'timeout'

seems like the API is not 100% compatible with OpenAI's. Given that, I don't want to merge this under the OpenAI module, especially because the APIs can diverge at any time. However, I do like the idea of supporting LiteLLM
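One way a wrapper could tolerate this kind of API drift (the `timeout` keyword that litellm's `completion()` did not accept at the time) is to filter keyword arguments down to those the target function actually supports before calling it. This is a hedged sketch, not code from the PR; `call_with_supported_kwargs` and `fake_completion` are hypothetical names, and `fake_completion` is a stand-in so the example runs without litellm installed:

```python
import inspect

def call_with_supported_kwargs(fn, *args, **kwargs):
    """Call fn, silently dropping keyword arguments it does not accept."""
    params = inspect.signature(fn).parameters
    # If fn takes **kwargs, everything is supported; pass it all through.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return fn(*args, **kwargs)
    supported = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **supported)

# Stand-in for litellm.completion, which (per the error above)
# did not accept a `timeout` keyword argument at the time:
def fake_completion(model, messages):
    return {"model": model, "n_messages": len(messages)}

result = call_with_supported_kwargs(
    fake_completion,
    "gpt-3.5-turbo",
    [{"role": "user", "content": "Hello, how are you?"}],
    timeout=5,  # dropped instead of raising TypeError
)
```

The trade-off is that silently dropping arguments hides behavior differences (a dropped `timeout` means no timeout is enforced), which is part of why keeping LiteLLM in its own module, as discussed below, is the safer design.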

could you, instead of replacing the OpenAI call, copy and paste it into a separate module for LiteLLM? I don't mind the code duplication, but I do mind users not being able to trust that the OpenAI module really uses the official OpenAI client under the hood, with no further wrappers. With the right wording in the documentation, though, we can point users to this as a great way to use Azure, OpenAI, Palm, Anthropic, Cohere, Replicate models etc. all at the same time, and I really like that concept

if you don't have time to do it, I can pick it up later when I have time, but it might take a while

@rogeriochaves
Owner

@ishaan-jaff I've implemented it in #10 — take a look and let me know what you think, and please add improvements if you can think of any. Closing this one
