Add Azure OpenAI support #77
+1.
OpenAI doesn't use API requests for AI training, but Azure OpenAI could nevertheless help bring down latency.
I submitted a pull request to add Azure OpenAI support: #96
Hi @Delatorrea @shikelong @wwydmanski, I believe we can help with this issue. I'm the maintainer of LiteLLM (https://github.com/BerriAI/litellm). TL;DR: you can use LiteLLM in the following ways.

With your own API key: this calls the provider API directly.

from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key" #
os.environ["COHERE_API_KEY"] = "your-key" #
messages = [{ "content": "Hello, how are you?","role": "user"}]
# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)
# cohere call
response = completion(model="command-nightly", messages=messages) Using the LiteLLM Proxy with a LiteLLM Keythis is great if you don’t have access to claude but want to use the open source LiteLLM proxy to access claude from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key
messages = [{ "content": "Hello, how are you?","role": "user"}]
# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)
# cohere call
response = completion(model="command-nightly", messages=messages) |
Support for Azure OpenAI could be added. With this feature, request data is not used for AI training, which makes it ideal for use at work and avoids information leakage.
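For comparison, a minimal sketch of what a direct Azure OpenAI chat call looks like with the openai Python package (pre-1.0 api_type = "azure" style); the endpoint, API version, and deployment name are placeholders and would come from your own Azure resource.

import os
import openai

# Azure-specific client configuration (placeholder values)
openai.api_type = "azure"
openai.api_base = "https://your-resource.openai.azure.com/"
openai.api_version = "2023-05-15"
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]

messages = [{"content": "Hello, how are you?", "role": "user"}]

# Azure addresses the model by deployment name via the `engine` argument
response = openai.ChatCompletion.create(engine="your-deployment-name", messages=messages)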