
[Feat] Set OpenAI organization for litellm.completion, Proxy Config #1689

Merged
merged 7 commits from litellm_set_organization_on_config.yaml into main
Jan 30, 2024

Conversation

@ishaan-jaff (Contributor) commented on Jan 30, 2024

Set OpenAI Organization for litellm.completion

Usage - Proxy Config.yaml

model_list:
  - model_name: openai-gpt-3.5
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: sk-123
      organization: org-ikDc4ex8NB
      temperature: 0.2
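
A request to the running proxy then targets the configured model_name. The snippet below is a minimal sketch, not part of the PR itself: it assumes the proxy was started with this config (for example, litellm --config config.yaml) and uses the OpenAI Python SDK as the client; the base URL, port, and client API key are placeholder assumptions to adjust for your deployment.

from openai import OpenAI

# Point the OpenAI client at the LiteLLM proxy (assumed local address and key).
client = OpenAI(base_url="http://0.0.0.0:8000", api_key="sk-anything")

response = client.chat.completions.create(
    model="openai-gpt-3.5",  # model_name from the proxy config above
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)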

Usage - litellm.completion

import litellm

messages = [{"role": "user", "content": "Hello"}]
response = litellm.completion(
    model="gpt-3.5-turbo", messages=messages, organization="org-ikDc4ex8NB"
)


@ishaan-jaff merged commit dd9c788 into main on Jan 30, 2024
4 of 8 checks passed
@ishaan-jaff deleted the litellm_set_organization_on_config.yaml branch on January 30, 2024, 19:47
Development

Successfully merging this pull request may close these issues.

[Feature]: Allow users to set organization for OpenAI on Proxy Config.yaml