
Enable setting default model value for LiteLLM, Chat, Completions #985

Merged
3 commits merged into BerriAI:main on Dec 9, 2023

Conversation

estill01 (Contributor) commented on Dec 3, 2023

Add a `model` arg to the `Completions` class; if you provide a value, it will be used when you create new completions from an instance of the class.
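
A minimal usage sketch of the behavior described above. The `Completions` class and the `model` argument come from this PR; the exact constructor and `create()` signature shown here are assumptions for illustration, not the confirmed interface:

```python
import litellm

# Assumption: the Completions class accepts a default `model` at construction
# time (the argument this PR adds), and calls made from that instance fall
# back to it when no per-call model is supplied.
client = litellm.Completions(model="gpt-3.5-turbo")

# No `model` passed here, so the instance-level default is used.
response = client.create(
    messages=[{"role": "user", "content": "Hello from the default model!"}],
)
print(response)
```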
Review thread on litellm/main.py (outdated, resolved)
estill01 closed this on Dec 3, 2023
estill01 reopened this on Dec 3, 2023
estill01 changed the title from "Enable setting default model value for Completions" to "Enable setting default model value for LiteLLM, Chat, Completions" on Dec 3, 2023
krrishdholakia (Contributor) commented:

lgtm!

krrishdholakia merged commit cc4a1d2 into BerriAI:main on Dec 9, 2023
2 checks passed
estill01 deleted the patch-1 branch on Dec 19, 2023