feat: add aphrodite support #1153
base: main
Conversation
@AlpinDale isn't it already OpenAI-compatible? Meaning you could just call it like this: https://docs.litellm.ai/docs/providers/openai_compatible
@AlpinDale bump on this?
@krrishdholakia hi, sorry for the late reply. I'd assume the LiteLLM OpenAI endpoint doesn't support any samplers beyond what OpenAI itself provides. Is that true? If not, I suppose we can use it as-is.
Yea - we send across any unmapped kwargs straight to the provider - https://docs.litellm.ai/docs/completion/input#provider-specific-params. You can test this out by doing:

```python
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "your-api-key"

# openai call
response = completion(
    model="openai/<your-model-name>",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="your-api-base",
)
```

Let me know if this solves your problem!
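For the Aphrodite case specifically, a minimal sketch of that passthrough might look like the following. The `api_base` URL, the placeholder model name, and the `min_p` sampler kwarg are assumptions for illustration (Aphrodite's OpenAI-compatible server and its extra samplers are documented in its own repo), not values confirmed or tested in this thread:

```python
from litellm import completion

# Aphrodite Engine exposes an OpenAI-compatible server, so the "openai/"
# prefix routes the call through LiteLLM's OpenAI-compatible path.
response = completion(
    model="openai/<your-model-name>",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:2242/v1",  # assumed local Aphrodite endpoint
    api_key="not-needed",  # local servers often ignore the key
    min_p=0.05,  # unmapped kwarg, forwarded straight to the provider
)
print(response.choices[0].message.content)
```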
I don't mind adding something more specific for Aphrodite, but if you have no hosted endpoint, I won't be able to add this to our CI/CD pipeline for testing.
This PR adds support for Aphrodite Engine. Marked WIP, as it is currently untested.