feat(providers): sambanova updated to use LiteLLM openai-compat #1596
Conversation
lgtm
You have listed vision models in the supported models for sambanova also. Can your test plan show the outputs for executing the entire integration suite (pytest -s -v tests/integration) please?
Hi @ashwinb. I'll leave one comment for each test plan here; 3 tests are failing.
Finally, remote::sambanova doesn't support embedding models yet.
The issue was solved in PR 1150.
cc @snova-luiss
Does it support structured outputs with other schemas? When do you expect this feature to be available?
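For context, structured output with an arbitrary schema is usually expressed in OpenAI-compatible APIs through the response_format field with a json_schema payload. The sketch below only builds such a request body; whether the sambanova adapter accepts arbitrary schemas is exactly the open question in this comment, and the model name and schema are illustrative.

```python
# Illustrative request body for structured output via an OpenAI-compatible
# endpoint. The schema and model name are examples, not adapter-confirmed values.
request_body = {
    "model": "sambanova/Meta-Llama-3.3-70B-Instruct",
    "messages": [{"role": "user", "content": "Give me a name and an age as JSON."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "age": {"type": "integer"},
                },
                "required": ["name", "age"],
            },
        },
    },
}
```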
In that case, we should download the image in the sambanova adapter, encode it, and send it as base64 downstream. We do this in a couple of places (I believe maybe in the
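A minimal sketch of that download-and-encode step, using only the standard library. The function names and the data-URL format here are assumptions for illustration, not the adapter's actual helpers.

```python
# Hedged sketch: convert an image URL to a base64 data URL before sending it
# downstream, for backends that don't accept raw URLs. Names are illustrative.
import base64
import urllib.request


def to_data_url(image_bytes: bytes, mime_type: str = "image/jpeg") -> str:
    """Encode raw image bytes as a base64 data URL."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime_type};base64,{encoded}"


def download_and_encode(url: str) -> str:
    """Download an image and return it as a base64 data URL (hypothetical helper)."""
    with urllib.request.urlopen(url) as resp:
        mime = resp.headers.get_content_type() or "image/jpeg"
        return to_data_url(resp.read(), mime)
```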
Hi @ashwinb, could you please take a look?
@ehhuang This PR was already modifying sambanova to use LiteLLM, plus some extra modifications to add support for things like api_key and base-URL settings, URL images, etc. This was done because the current version was broken for several functionalities such as streaming and tool calling. Should I move these changes into the new folder you have created in remote providers (openai-compat), or can this be merged in parallel, given the template is still pointing to the previous folder?
@jhpiedrahitao Hey, sorry about letting this go stale. Could you resolve conflicts one last time? I will merge it after that.
Hi @ashwinb thanks, no problem, conflicts solved 👍🏻 |
Hi @ashwinb, were you able to take a look at this one?
Hi @ashwinb, tagging you again to check if this can be merged.
What does this PR do?
Switch the sambanova inference adapter to LiteLLM to simplify integration and solve issues in the current adapter with streaming and tool calling. Models and templates have been updated.
Test Plan
pytest -s -v tests/integration/inference/test_text_inference.py --stack-config=sambanova --text-model=sambanova/Meta-Llama-3.3-70B-Instruct
pytest -s -v tests/integration/inference/test_vision_inference.py --stack-config=sambanova --vision-model=sambanova/Llama-3.2-11B-Vision-Instruct
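The streaming and tool-calling paths this PR fixes would be exercised through LiteLLM roughly as follows. This is a sketch under assumptions: the tool definition is made up, only the keyword arguments are built here, and an actual call via litellm.completion(**kwargs) would additionally require a valid SambaNova API key.

```python
# Sketch of a streamed, tool-enabled LiteLLM-style call the adapter now
# delegates to. The tool and prompt are illustrative; no request is sent here.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

kwargs = {
    "model": "sambanova/Meta-Llama-3.3-70B-Instruct",
    "messages": [{"role": "user", "content": "What's the weather in Lima?"}],
    "tools": [weather_tool],
    "stream": True,
}
# litellm.completion(**kwargs) would then yield streamed chunks,
# possibly containing tool_calls deltas.
```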