
Improving the proxy docs for configuring with vllm #5184

Merged: 1 commit merged into BerriAI:main from fozziethebeat:fix-vllm-docs on Aug 14, 2024
Conversation

fozziethebeat (Author) opened this pull request:

Title

Improving the proxy docs for configuring with vllm

Relevant issues

Fixes #5183

Type

📖 Documentation

Changes

Small markdown changes correcting the vLLM configuration examples in the proxy docs; a sketch of the kind of config involved is below.
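For context, the docs being fixed describe how to point the LiteLLM proxy at a self-hosted vLLM server. The sketch below is a minimal illustration of that kind of `config.yaml`, assuming a vLLM server exposing its OpenAI-compatible API on `localhost:8000`; the model name, alias, URL, and key are placeholders, not the exact values from the docs:

```yaml
# config.yaml -- minimal sketch, not this PR's actual diff
model_list:
  - model_name: my-vllm-model              # alias that clients request through the proxy
    litellm_params:
      model: openai/facebook/opt-125m      # openai/ prefix routes to an OpenAI-compatible backend
      api_base: http://localhost:8000/v1   # vLLM's OpenAI-compatible server (placeholder URL)
      api_key: fake-key                    # placeholder; a local vLLM server usually ignores it

# start the proxy with: litellm --config config.yaml
```

Any OpenAI-compatible client can then point at the proxy and request the `my-vllm-model` alias.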

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

Skipped, since these are documentation-only changes.

vercel bot commented Aug 13, 2024:

litellm deployment: ✅ Ready, updated Aug 13, 2024 11:12pm (UTC)

@ishaan-jaff (Contributor) left a comment:


lgtm

@ishaan-jaff merged commit b8e5b25 into BerriAI:main on Aug 14, 2024
2 checks passed
@fozziethebeat deleted the fix-vllm-docs branch on August 14, 2024 at 03:23
@ishaan-jaff (Contributor) commented:

Hi @fozziethebeat, curious: do you use the LiteLLM proxy in production today?

@fozziethebeat (Author) replied:

We actually, somewhat painfully, wrote our own shim that did something vaguely similar for a while, but we turned it off because the two models we were running didn't get much usage.

The next time I have that problem, however, I'd probably pick the LiteLLM proxy, now that I know it works so smoothly.

Development

Successfully merging this pull request may close these issues.

[Bug]: Proxy config docs for vLLM are slightly wrong (#5183)