v1.44.25
4 commits to 3e34edcff3f22db14239b7a87a9501dd237036a7 since this release
What's Changed
- Bump send and express in /docs/my-website by @dependabot in #5626
- Bump serve-static and express in /docs/my-website by @dependabot in #5628
- Bump body-parser and express in /docs/my-website by @dependabot in #5629
- docs: update ai21 docs by @miri-bar in #5631
- Add gemini 1.5 flash exp 0827 by @BabyChouSr in #5636
- LiteLLM Minor Fixes and Improvements (09/10/2024) by @krrishdholakia in #5618
- Add the option to specify a schema via env variable by @steffen-sbt in #5640
- [Langsmith Perf Improvement] Use /batch for Langsmith Logging by @ishaan-jaff in #5638
- [Fix-Perf] OTEL use sensible default values for logging by @ishaan-jaff in #5642
- [Feat] Add Load Testing for Langsmith, and OTEL logging by @ishaan-jaff in #5646
New Contributors
- @miri-bar made their first contribution in #5631
- @BabyChouSr made their first contribution in #5636
- @steffen-sbt made their first contribution in #5640
Full Changelog: v1.44.24...v1.44.25
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.44.25
```
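Once the container is up, the proxy listens on port 4000 and accepts OpenAI-compatible chat requests. A minimal sketch of building such a request payload (the model name `gpt-3.5-turbo` and the localhost URL are illustrative assumptions, not part of this release):

```python
import json

# Endpoint exposed by the proxy started above (assumed local deployment).
url = "http://localhost:4000/chat/completions"

# OpenAI-compatible request body; the model name is illustrative and must
# match a model configured on your proxy.
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello, world"}],
}

body = json.dumps(payload)
print(body)
# Send with any HTTP client, e.g.:
#   curl -X POST <url> -H "Content-Type: application/json" -d '<body>'
```

This only constructs the payload; pair it with your preferred HTTP client (curl, `requests`, or the OpenAI SDK pointed at the proxy's base URL) to issue the call.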
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 150.0 | 192.58 | 6.27 | 0.0 | 1876 | 0 | 119.97 | 3057.24 |
| Aggregated | Passed ✅ | 150.0 | 192.58 | 6.27 | 0.0 | 1876 | 0 | 119.97 | 3057.24 |