v1.43.7
What's Changed
- [Refactor+Testing] Refactor Prometheus metrics to use CustomLogger class + add testing for prometheus by @ishaan-jaff in #5149
- fix(main.py): safely fail stream_chunk_builder calls by @krrishdholakia in #5151
- Feat - track response latency on prometheus by @ishaan-jaff in #5152
- Feat - Proxy track fallback metrics on prometheus by @ishaan-jaff in #5153
Full Changelog: v1.43.6...v1.43.7
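
Several of the changes above add Prometheus tracking (metrics refactor, response latency, fallback metrics). As a quick sanity check, the sketch below scrapes the proxy's `/metrics` endpoint once the proxy is running; `localhost:4000` and the `sk-1234` key are assumptions, so substitute your own host and master key:

```shell
# Scrape the proxy's Prometheus endpoint and filter for LiteLLM metrics.
# Host, port, and bearer token below are placeholders for your deployment.
curl -s http://localhost:4000/metrics \
  -H "Authorization: Bearer sk-1234" | grep litellm
```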
Docker Run LiteLLM Proxy
```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.43.7
```
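
Once the container is up, you can send a test request through the proxy. This is a minimal sketch: the model name and the `sk-1234` key are placeholders, so use the model alias and master key from your own config:

```shell
# Send a test chat completion through the running proxy.
# "gpt-3.5-turbo" and the bearer token are placeholders -- match your setup.
curl -s http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```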
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 82 | 93.54 | 6.53 | 0.0 | 1953 | 0 | 71.27 | 701.70 |
| Aggregated | Passed ✅ | 82 | 93.54 | 6.53 | 0.0 | 1953 | 0 | 71.27 | 701.70 |