v1.55.0.dev2
Release date: 2024-12-13 11:00:59
What's Changed
- (feat) add `response_time` to StandardLoggingPayload - logged on `datadog`, `gcs_bucket`, `s3_bucket` etc by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7199
- build(deps): bump nanoid from 3.3.7 to 3.3.8 in /ui by @dependabot in https://github.com/BerriAI/litellm/pull/7198
- (Feat) DataDog Logger - Add `HOSTNAME` and `POD_NAME` to DataDog logs by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7189
- (feat) add `error_code`, `error_class`, `llm_provider` to `StandardLoggingPayload` by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7200 (see the payload sketch after this list)
- (docs) Document StandardLoggingPayload Spec by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7201
- fix: Support WebP image format and avoid token calculation error by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7182
- (feat) UI - Disable Usage Tab once SpendLogs is 1M+ Rows by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/7208
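The two `StandardLoggingPayload` PRs above add timing and error metadata to every logged request. Here is a minimal sketch of what a logged payload might carry: only `response_time`, `error_code`, `error_class`, and `llm_provider` are confirmed by the PRs above; every other key and value is a hypothetical placeholder (see the spec documented in #7201 for the real schema):

```python
# Illustrative sketch only -- not the exact StandardLoggingPayload schema.
# response_time (PR #7199) and error_code / error_class / llm_provider
# (PR #7200) are the fields added in this release; the remaining keys
# are hypothetical placeholders shown for context.
example_payload = {
    "id": "chatcmpl-123",             # hypothetical request id
    "call_type": "acompletion",       # hypothetical
    "status": "failure",              # hypothetical
    "response_time": 0.42,            # new: end-to-end latency (unit assumed: seconds)
    "error_code": "429",              # new: provider error code
    "error_class": "RateLimitError",  # new: exception class name
    "llm_provider": "openai",         # new: provider that raised the error
}
```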
Full Changelog: https://github.com/BerriAI/litellm/compare/v1.55.0...v1.55.0.dev2
Docker Run LiteLLM Proxy
```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.55.0.dev2
```
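Once the container is running, the proxy serves an OpenAI-compatible API on the mapped port. A minimal sketch of a client call using the `openai` Python SDK, assuming a model alias `gpt-4o` and a key `sk-1234` configured on your proxy (both are placeholders):

```python
# Minimal sketch: calling the proxy started above with the OpenAI Python SDK.
# "gpt-4o" is a placeholder for whatever model alias your proxy config defines,
# and "sk-1234" stands in for your proxy's master or virtual key.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # the port mapped in the docker run above
    api_key="sk-1234",
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello from the LiteLLM proxy!"}],
)
print(response.choices[0].message.content)
```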
Don't want to maintain your internal proxy? Get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 210.0 | 236.69 | 6.13 | 0.0 | 1835 | 0 | 175.70 | 4096.70 |
| Aggregated | Passed ✅ | 210.0 | 236.69 | 6.13 | 0.0 | 1835 | 0 | 175.70 | 4096.70 |
Attachments:
1. load_test.html (1.59 MB)
2. load_test_stats.csv (540 B)