A new LiteLLM Stable release just went out. Here are 5 updates since v1.52.2-stable.

langfuse, fallbacks, new models, azure_storage
Langfuse Prompt Management
This makes it easy to run experiments or swap the specific model (e.g. gpt-4o to gpt-4o-mini) on Langfuse, instead of making changes in your application. Start here
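As a rough sketch of what this looks like from the Python SDK: the `langfuse/` model prefix, the `prompt_id` value, and the `prompt_variables` parameter below are assumptions for illustration, so check the linked docs for the exact interface in this release.

```python
import os
import litellm

# Langfuse credentials used by LiteLLM to fetch the managed prompt
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."

# The "langfuse/" prefix routes the call through Langfuse Prompt Management,
# so the prompt (and the model it pins, e.g. gpt-4o vs gpt-4o-mini) is
# resolved on Langfuse instead of being hard-coded in the application.
response = litellm.completion(
    model="langfuse/gpt-4o",
    prompt_id="my-chat-prompt",             # hypothetical prompt name on Langfuse
    prompt_variables={"user_name": "Bob"},  # assumed variable-substitution param
    messages=[{"role": "user", "content": "Hi there"}],
)
print(response.choices[0].message.content)
```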
Control fallback prompts client-side
Claude prompts are different from OpenAI's. Pass in prompts specific to each model when doing fallbacks. Start here
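A minimal sketch of a client-side fallback prompt against the LiteLLM proxy, using the OpenAI SDK. The `fallbacks` request-body shape (a list of entries with their own `model` and `messages`) is an assumption here; the linked docs have the supported schema.

```python
from openai import OpenAI

# Point the OpenAI SDK at a running LiteLLM proxy (key is a placeholder).
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# The idea: if the primary gpt-4o call fails, the Claude fallback uses its
# own Claude-style prompt instead of reusing the OpenAI one.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize this support ticket: ..."}],
    extra_body={
        "fallbacks": [
            {
                "model": "claude-3-5-sonnet",  # assumed fallback entry format
                "messages": [
                    {"role": "user", "content": "Prompt phrased for Claude: ..."}
                ],
            }
        ]
    },
)
print(response.choices[0].message.content)
```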
New Providers / Models
- NVIDIA Triton /infer endpoint. Start here
- Infinity Rerank Models. Start here (see the sketch after this list)
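For example, a hedged sketch of calling an Infinity rerank model through `litellm.rerank`; the `infinity/` model prefix, model name, and `api_base` below are placeholders rather than confirmed values from this release.

```python
import litellm

# Rerank a few documents against a query using a self-hosted Infinity server.
response = litellm.rerank(
    model="infinity/BAAI/bge-reranker-base",  # assumed "infinity/" prefix + model
    api_base="http://localhost:7997",         # assumed local Infinity endpoint
    query="best practices for rate limiting",
    documents=[
        "Token bucket algorithms smooth bursts of traffic.",
        "Chocolate cake recipes for beginners.",
        "Sliding-window counters are simple to implement.",
    ],
    top_n=2,
)
# Results mirror the Cohere-style rerank response: index + relevance score.
print(response.results)
```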
✨ Azure Data Lake Storage Support
Send LLM usage (spend, tokens) data to Azure Data Lake Storage. This makes it easy to consume usage data from other services (e.g. Databricks). Start here
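A rough sketch of enabling the `azure_storage` callback from Python; on the proxy this is normally set via the config's `litellm_settings.callbacks`, and the environment variable names below are assumptions, so verify them against the linked docs.

```python
import os
import litellm

# Destination + credentials for the Azure Data Lake (ADLS Gen2) log sink.
# Exact env var names are assumptions for illustration.
os.environ["AZURE_STORAGE_ACCOUNT_NAME"] = "my-account"
os.environ["AZURE_STORAGE_FILE_SYSTEM"] = "litellm-logs"  # ADLS file system / container
os.environ["AZURE_STORAGE_TENANT_ID"] = "..."
os.environ["AZURE_STORAGE_CLIENT_ID"] = "..."
os.environ["AZURE_STORAGE_CLIENT_SECRET"] = "..."

# Enable the azure_storage callback so spend/token usage for each request is
# shipped to Azure Data Lake (may be proxy-only; on the proxy use
# litellm_settings: callbacks: ["azure_storage"] in the config).
litellm.callbacks = ["azure_storage"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
```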
Docker Run LiteLLM
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.55.8-stable
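Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A quick smoke test from Python (the API key below is a placeholder for your proxy's master or virtual key):

```python
from openai import OpenAI

# Point the OpenAI SDK at the local LiteLLM proxy started above.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# List the models the proxy is currently serving.
for model in client.models.list():
    print(model.id)
```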
Get Daily Updates
LiteLLM ships new releases every day. Follow us on LinkedIn to get daily updates.