# [BETA] /v1/messages

LiteLLM provides a BETA endpoint that follows the spec of Anthropic's /v1/messages endpoint.

This currently only supports the Anthropic API.

| Feature | Supported | Notes |
| --- | --- | --- |
| Cost Tracking | ✅ | |
| Logging | ✅ | works across all integrations |
| End-user Tracking | ✅ | |
| Streaming | ✅ | |
| Fallbacks | ✅ | between anthropic models |
| Loadbalancing | ✅ | between anthropic models |
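
Loadbalancing between anthropic models works the same way as elsewhere in LiteLLM: list multiple deployments under the same `model_name`, and the proxy spreads /v1/messages traffic across them. A minimal config sketch, assuming you have two Anthropic API keys (the env var names below are placeholders):

```yaml
model_list:
  # two deployments behind one alias -> requests to "anthropic-claude" are loadbalanced
  - model_name: anthropic-claude
    litellm_params:
      model: claude-3-7-sonnet-latest
      api_key: os.environ/ANTHROPIC_API_KEY_1
  - model_name: anthropic-claude
    litellm_params:
      model: claude-3-7-sonnet-latest
      api_key: os.environ/ANTHROPIC_API_KEY_2
```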

Planned improvements:

- Vertex AI Anthropic support
- Bedrock Anthropic support

## Usage

1. Setup config.yaml

```yaml
model_list:
  - model_name: anthropic-claude
    litellm_params:
      model: claude-3-7-sonnet-latest
```
2. Start proxy

```bash
litellm --config /path/to/config.yaml
```
  1. Test it!
curl -L -X POST 'http://0.0.0.0:4000/v1/messages' \
-H 'content-type: application/json' \
-H 'x-api-key: $LITELLM_API_KEY' \
-H 'anthropic-version: 2023-06-01' \
-d '{
"model": "anthropic-claude",
"messages": [
{
"role": "user",
"content": [
{
"type": "text",
"text": "List 5 important events in the XIX century"
}
]
}
],
"max_tokens": 4096
}'
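
Streaming follows the Anthropic spec: add `"stream": true` to the request body and the proxy returns server-sent events. A minimal sketch reusing the request above (same model alias and placeholder API key):

```bash
curl -L -X POST 'http://0.0.0.0:4000/v1/messages' \
  -H 'content-type: application/json' \
  -H "x-api-key: $LITELLM_API_KEY" \
  -H 'anthropic-version: 2023-06-01' \
  -d '{
    "model": "anthropic-claude",
    "stream": true,
    "messages": [
      {"role": "user", "content": "List 5 important events in the XIX century"}
    ],
    "max_tokens": 4096
  }'
```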