Open WebUI with LiteLLM
This guide walks you through connecting Open WebUI to LiteLLM. Using LiteLLM with Open WebUI allows teams to:
- Access 100+ LLMs on Open WebUI
- Track spend/usage and set budget limits
- Send request/response logs to logging destinations like Langfuse, S3, and GCS buckets
- Set access controls, e.g. control which models Open WebUI can access
Quickstart
- Make sure to set up LiteLLM with the LiteLLM Getting Started Guide
1. Start LiteLLM & Open WebUI
- Open WebUI starts running on http://localhost:3000
- LiteLLM starts running on http://localhost:4000
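Before moving on, you can confirm both services are reachable. A minimal sketch in Python, assuming the default ports above and LiteLLM's /health/liveliness endpoint:
```python
import requests

# Quick reachability check for both services before continuing.
# Adjust the URLs if you mapped different ports.
for name, url in {
    "Open WebUI": "http://localhost:3000",
    "LiteLLM": "http://localhost:4000/health/liveliness",
}.items():
    resp = requests.get(url, timeout=5)
    print(f"{name}: HTTP {resp.status_code}")
```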
2. Create a Virtual Key on LiteLLM
Virtual Keys are API Keys that allow you to authenticate to LiteLLM Proxy. We will create a Virtual Key that will allow Open WebUI to access LiteLLM.
2.1 LiteLLM User Management Hierarchy
On LiteLLM, you can create Organizations, Teams, Users and Virtual Keys. For this tutorial, we will create a Team and a Virtual Key.
Organization
- An Organization is a group of Teams. (US Engineering, EU Developer Tools)

Team
- A Team is a group of Users. (Open WebUI Team, Data Science Team, etc.)

User
- A User is an individual user (employee, developer, e.g. krrish@litellm.ai)

Virtual Key
- A Virtual Key is an API Key that allows you to authenticate to LiteLLM Proxy. A Virtual Key is associated with a User or Team.
Once the Team is created, you can invite Users to the Team. You can read more about LiteLLM's User Management here.
2.2 Create a Team on LiteLLM
Navigate to http://localhost:4000/ui and create a new team.
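If you prefer to script this step, here is a sketch of the same operation against LiteLLM's /team/new endpoint; the master key and team alias below are placeholders:
```python
import requests

LITELLM_BASE_URL = "http://localhost:4000"
MASTER_KEY = "sk-1234"  # assumption: your LiteLLM master key

# Create a team programmatically instead of through the UI.
resp = requests.post(
    f"{LITELLM_BASE_URL}/team/new",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    json={"team_alias": "open-webui-team"},
)
resp.raise_for_status()
print("team_id:", resp.json()["team_id"])
```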

2.3 Create a Virtual Key on LiteLLM
Navigate to http://localhost:4000/ui and create a new Virtual Key.
LiteLLM allows you to control which models are available on Open WebUI by specifying the models the key has access to.
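The UI is the easiest way to do this, but the same key can be generated via LiteLLM's /key/generate endpoint. A sketch, assuming a placeholder master key, model names, and team id:
```python
import requests

LITELLM_BASE_URL = "http://localhost:4000"
MASTER_KEY = "sk-1234"  # assumption: your LiteLLM master key

# Generate a Virtual Key scoped to specific models; Open WebUI will
# only be able to see and call the models listed here.
resp = requests.post(
    f"{LITELLM_BASE_URL}/key/generate",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    json={
        "models": ["gpt-4o", "claude-3-5-sonnet"],  # assumption: models in your config
        "team_id": "open-webui-team-id",            # assumption: team from step 2.2
    },
)
resp.raise_for_status()
print("virtual key:", resp.json()["key"])
```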

3. Connect Open WebUI to LiteLLM
On Open WebUI, navigate to Settings -> Connections and create a new connection to LiteLLM.
Enter the following details:
- URL: http://localhost:4000 (your LiteLLM Proxy base URL)
- Key: your-virtual-key (the key you created in the previous step)
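Before saving the connection, you can confirm the key works from the command line; the proxy's OpenAI-compatible /v1/models endpoint returns exactly the models the key can access. A minimal sketch:
```python
import requests

# List the models visible to the Virtual Key; this is the same list
# Open WebUI will show once the connection is saved.
resp = requests.get(
    "http://localhost:4000/v1/models",
    headers={"Authorization": "Bearer your-virtual-key"},
)
for model in resp.json()["data"]:
    print(model["id"])
```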

3.1 Test Request
In the top left corner, select a model. You should only see the models you gave the key access to in Step 2.
Once you have selected a model, enter your message content and click Submit.
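The UI request is equivalent to an OpenAI-compatible chat completion against the proxy. A sketch using the OpenAI Python SDK, with a placeholder model name and key:
```python
from openai import OpenAI

# The same request Open WebUI makes under the hood, sent directly to
# the proxy with the OpenAI SDK.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="your-virtual-key")
response = client.chat.completions.create(
    model="gpt-4o",  # assumption: a model the key has access to
    messages=[{"role": "user", "content": "Hello from Open WebUI!"}],
)
print(response.choices[0].message.content)
```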

3.2 Tracking Usage & Spend
Basic Tracking
After making requests, navigate to the Logs section in the LiteLLM UI to view Model, Usage, and Cost information.
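If you'd rather pull these logs programmatically, the proxy exposes a /spend/logs endpoint; the api_key filter below is an assumption based on the spend-tracking API, so check your version's docs:
```python
import requests

MASTER_KEY = "sk-1234"  # assumption: your LiteLLM master key

# Fetch recent request logs for the Open WebUI key instead of using
# the UI. Filtering by api_key is an assumption; the response is
# expected to be a list of log entries.
resp = requests.get(
    "http://localhost:4000/spend/logs",
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
    params={"api_key": "your-virtual-key"},
)
for entry in resp.json():
    print(entry.get("model"), entry.get("spend"))
```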
Per-User Tracking
To track spend and usage for each Open WebUI user, configure both Open WebUI and LiteLLM:
- Enable User Info Headers in Open WebUI
Set the following environment variable for Open WebUI to enable user information in request headers:
```shell
ENABLE_FORWARD_USER_INFO_HEADERS=True
```
For more details, see the Environment Variable Configuration Guide.
- Configure LiteLLM to Parse User Headers
Add the following to your LiteLLM config.yaml to specify a header to use for user tracking:
```yaml
general_settings:
  user_header_name: X-OpenWebUI-User-Id
```
Available tracking options
You can use any of the following headers for user_header_name:
- X-OpenWebUI-User-Id
- X-OpenWebUI-User-Email
- X-OpenWebUI-User-Name
These may offer better readability and easier mental attribution when hosting for a small group of users that you know well.
Choose based on your needs, but note that in Open WebUI:
- Users can modify their own usernames
- Administrators can modify both usernames and emails of any account
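To see what LiteLLM receives, here is a sketch that hand-crafts the header Open WebUI would normally inject; the user id and model name are placeholders:
```python
import requests

# Simulate the header Open WebUI forwards when
# ENABLE_FORWARD_USER_INFO_HEADERS=True is set; LiteLLM attributes this
# request's spend to "alice" via the configured user_header_name.
resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={
        "Authorization": "Bearer your-virtual-key",
        "X-OpenWebUI-User-Id": "alice",  # normally injected by Open WebUI
    },
    json={
        "model": "gpt-4o",  # assumption: a model the key has access to
        "messages": [{"role": "user", "content": "Hi!"}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```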
Render thinking content on Open WebUI
Open WebUI requires reasoning/thinking content to be rendered with <think></think> tags. In order to render this for specific models, you can use the merge_reasoning_content_in_choices litellm parameter.
Example litellm config.yaml:
```yaml
model_list:
  - model_name: thinking-anthropic-claude-3-7-sonnet
    litellm_params:
      model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
      thinking: {"type": "enabled", "budget_tokens": 1024}
      max_tokens: 1080
      merge_reasoning_content_in_choices: true
```
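Before testing in the UI, you can verify the merged output directly. A sketch using the OpenAI SDK against the proxy, with a placeholder virtual key:
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000/v1", api_key="your-virtual-key")
response = client.chat.completions.create(
    model="thinking-anthropic-claude-3-7-sonnet",
    messages=[{"role": "user", "content": "Which is larger: 9.11 or 9.9?"}],
)
# With merge_reasoning_content_in_choices: true, the reasoning arrives
# inline, wrapped in <think></think> tags that Open WebUI can render.
print(response.choices[0].message.content)
```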
Test it on Open WebUI
In the models dropdown, select thinking-anthropic-claude-3-7-sonnet.

Additional Resources
- Running LiteLLM and Open WebUI on Windows Localhost: A Comprehensive Guide https://www.tanyongsheng.com/note/running-litellm-and-openwebui-on-windows-localhost-a-comprehensive-guide/
Add Custom Headers to Spend Tracking
You can add custom headers to the request to track spend and usage.
```yaml
litellm_settings:
  extra_spend_tag_headers:
    - "x-custom-header"
```