Troubleshooting & Support
Information to Provide When Seeking Help
When reporting issues, please include as much of the following as possible. It's okay if you can't provide everything, especially in production scenarios where the trigger might be unknown. Sharing most of this information will help us assist you more effectively.
1. LiteLLM Configuration File
Your config.yaml file (redact sensitive info such as API keys). Include the number of workers if it is not set in the config.
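For reference, here is a minimal, redacted sketch of what a shareable config might look like. The model names and settings below are placeholders, not a recommended configuration:

```yaml
# Hypothetical, redacted example -- replace with your actual (sanitized) config.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: "sk-<redacted>"

general_settings:
  master_key: "sk-<redacted>"

litellm_settings:
  num_retries: 2
```

Keeping the structure intact while replacing secrets with `<redacted>` lets us see routing and settings without exposing credentials.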
2. Initialization Command
The command used to start LiteLLM (e.g., litellm --config config.yaml --num_workers 8 --detailed_debug).
3. LiteLLM Version
- Current version
- Version when the issue first appeared (if different)
- If you upgraded, the version you changed from → to
4. Environment Variables
Non-sensitive environment variables not in your config (e.g., NUM_WORKERS, LITELLM_LOG, LITELLM_MODE). Do not include passwords or API keys.
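One way to collect these without leaking secrets is to dump matching variables and mask any value whose name looks sensitive. A rough sketch (the grep pattern and the list of secret-looking name fragments are assumptions; adjust both to your deployment):

```shell
# List LiteLLM-related environment variables, masking the values of any
# variable whose name suggests a secret. Patterns are examples, not exhaustive.
env | grep -E '^(LITELLM|NUM_WORKERS|DATABASE|REDIS)' \
  | sed -E 's/^([A-Z_]*(KEY|SECRET|PASSWORD|TOKEN|URL)[A-Z_]*)=.*/\1=<redacted>/'
```

This prints safe values (e.g., LITELLM_LOG=DEBUG) as-is while collapsing anything like LITELLM_MASTER_KEY or DATABASE_URL to `<redacted>`.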
5. Server Specifications
CPU cores, RAM, OS, number of instances/replicas, etc.
6. Database and Redis Usage
- Database: whether a database is in use (DATABASE_URL set), database type and version
- Redis: whether Redis is in use, Redis version, configuration type (Standalone/Cluster/Sentinel)
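A quick way to report the database details without exposing credentials is to strip the user:password portion from the connection URL. A hedged sketch using plain shell parameter expansion (the URL below is a placeholder; in practice substitute your DATABASE_URL):

```shell
# Placeholder connection string; substitute "$DATABASE_URL" in practice.
url='postgresql://litellm_user:secret@db.internal:5432/litellm'
scheme=${url%%://*}          # database type, e.g. "postgresql"
rest=${url#*://}             # drop the scheme
host_and_db=${rest#*@}       # drop "user:password@" (safe to share)
echo "db_type=$scheme location=$host_and_db"
# prints: db_type=postgresql location=db.internal:5432/litellm
```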
7. Endpoints
The endpoint(s) you're using that are experiencing issues (e.g., /chat/completions, /embeddings).
8. Request Example
A realistic example of the request causing issues, including expected vs. actual response and any error messages.
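A convenient pattern is to save the exact request body to a file, so the same JSON can be attached to the report and replayed. A hypothetical sketch against the /chat/completions endpoint (host, port, model, and key are placeholders):

```shell
# Save the failing request body so it can be shared verbatim (placeholders below).
cat > /tmp/litellm_repro.json <<'EOF'
{"model": "gpt-4o", "messages": [{"role": "user", "content": "hello"}]}
EOF
# Replay it against your proxy (uncomment and adjust host/port/key):
# curl -sS http://localhost:4000/chat/completions \
#   -H "Authorization: Bearer sk-<redacted>" \
#   -H "Content-Type: application/json" \
#   -d @/tmp/litellm_repro.json
python3 -c 'import json; json.load(open("/tmp/litellm_repro.json"))' && echo "body is valid JSON"
```

Attaching the saved body alongside the expected vs. actual response makes the failure reproducible on our side.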
9. Error Logs, Stack Traces, and Metrics
Full error logs, stack traces, and any images from service metrics (CPU, memory, request rates, etc.) that might help diagnose the issue.
UI Issues
If you're experiencing issues with the LiteLLM Admin UI, please include the following information in addition to the general details above.
1. Steps to Reproduce
A clear, step-by-step description of how to trigger the issue (e.g., "Navigate to Settings → Team, click 'Create Team', fill in fields, click submit → error appears").
2. LiteLLM Version
The current version of LiteLLM you're running. Check via litellm --version or the UI's settings page.
3. Architecture & Deployment Setup
Distributed environments are a known source of UI issues. Please describe:
- Number of LiteLLM instances/replicas and how they are deployed (e.g., Kubernetes, Docker Compose, ECS)
- Load balancer type and configuration (e.g., ALB, Nginx, Cloudflare Tunnel), including whether sticky sessions are enabled
- How the UI is accessed: directly via LiteLLM, through a reverse proxy, or behind an ingress controller
- Any CDN or caching layers between the user and the LiteLLM server
4. Network Tab Requests
Open your browser's Developer Tools (F12 → Network tab), reproduce the issue, and share:
- The failing request(s): URL, method, status code, and response body
- Screenshots or HAR export of the relevant network activity
- Any CORS or mixed-content errors shown in the Console tab
5. Environment Variables
Non-sensitive environment variables related to the UI and proxy setup, such as:
- LITELLM_MASTER_KEY (confirm it is set; do not share the value)
- PROXY_BASE_URL / LITELLM_PROXY_BASE_URL
- UI_BASE_PATH
- Any SSO-related variables (e.g., GOOGLE_CLIENT_ID, MICROSOFT_TENANT)
Do not include passwords, secrets, or API keys.
6. Browser & Access Details
- Browser and version (e.g., Chrome 120, Firefox 121)
- Access URL used to reach the UI (redact sensitive parts)
- Whether the issue occurs for all users or specific roles (Admin, Internal User, etc.)
7. Screenshots or Screen Recordings
A screenshot or short screen recording of the issue is extremely helpful. Include any visible error messages, toasts, or unexpected behavior.
Support Channels
Community Discord / Community Slack
Our numbers: +1 (770) 8783-106 / +1 (412) 618-6238
Our emails: ishaan@berri.ai / krrish@berri.ai