Troubleshooting & Support

Information to Provide When Seeking Help

When reporting issues, please include as much of the following as possible. It's okay if you can't provide everything, especially in production scenarios where the trigger might be unknown. Sharing most of this information will help us assist you more effectively.

1. LiteLLM Configuration File

Your config.yaml file (redact sensitive information such as API keys). If the number of workers isn't specified in the config, include it as well.
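
If it helps, here is a minimal sketch of how to produce a redacted copy before sharing; it assumes your secrets appear on api_key / master_key / password lines, so adjust the pattern to match your file:

  # Hedged sketch: mask secret values in config.yaml before attaching it to a report.
  # The key names in the pattern are assumptions; edit them to match your config.
  sed -E 's/(api_key|master_key|password)([[:space:]]*:).*/\1\2 "<redacted>"/' \
      config.yaml > config.redacted.yaml
  cat config.redacted.yaml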

2. Initialization Command

The command used to start LiteLLM (e.g., litellm --config config.yaml --num_workers 8 --detailed_debug).

3. LiteLLM Version

  • Current version
  • Version when the issue first appeared (if different)
  • If you upgraded, the version change (from → to)
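
If you're not sure of the exact version, either of these will show it (assuming a pip-based install; for Docker, the image tag is usually enough):

  litellm --version
  pip show litellm | grep -i '^version'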

4. Environment Variables

Non-sensitive environment variables not in your config (e.g., NUM_WORKERS, LITELLM_LOG, LITELLM_MODE). Do not include passwords or API keys.
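
A quick way to collect these without leaking secrets is to filter the environment; the variable-name patterns below are illustrative, so adjust them to your deployment:

  # Print LiteLLM-related variables, dropping anything that looks like a secret.
  env | grep -E 'LITELLM|NUM_WORKERS' | grep -viE 'key|secret|password|token'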

5. Server Specifications

CPU cores, RAM, OS, number of instances/replicas, etc.
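
On a Linux host these standard commands cover most of it (for Kubernetes, the deployment name below is a placeholder):

  nproc                # CPU cores
  free -h              # RAM
  cat /etc/os-release  # OS / distribution
  # Replica count, if you deploy on Kubernetes:
  kubectl get deploy <litellm-deployment> -o jsonpath='{.spec.replicas}'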

6. Database and Redis Usage

  • Database: whether a database is in use (i.e., DATABASE_URL is set), plus the database type and version
  • Redis: whether Redis is in use, plus the Redis version and the configuration type (Standalone / Cluster / Sentinel); the sketch below shows quick version checks for both
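
For the version checks, something like the following works when psql and redis-cli are available (the Redis host, port, and auth details are placeholders):

  # Postgres version, using the same DATABASE_URL the proxy uses
  psql "$DATABASE_URL" -c 'SELECT version();'
  # Redis version; add -a <password> or --tls if your deployment requires it
  redis-cli -h <redis-host> -p 6379 INFO server | grep redis_version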

7. Endpoints

The endpoint(s) where you're seeing the issue (e.g., /chat/completions, /embeddings).

8. Request Example

A realistic example of the request causing issues, including expected vs. actual response and any error messages.
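
As a hedged template, a minimal curl against /chat/completions might look like this; the host/port, model alias, and the $LITELLM_TEST_KEY variable are placeholders, so substitute your own values and use a key you can safely share or rotate:

  curl -sS http://localhost:4000/chat/completions \
    -H 'Content-Type: application/json' \
    -H "Authorization: Bearer $LITELLM_TEST_KEY" \
    -d '{
          "model": "gpt-4o",
          "messages": [{"role": "user", "content": "hello"}]
        }'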

9. Error Logs, Stack Traces, and Metrics

Full error logs, stack traces, and any images from service metrics (CPU, memory, request rates, etc.) that might help diagnose the issue.
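
One simple way to capture these is to restart the proxy with detailed debug enabled and tee the output to a file (the filename is arbitrary):

  litellm --config config.yaml --detailed_debug 2>&1 | tee litellm-debug.log
  # Alternatively, raise verbosity with the LITELLM_LOG variable mentioned above:
  LITELLM_LOG=DEBUG litellm --config config.yaml 2>&1 | tee litellm-debug.log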


UI Issues​

If you're experiencing issues with the LiteLLM Admin UI, please include the following information in addition to the general details above.

1. Steps to Reproduce

A clear, step-by-step description of how to trigger the issue (e.g., "Navigate to Settings → Team, click 'Create Team', fill in fields, click submit → error appears").

2. LiteLLM Version

The current version of LiteLLM you're running. Check via litellm --version or the UI's settings page.

3. Architecture & Deployment Setup

Distributed environments are a known source of UI issues. Please describe:

  • Number of LiteLLM instances/replicas and how they are deployed (e.g., Kubernetes, Docker Compose, ECS)
  • Load balancer type and configuration (e.g., ALB, Nginx, Cloudflare Tunnel), including whether sticky sessions are enabled
  • How the UI is accessed: directly via LiteLLM, through a reverse proxy, or behind an ingress controller
  • Any CDN or caching layers between the user and the LiteLLM server

4. Network Tab Requests

Open your browser's Developer Tools (F12 → Network tab), reproduce the issue, and share the items below; a curl sketch for replaying a failing request follows the list:

  • The failing request(s): URL, method, status code, and response body
  • Screenshots or HAR export of the relevant network activity
  • Any CORS or mixed-content errors shown in the Console tab
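
Where possible, replaying the failing request outside the browser helps separate UI problems from proxy problems; in this sketch the URL, method, and token are all placeholders copied from your own Network tab entry:

  # Replay a failing request captured in the Network tab (all values are placeholders).
  curl -sS -D - -o response.json \
    -X GET "https://<your-proxy-host>/<failing-path>" \
    -H "Authorization: Bearer <redacted>"
  # -D - prints the response headers; the body is written to response.json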

5. Environment Variables

Non-sensitive environment variables related to the UI and proxy setup, such as:

  • LITELLM_MASTER_KEY (whether it is set, not its value)
  • PROXY_BASE_URL / LITELLM_PROXY_BASE_URL
  • UI_BASE_PATH
  • Any SSO-related variables (e.g., GOOGLE_CLIENT_ID, MICROSOFT_TENANT)

Do not include passwords, secrets, or API keys.
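
As in the general section above, a filtered dump of the environment keeps secret values out of the report (the variable-name patterns are illustrative):

  env | grep -E 'PROXY_BASE_URL|UI_BASE_PATH|GOOGLE_CLIENT_ID|MICROSOFT_TENANT' \
      | grep -viE 'secret|password'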

6. Browser & Access Details

  • Browser and version (e.g., Chrome 120, Firefox 121)
  • Access URL used to reach the UI (redact sensitive parts)
  • Whether the issue occurs for all users or specific roles (Admin, Internal User, etc.)

7. Screenshots or Screen Recordings

A screenshot or short screen recording of the issue is extremely helpful. Include any visible error messages, toasts, or unexpected behavior.


Support Channels​

Schedule Demo 👋

Community Discord 💭 / Community Slack 💭

Our numbers 📞 +1 (770) 8783-106 / +1 (412) 618-6238

Our emails ✉️ ishaan@berri.ai / krrish@berri.ai

Chat on WhatsApp / Chat on Discord