🔥 Logfire - Logging LLM Input/Output

Logfire is open-source observability & analytics for LLM apps, providing detailed production traces and a granular view on quality, cost, and latency.

info

We want to learn how we can make the callbacks better! Meet the LiteLLM founders or join our Discord.

Pre-Requisites

Ensure you have installed the following packages to use this integration:

pip install litellm

pip install opentelemetry-api==1.25.0
pip install opentelemetry-sdk==1.25.0
pip install opentelemetry-exporter-otlp==1.25.0

Quick Start

Get your Logfire token from https://logfire.pydantic.dev/

litellm.callbacks = ["logfire"]
# pip install logfire
import litellm
import os

# from https://logfire.pydantic.dev/
os.environ["LOGFIRE_TOKEN"] = ""

# LLM API Keys
os.environ['OPENAI_API_KEY']=""

# set logfire as a callback, litellm will send the data to logfire
litellm.success_callback = ["logfire"]

# openai call
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
)
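
The same callback configuration also covers async requests. Below is a minimal sketch using litellm.acompletion, assuming LOGFIRE_TOKEN and OPENAI_API_KEY are set as in the example above:

# async call - reuses the same logfire callback configuration
import asyncio
import litellm

litellm.success_callback = ["logfire"]

async def main():
    response = await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "user", "content": "Hi 👋 - i'm openai"}
        ],
    )
    print(response)

asyncio.run(main())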

Support & Talk to Founders