
# Using Web Search

Use web search with `litellm`.

| Feature | Details |
|---------|---------|
| Supported Endpoints | `/chat/completions`, `/responses` |
| Supported Providers | `openai`, `xai`, `vertex_ai`, `anthropic`, `gemini`, `perplexity` |
| LiteLLM Cost Tracking | ✅ Supported |
| LiteLLM Version | v1.71.0+ |

## Which Search Engine is Used?

Each provider uses their own search backend:

| Provider | Search Engine | Notes |
|----------|---------------|-------|
| OpenAI (`gpt-4o-search-preview`) | OpenAI's internal search | Real-time web data |
| xAI (`grok-3`) | xAI's search + X/Twitter | Real-time social media data |
| Google AI/Vertex (`gemini-2.0-flash`) | Google Search | Uses actual Google search results |
| Anthropic (`claude-3-5-sonnet`) | Anthropic's web search | Real-time web data |
| Perplexity | Perplexity's search engine | AI-powered search and reasoning |
:::info
**Anthropic Web Search Models:** Claude models that support web search: `claude-3-5-sonnet-latest`, `claude-3-5-sonnet-20241022`, `claude-3-5-haiku-latest`, `claude-3-5-haiku-20241022`, `claude-3-7-sonnet-20250219`
:::

## `/chat/completions` (`litellm.completion`)

### Quick Start

```python
from litellm import completion

response = completion(
    model="openai/gpt-4o-search-preview",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?",
        }
    ],
    web_search_options={
        "search_context_size": "medium"  # Options: "low", "medium" (default), "high"
    }
)
```
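The call returns the standard chat-completions response shape. A minimal sketch of pulling out the answer text and any URL citations, assuming the OpenAI-style `annotations` / `url_citation` fields (some providers may omit them, in which case the URL list is simply empty):

```python
# Sketch: extract answer text and cited URLs from a chat-completions
# response. Field names assume the OpenAI-style shape
# (message.annotations[].url_citation.url); annotations may be absent.

def extract_answer(response):
    message = response.choices[0].message
    text = message.content
    urls = []
    for annotation in getattr(message, "annotations", None) or []:
        citation = getattr(annotation, "url_citation", None)
        if citation is not None:
            urls.append(citation.url)
    return text, urls
```

For example, `text, urls = extract_answer(response)` after the call above.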

### Search context size

**OpenAI (using `web_search_options`)**

```python
from litellm import completion

# Customize search context size
response = completion(
    model="openai/gpt-4o-search-preview",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?",
        }
    ],
    web_search_options={
        "search_context_size": "low"  # Options: "low", "medium" (default), "high"
    }
)
```

**xAI (using `web_search_options`)**

```python
from litellm import completion

# Customize search context size for xAI
response = completion(
    model="xai/grok-3",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?",
        }
    ],
    web_search_options={
        "search_context_size": "high"  # Options: "low", "medium" (default), "high"
    }
)
```

**Anthropic (using `web_search_options`)**

```python
from litellm import completion

# Customize search context size for Anthropic
response = completion(
    model="anthropic/claude-3-5-sonnet-latest",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?",
        }
    ],
    web_search_options={
        "search_context_size": "medium",  # Options: "low", "medium" (default), "high"
        "user_location": {
            "type": "approximate",
            "approximate": {
                "city": "San Francisco",
            },
        }
    }
)
```
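The `approximate` object is not limited to `city`. A hedged sketch of the fuller `user_location` shape (field names follow OpenAI's chat-completions web search options, which Anthropic's web search tool mirrors; all fields inside `approximate` are optional):

```python
# Sketch: fuller user_location shape. All approximate fields are
# optional; the names below are the OpenAI-style ones.
web_search_options = {
    "search_context_size": "medium",
    "user_location": {
        "type": "approximate",
        "approximate": {
            "city": "San Francisco",
            "region": "California",
            "country": "US",  # two-letter ISO country code
            "timezone": "America/Los_Angeles",  # IANA timezone ID
        },
    },
}
```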

**VertexAI/Gemini (using `web_search_options`)**

```python
from litellm import completion

# Customize search context size for Gemini
response = completion(
    model="gemini-2.0-flash",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?",
        }
    ],
    web_search_options={
        "search_context_size": "low"  # Options: "low", "medium" (default), "high"
    }
)
```

## `/responses` (`litellm.responses`)

### Quick Start

```python
from litellm import responses

response = responses(
    model="openai/gpt-4o",
    input=[
        {
            "role": "user",
            "content": "What was a positive news story from today?"
        }
    ],
    tools=[{
        "type": "web_search_preview"  # enables web search with default medium context size
    }]
)
```

### Search context size

```python
from litellm import responses

# Customize search context size
response = responses(
    model="openai/gpt-4o",
    input=[
        {
            "role": "user",
            "content": "What was a positive news story from today?"
        }
    ],
    tools=[{
        "type": "web_search_preview",
        "search_context_size": "low"  # Options: "low", "medium" (default), "high"
    }]
)
```

## Configuring Web Search in `config.yaml`

You can set default web search options directly in your proxy config file:

```yaml
model_list:
  # Enable web search by default for all requests to this model
  - model_name: grok-3
    litellm_params:
      model: xai/grok-3
      api_key: os.environ/XAI_API_KEY
      web_search_options: {}  # Enables web search with default settings
```

**Note:** When `web_search_options` is set in the config, it applies to all requests to that model. Users can still override these settings by passing `web_search_options` in their API requests.
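For example, to default a model to low search context, the nested options can be set the same way (a sketch assuming the config accepts the same keys as the request-body `web_search_options`):

```yaml
model_list:
  - model_name: grok-3-low-search
    litellm_params:
      model: xai/grok-3
      api_key: os.environ/XAI_API_KEY
      web_search_options:
        search_context_size: "low"
```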

## Checking if a Model Supports Web Search

Use `litellm.supports_web_search(model="model_name")` — it returns `True` if the model can perform web searches.

```python
import litellm

# Check OpenAI models
assert litellm.supports_web_search(model="openai/gpt-4o-search-preview")

# Check xAI models
assert litellm.supports_web_search(model="xai/grok-3")

# Check Anthropic models
assert litellm.supports_web_search(model="anthropic/claude-3-5-sonnet-latest")

# Check VertexAI models
assert litellm.supports_web_search(model="gemini-2.0-flash")

# Check Google AI Studio models
assert litellm.supports_web_search(model="gemini/gemini-2.0-flash")
```
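One way to use this check is to attach `web_search_options` only when the model supports web search. A sketch (the `supports` predicate is injected so the helper is testable on its own; in practice you would pass `litellm.supports_web_search`):

```python
# Sketch: build completion kwargs, adding web_search_options only when
# the capability check passes. In real use, pass
# supports=litellm.supports_web_search.

def build_search_kwargs(model, supports, context_size="medium"):
    kwargs = {"model": model}
    if supports(model=model):
        kwargs["web_search_options"] = {"search_context_size": context_size}
    return kwargs
```

For example: `completion(messages=messages, **build_search_kwargs("xai/grok-3", litellm.supports_web_search))` — non-search models get a plain completion call instead of an error.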