
Perplexity AI (pplx-api)

https://www.perplexity.ai

API Key

# set your Perplexity API key as an environment variable
os.environ['PERPLEXITYAI_API_KEY'] = "your-api-key"

Sample Usage

from litellm import completion
import os

os.environ['PERPLEXITYAI_API_KEY'] = ""

# build the chat history to send to the model
messages = [{"role": "user", "content": "Hey, how's it going?"}]

response = completion(
    model="perplexity/mistral-7b-instruct",
    messages=messages
)
print(response)
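litellm returns an OpenAI-compatible response object, so the generated text can be read from the first choice. A minimal sketch (the attribute path assumes litellm's standard ModelResponse layout):

# extract just the assistant's reply from the response object
reply = response.choices[0].message.content
print(reply)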

Sample Usage - Streaming

from litellm import completion
import os

os.environ['PERPLEXITYAI_API_KEY'] = ""

# build the chat history to send to the model
messages = [{"role": "user", "content": "Hey, how's it going?"}]

response = completion(
    model="perplexity/mistral-7b-instruct",
    messages=messages,
    stream=True
)

# iterate over the streamed chunks as they arrive
for chunk in response:
    print(chunk)
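To reassemble the full reply from the stream, accumulate the per-chunk deltas. A sketch assuming the chunks expose OpenAI-style delta fields, which litellm normalizes to:

# accumulate the streamed deltas into the full reply;
# the final chunk's delta may carry no content, so guard against None
full_reply = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        full_reply += delta
print(full_reply)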

Supported Models

All models listed at https://docs.perplexity.ai/docs/model-cards are supported.

| Model Name | Function Call |
|---|---|
| pplx-7b-chat | completion(model="perplexity/pplx-7b-chat", messages) |
| pplx-70b-chat | completion(model="perplexity/pplx-70b-chat", messages) |
| pplx-7b-online | completion(model="perplexity/pplx-7b-online", messages) |
| pplx-70b-online | completion(model="perplexity/pplx-70b-online", messages) |
| codellama-34b-instruct | completion(model="perplexity/codellama-34b-instruct", messages) |
| llama-2-13b-chat | completion(model="perplexity/llama-2-13b-chat", messages) |
| llama-2-70b-chat | completion(model="perplexity/llama-2-70b-chat", messages) |
| mistral-7b-instruct | completion(model="perplexity/mistral-7b-instruct", messages) |
| openhermes-2-mistral-7b | completion(model="perplexity/openhermes-2-mistral-7b", messages) |
| openhermes-2.5-mistral-7b | completion(model="perplexity/openhermes-2.5-mistral-7b", messages) |
| pplx-7b-chat-alpha | completion(model="perplexity/pplx-7b-chat-alpha", messages) |
| pplx-70b-chat-alpha | completion(model="perplexity/pplx-70b-chat-alpha", messages) |
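Any of these model names can be used in place of perplexity/mistral-7b-instruct in the samples above. As a sketch, the key can also be passed per call instead of via the environment variable, assuming litellm's api_key keyword argument; the key value and prompt below are placeholders:

from litellm import completion

messages = [{"role": "user", "content": "What happened in tech news today?"}]

response = completion(
    model="perplexity/pplx-7b-online",  # one of the supported model names above
    messages=messages,
    api_key="pplx-..."                  # placeholder, not a real key
)
print(response.choices[0].message.content)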