
Incident Report: vLLM Embeddings Broken by encoding_format Parameter

Sameer Kankute
SWE @ LiteLLM (LLM Translation)
Krrish Dholakia
CEO, LiteLLM
Ishaan Jaff
CTO, LiteLLM

Date: Feb 16, 2026
Duration: ~3 hours
Severity: High (for vLLM embedding users)
Status: Resolved

Summary

A commit (dbcae4a), intended to fix OpenAI SDK behavior, broke vLLM embeddings by explicitly passing `encoding_format=None` in API requests. vLLM rejects such requests with the error: "unknown variant ``, expected `float` or `base64`".
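The failure mode comes down to how the request body is serialized: in Python, including a key whose value is `None` produces JSON `null`, which vLLM's strict parser rejects, whereas omitting the key entirely is accepted. The sketch below illustrates the difference with a hypothetical `build_payload` helper (not LiteLLM's actual code):

```python
import json

def build_payload(input_texts, encoding_format=None):
    """Illustrates buggy vs. fixed request construction."""
    # Buggy: the key is always present, so encoding_format=None
    # serializes to "encoding_format": null, which vLLM rejects with
    # 'unknown variant ``, expected `float` or `base64`'.
    buggy = {"input": input_texts, "encoding_format": encoding_format}

    # Fixed: only include encoding_format when the caller set it.
    fixed = {"input": input_texts}
    if encoding_format is not None:
        fixed["encoding_format"] = encoding_format

    return json.dumps(buggy), json.dumps(fixed)

buggy, fixed = build_payload(["hello world"])
print(buggy)  # {"input": ["hello world"], "encoding_format": null}
print(fixed)  # {"input": ["hello world"]}
```

Omitting absent optional parameters, rather than forwarding them as `null`, is the safer default because strict deserializers (like vLLM's Rust-based serde parser) distinguish between a missing field and an explicit `null`.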

  • vLLM embedding calls: complete failure; all requests were rejected
  • Other providers: no impact; OpenAI and other providers functioned normally
  • Other vLLM functionality: no impact; only embeddings were affected