Incident Report: vLLM Embeddings Broken by encoding_format Parameter
Date: Feb 16, 2026
Duration: ~3 hours
Severity: High (for vLLM embedding users)
Status: Resolved
Summary
A commit (dbcae4a) intended to fix OpenAI SDK behavior broke vLLM embeddings by explicitly passing `encoding_format=None` in API requests, which serializes to `"encoding_format": null` in the request body. vLLM rejects such requests with the error: "unknown variant \`\`, expected float or base64".
- vLLM embedding calls: complete failure; every embedding request was rejected.
- Other providers: no impact; OpenAI and other providers functioned normally.
- Other vLLM functionality: no impact; only embeddings were affected.
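
To illustrate the failure mode, here is a minimal sketch of a request to a vLLM OpenAI-compatible embeddings endpoint. The URL, model name, and helper function are hypothetical placeholders, not taken from the incident; the point is that the `encoding_format` key must be omitted entirely when no value is set, rather than sent as `null`.

```python
# Hypothetical repro/workaround sketch against a vLLM OpenAI-compatible server.
# VLLM_URL and MODEL are placeholders, not values from the incident report.
import requests

VLLM_URL = "http://localhost:8000/v1/embeddings"
MODEL = "my-embedding-model"

def embed(texts, encoding_format=None):
    payload = {"model": MODEL, "input": texts}
    # Sending "encoding_format": null is what triggered the failure:
    # vLLM rejects it with "unknown variant ``, expected float or base64".
    # Only include the key when a real value ("float" or "base64") is given.
    if encoding_format is not None:
        payload["encoding_format"] = encoding_format
    resp = requests.post(VLLM_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return [item["embedding"] for item in resp.json()["data"]]

if __name__ == "__main__":
    vectors = embed(["hello world"])
    print(len(vectors[0]))
```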


