fix(core): handle Anthropic overloaded errors and Unicode encoding issues (#9305)

* fix: handle Anthropic overloaded_error in streaming interfaces

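The Anthropic-side change isn't shown in the hunks below. As background, Anthropic streams report capacity problems as an SSE error event whose error type is "overloaded_error" (HTTP 529). A minimal sketch of detecting such an event mid-stream — the helper and exception names here are illustrative, not Letta's actual code:

```python
# Hypothetical sketch: surface an Anthropic "overloaded_error" stream event
# as a retryable exception instead of letting the stream loop crash.

class LLMOverloadedError(Exception):
    """Raised when the upstream provider reports it is overloaded."""

def check_stream_event(event: dict) -> dict:
    """Pass normal events through; raise on an Anthropic error event."""
    if event.get("type") == "error":
        err = event.get("error", {})
        if err.get("type") == "overloaded_error":
            raise LLMOverloadedError(err.get("message", "provider overloaded"))
        raise RuntimeError(f"stream error: {err}")
    return event

# Usage over a fake event stream:
events = [
    {"type": "content_block_delta", "delta": {"text": "Hi"}},
    {"type": "error", "error": {"type": "overloaded_error", "message": "Overloaded"}},
]
collected = []
overloaded = False
try:
    for ev in events:
        collected.append(check_stream_event(ev))
except LLMOverloadedError:
    overloaded = True  # caller can now back off and retry
```

Mapping the event to a dedicated exception lets callers distinguish a retryable 529 from a genuine failure.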
* fix: handle Unicode surrogates in OpenAI requests

Sanitize Unicode surrogate pairs before sending requests to OpenAI API.
Surrogate code points (U+D800–U+DFFF) are UTF-16 encoding artifacts that
raise a UnicodeEncodeError when a string containing them is encoded to UTF-8.

Fixes Datadog error: 'utf-8' codec can't encode character '\ud83c' in
position 326605: surrogates not allowed
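The failure mode is easy to reproduce in plain Python — a lone surrogate can live inside a str (e.g. from lossy JSON decoding) but cannot be encoded to UTF-8:

```python
# A lone UTF-16 high surrogate is a legal Python str character,
# but UTF-8 forbids surrogate code points:
s = "high surrogate: \ud83c"
raised = False
reason = None
try:
    s.encode("utf-8")
except UnicodeEncodeError as e:
    raised = True
    reason = e.reason  # "surrogates not allowed"
```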

* fix: handle UnicodeEncodeError from lone Unicode surrogates in OpenAI requests

Improved sanitize_unicode_surrogates() to explicitly filter out lone
surrogate characters (U+D800–U+DFFF), which are invalid in UTF-8.

The previous implementation used errors='ignore', which could still fail in
edge cases. The new approach directly checks Unicode code points and removes
any surrogates before the data reaches httpx encoding.

Also added sanitization to the stream_async_responses() method, which was
missing it.

Fixes: 'utf-8' codec can't encode character '\ud83c' in position X:
surrogates not allowed
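The code-point check described above can be sketched as a recursive walk over the request payload. This is a hypothetical illustration of the approach, not the actual sanitize_unicode_surrogates() from letta.helpers.json_helpers:

```python
# Hypothetical sketch: recursively strip surrogate code points
# (U+D800-U+DFFF) from strings anywhere in a JSON-like payload.

def strip_surrogates(value):
    if isinstance(value, str):
        return "".join(ch for ch in value if not 0xD800 <= ord(ch) <= 0xDFFF)
    if isinstance(value, dict):
        return {k: strip_surrogates(v) for k, v in value.items()}
    if isinstance(value, list):
        return [strip_surrogates(v) for v in value]
    return value  # numbers, bools, None pass through untouched

clean = strip_surrogates({"messages": [{"content": "ok\ud83cbad"}]})
clean["messages"][0]["content"].encode("utf-8")  # no longer raises
```

Checking ord(ch) directly avoids the round-trip through encode/decode that errors='ignore' relies on.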
Kian Jones
2026-02-05 16:00:36 -08:00
committed by Caren Thomas
parent 93249b96f5
commit 6f746c5225
12 changed files with 98 additions and 36 deletions


@@ -27,6 +27,7 @@ from letta.errors import (
     LLMTimeoutError,
     LLMUnprocessableEntityError,
 )
+from letta.helpers.json_helpers import sanitize_unicode_surrogates
 from letta.llm_api.error_utils import is_context_window_overflow_message
 from letta.llm_api.helpers import (
     add_inner_thoughts_to_functions,
@@ -587,6 +588,9 @@ class OpenAIClient(LLMClientBase):
         """
         Performs underlying synchronous request to OpenAI API and returns raw response dict.
         """
+        # Sanitize Unicode surrogates to prevent encoding errors
+        request_data = sanitize_unicode_surrogates(request_data)
+
         client = OpenAI(**self._prepare_client_kwargs(llm_config))
         # Route based on payload shape: Responses uses 'input', Chat Completions uses 'messages'
         if "input" in request_data and "messages" not in request_data:
@@ -601,6 +605,9 @@ class OpenAIClient(LLMClientBase):
         """
         Performs underlying asynchronous request to OpenAI API and returns raw response dict.
         """
+        # Sanitize Unicode surrogates to prevent encoding errors
+        request_data = sanitize_unicode_surrogates(request_data)
+
         kwargs = await self._prepare_client_kwargs_async(llm_config)
         client = AsyncOpenAI(**kwargs)
         # Route based on payload shape: Responses uses 'input', Chat Completions uses 'messages'
@@ -805,6 +812,9 @@ class OpenAIClient(LLMClientBase):
         """
         Performs underlying asynchronous streaming request to OpenAI and returns the async stream iterator.
         """
+        # Sanitize Unicode surrogates to prevent encoding errors
+        request_data = sanitize_unicode_surrogates(request_data)
+
         kwargs = await self._prepare_client_kwargs_async(llm_config)
         client = AsyncOpenAI(**kwargs)
@@ -836,6 +846,9 @@ class OpenAIClient(LLMClientBase):
         """
         Performs underlying asynchronous streaming request to OpenAI and returns the async stream iterator.
         """
+        # Sanitize Unicode surrogates to prevent encoding errors
+        request_data = sanitize_unicode_surrogates(request_data)
+
         kwargs = await self._prepare_client_kwargs_async(llm_config)
         client = AsyncOpenAI(**kwargs)
         response_stream: AsyncStream[ResponseStreamEvent] = await client.responses.create(**request_data, stream=True)