Files
letta-server/letta/llm_api
jnjpng 778f28ccf3 fix: handle transient network errors in ChatGPT OAuth client (#9462)
- Map httpx.ReadError/WriteError/ConnectError to LLMConnectionError in
  handle_llm_error so Temporal correctly classifies them as retryable
  (previously fell through to generic non-retryable LLMError)
- Add client-level retry with exponential backoff (up to 3 attempts) on
  request_async and stream_async for transient transport errors
- Stream retry is guarded by a has_yielded flag to avoid corrupting
  partial responses already consumed by the caller
2026-02-24 10:52:07 -08:00
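The three bullets above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual Letta implementation: the exception classes stand in for `httpx.ReadError`/`httpx.ConnectError` and Letta's `LLMError`/`LLMConnectionError`, and the backoff delays and 3-attempt limit are assumptions based on the commit message.

```python
import asyncio

# Stand-ins for httpx transport errors (hypothetical minimal classes
# so this sketch runs without httpx or letta installed).
class ReadError(Exception): ...
class ConnectError(Exception): ...

class LLMError(Exception):
    """Generic error; classified as non-retryable."""

class LLMConnectionError(LLMError):
    """Transient network failure; classified as retryable."""

def handle_llm_error(exc: Exception) -> LLMError:
    # Map transient transport errors to the retryable class first,
    # so they no longer fall through to the generic LLMError.
    if isinstance(exc, (ReadError, ConnectError)):
        return LLMConnectionError(str(exc))
    return LLMError(str(exc))

async def request_async(send, max_attempts: int = 3):
    # Client-level retry with exponential backoff on transient errors.
    for attempt in range(max_attempts):
        try:
            return await send()
        except (ReadError, ConnectError):
            if attempt == max_attempts - 1:
                raise  # exhausted; surface the error to handle_llm_error
            await asyncio.sleep((2 ** attempt) * 0.1)

async def stream_async(open_stream, max_attempts: int = 3):
    # Only retry while nothing has been yielded; once the caller has
    # consumed partial output, a retry would corrupt the response.
    for attempt in range(max_attempts):
        has_yielded = False
        try:
            async for chunk in open_stream():
                has_yielded = True
                yield chunk
            return
        except (ReadError, ConnectError):
            if has_yielded or attempt == max_attempts - 1:
                raise
            await asyncio.sleep((2 ** attempt) * 0.1)
```

The `has_yielded` guard is the key design point: a non-streaming request is idempotent from the caller's perspective until it returns, but a stream that has already emitted chunks cannot be transparently restarted.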