Commit Graph

25 Commits

Author SHA1 Message Date
Kian Jones
3634464251 fix(core): handle anyio.BrokenResourceError for client disconnects (#9358)
Catch BrokenResourceError alongside ClosedResourceError in streaming
response, logging middleware, and app exception handlers so client
disconnects are logged at info level instead of surfacing as 500s.

Datadog: https://us5.datadoghq.com/error-tracking/issue/4f57af0c-d558-11f0-a65d-da7ad0900000
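A minimal sketch of the pattern this commit describes, with placeholder wrapper and logger names rather than the actual Letta modules:

```python
import logging

import anyio

logger = logging.getLogger(__name__)


async def safe_stream(generator):
    """Hypothetical streaming wrapper: treat a vanished client as an info-level event."""
    try:
        async for chunk in generator:
            yield chunk
    except (anyio.ClosedResourceError, anyio.BrokenResourceError):
        # the client disconnected mid-stream; log and stop instead of
        # letting the error bubble up as an unhandled 500
        logger.info("client disconnected during streaming response")
```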

🤖 Generated with [Letta Code](https://letta.com)

Co-authored-by: Letta <noreply@letta.com>
2026-02-24 10:52:07 -08:00
cthomas
c162de5127 fix: use shared event + .athrow() to properly set stream_was_cancelle… (#9019)
fix: use shared event + .athrow() to properly set stream_was_cancelled flag

**Problem:**
When a run is cancelled via the /cancel endpoint, `stream_was_cancelled` remained
False because `RunCancelledException` was raised in the consumer code (the wrapper),
which closes the generator from outside. This causes Python to skip the
generator's except blocks and jump directly to finally with the wrong flag value.

**Solution:**
1. Shared `asyncio.Event` registry for cross-layer cancellation signaling
2. `cancellation_aware_stream_wrapper` sets the event when cancellation detected
3. Wrapper uses `.athrow()` to inject exception INTO generator (not consumer-side raise)
4. All streaming interfaces check event in `finally` block to set flag correctly
5. `streaming_service.py` handles `RunCancelledException` gracefully, yields [DONE]
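A compressed, self-contained sketch of why throwing the exception into the generator matters (names below are illustrative, not Letta's actual modules): raising in the consumer never runs the generator's `except` block, while `.athrow()` does, and the shared event gives `finally` a second chance to observe the cancellation.

```python
import asyncio


class RunCancelledException(Exception):
    """Stand-in for the real cancellation exception type."""


async def agent_stream(cancelled_event: asyncio.Event):
    stream_was_cancelled = False
    try:
        for i in range(100):
            await asyncio.sleep(0.01)
            yield f"chunk-{i}"
    except RunCancelledException:
        # runs only because the exception is thrown INTO the generator
        stream_was_cancelled = True
        yield "[DONE]"
    finally:
        # belt-and-braces: finally can also observe the shared event
        stream_was_cancelled = stream_was_cancelled or cancelled_event.is_set()
        print("stream_was_cancelled =", stream_was_cancelled)


async def consumer():
    cancelled = asyncio.Event()
    stream = agent_stream(cancelled)
    print(await anext(stream))            # one normal chunk
    cancelled.set()                       # what the /cancel endpoint would flip
    # inject the exception into the generator instead of raising it out here,
    # so the generator's except/finally actually see the cancellation
    print(await stream.athrow(RunCancelledException()))
    await stream.aclose()


asyncio.run(consumer())
```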

**Changes:**
- streaming_response.py: Event registry + .athrow() injection + graceful handling
- openai_streaming_interface.py: 3 classes check event in finally
- gemini_streaming_interface.py: Check event in finally
- anthropic_*.py: Catch RunCancelledException
- simple_llm_stream_adapter.py: Create & pass event to interfaces
- streaming_service.py: Handle RunCancelledException, yield [DONE], skip double-update
- routers/v1/{conversations,runs}.py: Pass event to wrapper
- integration_test_human_in_the_loop.py: New test for approval + cancellation

**Tests:**
- test_tool_call with cancellation (OpenAI models) 
- test_approve_with_cancellation (approval flow + concurrent cancel) 

**Known cosmetic warnings (pre-existing):**
- "Run already in terminal state" - agent loop tries to update after /cancel
- "Stream ended without terminal event" - background streaming timing race

👾 Generated with [Letta Code](https://letta.com)

Co-authored-by: Letta <noreply@letta.com>
2026-01-29 12:44:04 -08:00
cthomas
6599aa3b44 feat: populate seq_id for ping messages (#8844)
* feat: populate seq_id for ping messages

* fix import
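As a rough illustration only (the field names are assumptions, not the actual Letta stream schema), giving pings the same monotonically increasing `seq_id` as other events lets clients order and resume streams without special-casing keepalives:

```python
import itertools
import json

# hypothetical per-stream sequence counter shared by all outgoing events
seq = itertools.count(1)


def ping_event() -> str:
    # keepalive pings now carry a seq_id like every other chunk
    return json.dumps({"type": "ping", "seq_id": next(seq)})
```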
2026-01-19 15:54:43 -08:00
Matthew Zhou
dbad510a6e fix: Bound async queue during streaming (#5976)
Add maxsize = 1
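In plain asyncio terms, bounding the hand-off queue means a slow or vanished consumer back-pressures the producer instead of letting chunks accumulate in memory; a minimal sketch, assuming a queue-bridged stream like the one described:

```python
import asyncio


async def producer(queue: asyncio.Queue):
    for i in range(5):
        # with maxsize=1 this await blocks until the consumer has taken the
        # previous chunk, so at most one chunk is buffered at a time
        await queue.put(f"chunk-{i}")
    await queue.put(None)  # sentinel: end of stream


async def consumer(queue: asyncio.Queue):
    while (chunk := await queue.get()) is not None:
        print(chunk)


async def main():
    queue = asyncio.Queue(maxsize=1)
    await asyncio.gather(producer(queue), consumer(queue))


asyncio.run(main())
```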
2025-11-13 15:36:55 -08:00
Charles Packer
a6077f3927 fix(core): Fix agent loop continuing after cancellation in letta_agent_v3 [LET-6006] (#5905)
* Fix agent loop continuing after cancellation in letta_agent_v3

Bug: When a run is cancelled, _check_run_cancellation() sets
self.should_continue=False and returns early from _step(), but the outer
for loop (line 245) continues to the next iteration, executing subsequent
steps even though cancellation was requested.

Symptom: User hits cancel during step 1, backend marks run as cancelled,
but agent continues executing steps 2, 3, etc.

Root cause: After the 'async for chunk in response' loop completes (line 255),
there was no check of self.should_continue before continuing to the next
iteration of the outer step loop.

Fix: Added 'if not self.should_continue: break' check after the inner loop
to exit the outer step loop when cancellation is detected. This makes v3
consistent with v2, which already had this check (lines 306-307).
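Schematically (class, method, and attribute names below are simplified stand-ins for letta_agent_v3, not the real code), the fix adds one check after the inner chunk loop:

```python
import asyncio


class AgentLoopSketch:
    """Simplified stand-in for the v3 step loop; real names and logic differ."""

    def __init__(self) -> None:
        self.should_continue = True

    async def _step(self):
        # one agent step that streams chunks and may detect cancellation
        for i in range(3):
            yield f"chunk-{i}"
        self.should_continue = False  # pretend _check_run_cancellation() fired

    async def run(self, max_steps: int = 10) -> None:
        for _ in range(max_steps):
            async for chunk in self._step():
                print(chunk)
            if not self.should_continue:  # the added check: stop stepping once cancelled
                break


asyncio.run(AgentLoopSketch().run())
```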

🐾 Generated with [Letta Code](https://letta.com)

Co-authored-by: Letta <noreply@letta.com>

* add integration tests

* fix: misc fixes required to get cancellations to work on letta code localhost

---------

Co-authored-by: Letta <noreply@letta.com>
Co-authored-by: Sarah Wooders <sarahwooders@gmail.com>
2025-11-13 15:36:39 -08:00
Ari Webb
9e94c344b8 using uuid and datetime [LET-5508] (#5430)
* using uuid and datetime

* add run_id

---------

Co-authored-by: Ari Webb <ari@letta.com>
2025-10-24 15:12:11 -07:00
Sarah Wooders
354205f581 feat: create new runs table [LET-4467] (#4841) 2025-10-07 17:50:47 -07:00
Kian Jones
b8e9a80d93 merge this (#4759)
* wait I forgot to commit locally

* cp the entire core directory and then rm the .git subdir
2025-09-17 15:47:40 -07:00
Kian Jones
22f70ca07c chore: officially migrate to submodule (#4502)
* remove apps/core and apps/fern

* fix precommit

* add submodule updates in workflows

* submodule

* remove core tests

* update core revision

* Add submodules: true to all GitHub workflows

- Ensure all workflows can access git submodules
- Add submodules support to deployment, test, and CI workflows
- Fix YAML syntax issues in workflow files

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>

* remove core-lint

* upgrade core with latest main of oss

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-09 12:45:53 -07:00
Charles Packer
0d195bd2b7 fix(core): patch the error throwing for HITL [LET-4218] (#4455)
fix: patch the error throwing for HITL
2025-09-06 11:45:46 -07:00
Charles Packer
71a5eaa262 fix(core): change the backend mid-stream error packing to match what the FE expects [PRO-1107] (#4340)
fix(core): change the backend mid-stream error packing to match what the FE expects
2025-09-01 14:59:42 -07:00
Kian Jones
fecf6decfb chore: migrate to ruff (#4305)
* base requirements

* autofix

* Configure ruff for Python linting and formatting

- Set up minimal ruff configuration with basic checks (E, W, F, I)
- Add temporary ignores for common issues during migration
- Configure pre-commit hooks to use ruff with pass_filenames
- This enables gradual migration from black to ruff

* Delete sdj

* autofixed only

* migrate lint action

* more autofixed

* more fixes

* change precommit

* try changing the hook

* try this stuff
2025-08-29 11:11:19 -07:00
cthomas
0d1282a09b feat: don't swallow application errors in streaming response [LET-4069] (#4253)
* feat: don't swallow application errors in streaming response

* change error back to exception
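A hedged sketch of the distinction (exception classes other than the anyio disconnect errors are placeholders): disconnects stay an expected, info-level event, while genuine application errors are re-raised so the stream wrapper no longer swallows them:

```python
import logging

import anyio

logger = logging.getLogger(__name__)


async def stream_wrapper(generator):
    try:
        async for chunk in generator:
            yield chunk
    except (anyio.ClosedResourceError, anyio.BrokenResourceError):
        # expected when the client goes away; not an application failure
        logger.info("client disconnected during streaming")
    except Exception:
        # real application errors are logged and re-raised, not swallowed,
        # so error tracking and the caller still see them
        logger.exception("application error during streaming")
        raise
```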
2025-08-27 11:59:43 -07:00
cthomas
c8771d6b80 feat: catch asyncio cancellations from app in stream response (#4112) 2025-08-22 15:42:14 -07:00
cthomas
b2a68a467c feat: add resource closed errors throughout stream (#4021) 2025-08-19 16:02:04 -07:00
cthomas
b1f1d0a5bf feat: only stream last chunk if client is connected (#4015) 2025-08-19 14:46:30 -07:00
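Sketched in Starlette terms, on the assumption that the connection check is something like `request.is_disconnected()` (the actual mechanism in Letta may differ):

```python
from starlette.requests import Request


async def stream_with_final_chunk(request: Request, chunks, final_chunk: str):
    async for chunk in chunks:
        yield chunk
    # only emit the trailing chunk if the client is still listening
    if not await request.is_disconnected():
        yield final_chunk
```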
cthomas
9ec8473404 feat: catch closed resource error in stream processing (#4003) 2025-08-19 12:11:00 -07:00
cthomas
eb472dc1e0 feat: introduce asyncio shield to stream response (#3992) 2025-08-18 17:11:19 -07:00
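For the shield commit, the general pattern (what exactly Letta shields is not shown in this log) is to protect work that must complete even when the surrounding request task is cancelled:

```python
import asyncio


async def persist_final_state() -> None:
    # stand-in for work that should survive a cancelled request,
    # e.g. recording the run's final status
    await asyncio.sleep(0.1)
    print("final state persisted")


async def handle_stream_request() -> None:
    try:
        await asyncio.sleep(10)  # pretend to stream chunks; cancelled from outside
    finally:
        # shield the cleanup so a further cancellation does not abort it mid-write
        await asyncio.shield(persist_final_state())


async def main() -> None:
    task = asyncio.create_task(handle_stream_request())
    await asyncio.sleep(0.05)
    task.cancel()  # simulate the client/app cancelling the request
    try:
        await task
    except asyncio.CancelledError:
        print("request task ended as cancelled, but cleanup still ran")


asyncio.run(main())
```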
Sarah Wooders
685bfece6d feat: treat cancellations that are non-explicit as errors (#3641) 2025-07-29 23:23:33 -07:00
Charles Packer
7348630486 feat: add keepalive wrapper to stream route (#3645)
Co-authored-by: Caren Thomas <carenthomas@gmail.com>
2025-07-29 22:53:38 -07:00
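The keepalive idea, sketched generically (the interval and ping payload are assumptions, not Letta's actual values): if the upstream stream stays quiet for too long, emit a ping so proxies and clients do not drop the connection.

```python
import asyncio


async def with_keepalive(stream, interval: float = 5.0, ping: str = ": ping\n\n"):
    """Yield chunks from `stream`, inserting a ping whenever it stays quiet too long.

    Sketch only: cleanup of a still-pending fetch on early generator close is omitted.
    """
    iterator = stream.__aiter__()
    while True:
        next_chunk = asyncio.ensure_future(iterator.__anext__())
        while True:
            try:
                chunk = await asyncio.wait_for(asyncio.shield(next_chunk), timeout=interval)
            except asyncio.TimeoutError:
                yield ping  # upstream is quiet; keep the connection warm
                continue
            except StopAsyncIteration:
                return
            yield chunk
            break
```

In an SSE route, a wrapper like this would sit between the model stream and the streaming response object.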
Andy Li
a3bb0b5fdf feat: ade support for showing errored messages in ade 2025-07-21 13:03:20 -07:00
Andy Li
60c4e23904 feat: stop reasons and error messages and sentry fixes 2025-07-18 11:56:20 -07:00
Andy Li
33c1f26ab6 feat: support for agent loop job cancelation (#2837) 2025-07-02 14:31:16 -07:00
Matthew Zhou
26ae9c4502 feat: Add tavily search builtin tool (#2257) 2025-05-19 16:38:11 -07:00
Andy Li
a78abc610e feat: track llm provider traces and tracking steps in async agent loop (#2219) 2025-05-19 15:50:56 -07:00