734680db81 | 2025-06-23 16:55:23 -07:00 | Andy Li | feat: timeout configuration for LLM clients + vertex (#2972)
65530e8380 | 2025-06-16 15:14:40 -07:00 | Kevin Lin | fix: add exceptions to accept_developer_role (#2848)
93c15244ab | 2025-06-13 14:54:37 -07:00 | Kevin Lin | feat: add reasoning models to integration_test_send_message (#2710)
c1255dc9d1 | 2025-06-11 11:36:45 -07:00 | cthomas | feat: make tool calls required for model proxy (#2756)
5ecd8a706c | 2025-06-10 14:27:01 -07:00 | cthomas | fix: parallel tool calling OpenAI (#2738)
b332ebfa85 | 2025-06-10 13:36:17 -07:00 | cthomas | feat: support multi content part input (#2717)
b53be62e7a | 2025-06-10 13:27:00 -07:00 | Matthew Zhou | fix: Turn parallel tool calling off for OpenAI (#2737)
20e6732f36 | 2025-06-08 18:28:01 -07:00 | cthomas | feat: add multi-modal input support (#2590)
e4da78fce7 | 2025-06-06 13:13:32 -07:00 | cthomas | fix: gracefully handle too long responses from llm provider (#2677)
d2252f2953 | 2025-06-05 17:20:14 -07:00 | Andy Li | feat: otel metrics and expanded collecting (#2647)
82b3222a52 | 2025-06-04 12:57:51 -07:00 | Matthew Zhou | fix: Make OpenAI context window exceeded error more specific (#2624)
87f4bcad9a | 2025-05-29 11:10:13 -07:00 | Matthew Zhou | feat: Add summarization for more scenarios (#2499)
3354f5fe50 | 2025-05-28 11:35:22 -07:00 | Sarah Wooders | feat: concurrently make embedding request and use async client for OpenAI (#2482)
    Co-authored-by: Matthew Zhou <mattzh1314@gmail.com>
4cc075f1fc | 2025-05-27 16:56:30 -07:00 | Sarah Wooders | feat: add more trace methods (#2471)
e813a65351 | 2025-05-25 19:47:20 -07:00 | cthomas | feat(asyncify): byok in async loop (#2421)
2199d8fdda | 2025-05-24 21:34:18 -07:00 | Shangyin Tan | fix: do not pass temperature to request if model is oai reasoning model (#2189)
    Co-authored-by: Charles Packer <packercharles@gmail.com>
b554171d41 | 2025-05-22 13:55:32 -07:00 | cthomas | feat: add tracing to llm clients (#2340)
095a14cd1d | 2025-05-20 18:39:27 -07:00 | cthomas | ci: use experimental for send message tests (#2290)
    Co-authored-by: Sarah Wooders <sarahwooders@gmail.com>
26ae9c4502 | 2025-05-19 16:38:11 -07:00 | Matthew Zhou | feat: Add tavily search builtin tool (#2257)
a78abc610e | 2025-05-19 15:50:56 -07:00 | Andy Li | feat: track llm provider traces and tracking steps in async agent loop (#2219)
65f8db2efd | 2025-05-17 19:17:08 -07:00 | Sarah Wooders | feat: support together in new agent loop and add tests (#2231)
9714a0ace4 | 2025-05-09 17:46:35 -07:00 | Kevin Lin | fix: use auto function calling for together models (#2097)
fce28c73e3 | 2025-05-09 10:50:55 -07:00 | Charles Packer | fix: make togetherai nebius xai etc usable via the openaiprovider (#1981)
    Co-authored-by: Kevin Lin <klin5061@gmail.com>
    Co-authored-by: Kevin Lin <kl2806@columbia.edu>
db6982a4bc | 2025-05-06 17:31:36 -07:00 | cthomas | feat: add provider_category field to distinguish byok (#2038)
326bbc5a04 | 2025-05-02 14:54:25 -07:00 | Charles Packer | fix: patch o1 support (#1978)
c4f603d7b6 | 2025-04-30 23:23:01 -07:00 | cthomas | feat: always add user id to openai requests (#1969)
18db9b9509 | 2025-04-30 21:26:50 -07:00 | cthomas | feat: byok 2.0 (#1963)
4016201087 | 2025-04-30 15:20:54 -07:00 | cthomas | feat: use new model-proxy in production (#1908)
6609372676 | 2025-04-27 12:57:06 -07:00 | cthomas | feat: add letta-free endpoint constant (#1907)
ce2e8f5c4d | 2025-04-23 16:37:05 -07:00 | cthomas | feat: add llm config per request (#1866)
9f12d71916 | 2025-04-23 13:41:34 -07:00 | Charles Packer | fix: patch o-series (#1699)
63395514cb | 2025-04-11 21:59:48 -07:00 | Sarah Wooders | feat: translate system to developer or o-series models (#1692)
    Co-authored-by: cpacker <packercharles@gmail.com>
74e299a05f | 2025-04-09 16:31:20 -07:00 | Matthew Zhou | fix: Fix build request data for OpenAI (#1654)
b4e19f9a70 | 2025-04-08 21:10:48 -07:00 | Sarah Wooders | fix: patch summarizer for google and use new client (#1639)
3797b0d536 | 2025-04-02 14:26:27 -07:00 | Matthew Zhou | feat: Simplify arguments for LLM clients (#1536)
4fe496f3f3 | 2025-03-31 13:08:59 -07:00 | Matthew Zhou | feat: New openai client (#1460)