Files
letta-server/letta/llm_api/zai_client.py
amysguan 612a2ae98b Fix: Change Z.ai context window to account for max_token subtraction (#9710)
Fix the Z.ai context window (effectively [advertised context window] - [max output tokens]) and pass max_tokens explicitly so Z.ai doesn't default to 65k for GLM-5
2026-03-03 18:34:02 -08:00

8.4 KiB
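
The fix described in the commit message can be sketched as follows. This is a minimal, hypothetical illustration, not the actual contents of `zai_client.py`: the function names, constants, and request shape are assumptions; only the arithmetic (advertised window minus max output tokens) and the idea of passing `max_tokens` explicitly come from the commit message.

```python
# Hypothetical sketch of the commit's logic; names and values are
# illustrative, not taken from letta's zai_client.py.
ADVERTISED_CONTEXT_WINDOW = 128_000  # assumed advertised window
MAX_OUTPUT_TOKENS = 8_192            # assumed max output tokens


def effective_context_window(advertised: int, max_output: int) -> int:
    """The usable input context is the advertised window minus the
    output-token budget reserved for the model's reply."""
    return advertised - max_output


def build_request(messages: list, max_output: int = MAX_OUTPUT_TOKENS) -> dict:
    # Pass max_tokens explicitly so the provider does not fall back to
    # its own default (65k, per the commit message).
    return {"messages": messages, "max_tokens": max_output}
```

For example, with a 128k advertised window and an 8,192-token output budget, the usable input context would be 119,808 tokens.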