letta-server/tests/configs/llm_model_configs/gemini-2.5-flash.json
cthomas 8b617c9e0d fix: gemini flash integration test [LET-4060] (#4242)
* fix: gemini flash integration test

* also update google flash

* catch error in test

* revert test changes

* do try catch again

* remove try catch from streaming tests

* add try catch for summarize test also
2025-08-27 11:59:15 -07:00


{
  "context_window": 2097152,
  "model": "gemini-2.5-flash",
  "model_endpoint_type": "google_ai",
  "model_endpoint": "https://generativelanguage.googleapis.com",
  "model_wrapper": null,
  "put_inner_thoughts_in_kwargs": true,
  "enable_reasoner": true,
  "max_reasoning_tokens": 1
}
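As a sketch of how a config file like this might be consumed, the snippet below parses the JSON and applies a few basic sanity checks. The loader function and its validation rules are illustrative assumptions for this example, not Letta's actual config-loading API; the field names mirror the JSON above.

```python
import json

# The config above, inlined so the example is self-contained.
RAW = """
{
  "context_window": 2097152,
  "model": "gemini-2.5-flash",
  "model_endpoint_type": "google_ai",
  "model_endpoint": "https://generativelanguage.googleapis.com",
  "model_wrapper": null,
  "put_inner_thoughts_in_kwargs": true,
  "enable_reasoner": true,
  "max_reasoning_tokens": 1
}
"""

def load_llm_config(raw: str) -> dict:
    """Parse an LLM model config and run basic sanity checks.

    Hypothetical helper: the checks are assumptions for illustration,
    not Letta's real validation logic.
    """
    cfg = json.loads(raw)
    required = {"context_window", "model", "model_endpoint_type", "model_endpoint"}
    missing = required - cfg.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    if cfg["context_window"] <= 0:
        raise ValueError("context_window must be positive")
    # If the reasoner is enabled, require a token budget to be set. A budget
    # of 1, as here, effectively minimizes reasoning output (useful in tests).
    if cfg.get("enable_reasoner") and cfg.get("max_reasoning_tokens", 0) < 1:
        raise ValueError("enable_reasoner requires max_reasoning_tokens >= 1")
    return cfg

cfg = load_llm_config(RAW)
print(cfg["model"])  # gemini-2.5-flash
```

Validating at load time surfaces a misconfigured endpoint or budget immediately, rather than as a confusing failure deep inside an integration test.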