* feat: add MiniMax provider support

  Add MiniMax as a new LLM provider using their Anthropic-compatible API.

  Key implementation details:
  - Uses the standard messages API (not beta)
  - MiniMax supports thinking blocks natively
  - Base URL: https://api.minimax.io/anthropic
  - Models: MiniMax-M2.1, MiniMax-M2.1-lightning, MiniMax-M2 (all 200K context, 128K output)
  - Temperature clamped to the valid range (0.0, 1.0]
  - All M2.x models treated as reasoning models (support interleaved thinking)

  Files added:
  - letta/schemas/providers/minimax.py - MiniMax provider schema
  - letta/llm_api/minimax_client.py - Client extending AnthropicClient
  - tests/test_minimax_client.py - Unit tests (13 tests)
  - tests/model_settings/minimax-m2.1.json - Integration test config

  🐾 Generated with [Letta Code](https://letta.com)

  Co-Authored-By: Letta <noreply@letta.com>

* chore: regenerate API spec with MiniMax provider

  🐾 Generated with [Letta Code](https://letta.com)

  Co-Authored-By: Letta <noreply@letta.com>

* chore: use MiniMax-M2.1-lightning for CI tests

  Switch to the faster, cheaper lightning model variant for integration tests.

  🐾 Generated with [Letta Code](https://letta.com)

  Co-Authored-By: Letta <noreply@letta.com>

* chore: add MINIMAX_API_KEY to deploy-core command

  Co-authored-by: Sarah Wooders <sarahwooders@users.noreply.github.com>

* chore: regenerate web openapi spec with MiniMax provider

  Co-authored-by: Sarah Wooders <sarahwooders@users.noreply.github.com>

  🐾 Generated with [Letta Code](https://letta.com)

---------

Co-authored-by: Letta <noreply@letta.com>
Co-authored-by: letta-code <248085862+letta-code@users.noreply.github.com>
Co-authored-by: Sarah Wooders <sarahwooders@users.noreply.github.com>
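The temperature-clamping behavior described above can be sketched as follows. This is an illustrative snippet, not the actual letta implementation: the function name `clamp_temperature` and the 0.01 floor chosen for the open lower bound are assumptions; only the (0.0, 1.0] range comes from the commit message.

```python
def clamp_temperature(temperature: float) -> float:
    """Clamp a requested temperature into MiniMax's accepted (0.0, 1.0] range.

    Illustrative sketch: MiniMax accepts temperatures in the half-open
    interval (0.0, 1.0], so 0.0 itself is invalid. The 0.01 floor used
    here for out-of-range low values is an assumption.
    """
    if temperature <= 0.0:
        return 0.01  # assumed smallest value above the open lower bound
    return min(temperature, 1.0)  # cap anything above the upper bound
```

For example, a request with `temperature=0.0` would be sent as 0.01, and `temperature=1.5` would be capped to 1.0.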
```json
{
  "handle": "minimax/MiniMax-M2.1-lightning",
  "model_settings": {
    "provider_type": "minimax",
    "temperature": 1.0,
    "max_output_tokens": 4096,
    "parallel_tool_calls": false
  }
}
```
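As a quick sanity check, the integration-test config above can be parsed and inspected with the standard library. The field names come directly from the JSON snippet; the loading code itself is just an illustrative sketch, not how letta's test harness consumes the file.

```python
import json

# Parse the model-settings JSON shown above (inlined here for self-containment).
config = json.loads("""
{
  "handle": "minimax/MiniMax-M2.1-lightning",
  "model_settings": {
    "provider_type": "minimax",
    "temperature": 1.0,
    "max_output_tokens": 4096,
    "parallel_tool_calls": false
  }
}
""")

# The handle encodes "<provider>/<model>"; the settings select the MiniMax provider.
provider, model = config["handle"].split("/", 1)
settings = config["model_settings"]
```

Note that `temperature` is set to 1.0, the top of MiniMax's valid (0.0, 1.0] range, and `parallel_tool_calls` is disabled for the CI runs.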