fix: add Claude Haiku 4.5 model to MODEL_LIST and add defensive prefix stripping (#8908)
- Added `claude-haiku-4-5-20251001` and `claude-haiku-4-5-latest` to MODEL_LIST in anthropic.py to fix context window lookup for the newly released model
- Added prefix stripping in anthropic_client.py to handle cases where the model name incorrectly includes the `anthropic/` provider prefix

Fixes the production error:

    anthropic.NotFoundError: Error code: 404 - model: anthropic/claude-haiku-4-5-20251001

Fixes #8907

🤖 Generated with [Letta Code](https://letta.com)

Co-authored-by: letta-code <248085862+letta-code@users.noreply.github.com>
Co-authored-by: datadog-official[bot] <datadog-official[bot]@users.noreply.github.com>
Co-authored-by: Kian Jones <11655409+kianjones9@users.noreply.github.com>
Committed by: Sarah Wooders
Parent: 2ee28c3264
Commit: cb2db18b1f
anthropic_client.py:

```diff
@@ -447,8 +447,14 @@ class AnthropicClient(LLMClientBase):
         else:
             max_output_tokens = llm_config.max_tokens
 
+        # Strip provider prefix from model name if present (e.g., "anthropic/claude-..." -> "claude-...")
+        # This handles cases where the handle format was incorrectly passed as the model name
+        model_name = llm_config.model
+        if "/" in model_name:
+            model_name = model_name.split("/", 1)[-1]
+
         data = {
-            "model": llm_config.model,
+            "model": model_name,
             "max_tokens": max_output_tokens,
             "temperature": llm_config.temperature,
         }
```
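The defensive stripping above can be sketched as a standalone function. This is a minimal illustration of the same `split("/", 1)` logic, not code from the patch; the helper name `strip_provider_prefix` is mine:

```python
def strip_provider_prefix(model: str) -> str:
    """Return the model name with any leading provider prefix removed.

    Mirrors the diff's logic: split on the FIRST "/" only, so a handle
    like "anthropic/claude-haiku-4-5-20251001" loses just the provider
    segment, while a name with no "/" passes through unchanged.
    """
    if "/" in model:
        return model.split("/", 1)[-1]
    return model


print(strip_provider_prefix("anthropic/claude-haiku-4-5-20251001"))
# -> claude-haiku-4-5-20251001
print(strip_provider_prefix("claude-haiku-4-5-latest"))
# -> claude-haiku-4-5-latest (unchanged)
```

Splitting with `maxsplit=1` and taking the last element means only the provider segment is dropped, which keeps the fix narrowly scoped to the `anthropic/...` handle format described in the commit message.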
anthropic.py:

```diff
@@ -93,6 +93,16 @@ MODEL_LIST = [
         "name": "claude-3-5-haiku-latest",
         "context_window": 200000,
     },
+    # 4.5
+    {
+        "name": "claude-haiku-4-5-20251001",
+        "context_window": 200000,
+    },
+    # 4.5 latest
+    {
+        "name": "claude-haiku-4-5-latest",
+        "context_window": 200000,
+    },
     ## Opus 4.5
     {
         "name": "claude-opus-4-5-20251101",
```
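Per the commit message, the new MODEL_LIST entries exist so the context window lookup succeeds for the released model. A minimal sketch of that lookup pattern, assuming a list of dicts as in the diff; the function name `get_context_window` and the fallback default are mine, not taken from the source:

```python
# Excerpt of MODEL_LIST matching the entries visible in the diff.
MODEL_LIST = [
    {"name": "claude-3-5-haiku-latest", "context_window": 200000},
    {"name": "claude-haiku-4-5-20251001", "context_window": 200000},
    {"name": "claude-haiku-4-5-latest", "context_window": 200000},
]


def get_context_window(model_name: str, default: int = 8192) -> int:
    """Hypothetical lookup helper: scan MODEL_LIST for a matching name.

    Before this commit, "claude-haiku-4-5-20251001" had no entry, so a
    lookup like this would fall through to the default.
    """
    for entry in MODEL_LIST:
        if entry["name"] == model_name:
            return entry["context_window"]
    return default


print(get_context_window("claude-haiku-4-5-20251001"))  # -> 200000
```

With the two new entries in place, both the dated and `-latest` Haiku 4.5 handles resolve to the 200k context window instead of the fallback.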