fix(core): add default value for OllamaProvider.default_prompt_formatter (#8911)
When Ollama providers are synced to DB via sync_base_providers(), the
default_prompt_formatter field is lost because ProviderCreate doesn't
include it. When loading from DB and calling cast_to_subtype(), Pydantic
validation fails because the field is required.
This was a latent bug exposed when provider models persistence was
re-enabled in 0.16.2. The field was always required but never persisted.
Adding a default value ("chatml") fixes the issue. The field isn't
actually used in the current implementation: the model_wrapper line
is commented out in list_llm_models_async(), since Ollama now uses
OpenAI-compatible endpoints.
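The failure mode described above can be sketched with a minimal, self-contained Pydantic example (the `ProviderRequired` / `ProviderDefaulted` model names are hypothetical stand-ins, not Letta's actual classes): a required field that was never persisted makes re-validation fail, while a default makes the same load succeed.

```python
# Minimal sketch of the bug: a required field dropped during persistence
# breaks validation on reload; a default value fixes it.
# Model names here are illustrative, not Letta's real classes.
from pydantic import BaseModel, Field, ValidationError


class ProviderRequired(BaseModel):
    # Mirrors the old definition: the field is required (`...`).
    default_prompt_formatter: str = Field(
        ..., description="Default prompt formatter (aka model wrapper)."
    )


class ProviderDefaulted(BaseModel):
    # Mirrors the fix: validation succeeds even when the field was never persisted.
    default_prompt_formatter: str = Field(
        default="chatml", description="Default prompt formatter (aka model wrapper)."
    )


# Simulate a DB row written by a create-schema that omitted the field.
row = {}

try:
    ProviderRequired.model_validate(row)
except ValidationError as e:
    print("required field missing:", e.error_count(), "error(s)")

provider = ProviderDefaulted.model_validate(row)
print(provider.default_prompt_formatter)  # "chatml"
```

This is the same pattern as the diff below: switching `Field(...)` to `Field(default="chatml")` turns a hard validation failure on reload into a silent fallback.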
Fixes: letta-ai/letta-code#587
👾 Generated with [Letta Code](https://letta.com)
Co-authored-by: Letta <noreply@letta.com>
Committed by: Sarah Wooders
Parent: b0dfdd2725
Commit: 6472834130
@@ -24,7 +24,8 @@ class OllamaProvider(OpenAIProvider):
     base_url: str = Field(..., description="Base URL for the Ollama API.")
     api_key: str | None = Field(None, description="API key for the Ollama API (default: `None`).")
     default_prompt_formatter: str = Field(
-        ..., description="Default prompt formatter (aka model wrapper) to use on a /completions style API."
+        default="chatml",
+        description="Default prompt formatter (aka model wrapper) to use on a /completions style API.",
     )
 
     @property