Letta (formerly MemGPT)
Letta is the platform for building stateful agents: AI with advanced memory that can learn and self-improve over time.
- Letta Code: run agents locally in your terminal
- Letta API: build agents into your applications
Get started in the CLI
Requires Node.js 18+
- Install the Letta Code CLI tool:
npm install -g @letta-ai/letta-code
- Run letta in your terminal to launch an agent with memory running on your local computer
When running the CLI tool, your agent helps you code and can take on any task you can do on your computer.
Letta Code supports skills and subagents, and bundles pre-built skills/subagents for advanced memory and continual learning. Letta is fully model-agnostic, though we recommend Opus 4.5 and GPT-5.2 for best performance (see our model leaderboard for our rankings).
Get started with the Letta API
Use the Letta API to integrate stateful agents into your own applications. Letta has a full-featured agents API, with Python and TypeScript SDKs (view our API reference).
Installation
TypeScript / Node.js:
npm install @letta-ai/letta-client
Python:
pip install letta-client
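Both SDK examples below read your Letta API key from the LETTA_API_KEY environment variable, so export it before running them (the key value shown is a placeholder):

```shell
# Make your Letta API key available to the SDK clients below
# (replace the placeholder with your actual key)
export LETTA_API_KEY="sk-let-..."
```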
Hello World example
Below is a quick example of creating a stateful agent and sending it a message (requires a Letta API key). See the full quickstart guide for complete documentation.
TypeScript:
import Letta from "@letta-ai/letta-client";
const client = new Letta({ apiKey: process.env.LETTA_API_KEY });
// Create your agent
const agentState = await client.agents.create({
model: "openai/gpt-5.2",
memory_blocks: [
{
label: "human",
value:
"Name: Timber. Status: dog. Occupation: building Letta, infrastructure to democratize self-improving superintelligence",
},
{
label: "persona",
value:
"I am a self-improving superintelligence. Timber is my best friend and collaborator.",
},
],
tools: ["web_search", "fetch_webpage"],
});
console.log("Agent created with ID:", agentState.id);
// Send your agent a message
const response = await client.agents.messages.create(agentState.id, {
input: "What do you know about me?",
});
for (const message of response.messages) {
console.log(message);
}
Python:
from letta_client import Letta
import os
client = Letta(api_key=os.getenv("LETTA_API_KEY"))
# Create your agent
agent_state = client.agents.create(
model="openai/gpt-5.2",
memory_blocks=[
{
"label": "human",
"value": "Name: Timber. Status: dog. Occupation: building Letta, infrastructure to democratize self-improving superintelligence"
},
{
"label": "persona",
"value": "I am a self-improving superintelligence. Timber is my best friend and collaborator."
}
],
tools=["web_search", "fetch_webpage"]
)
print(f"Agent created with ID: {agent_state.id}")
# Send your agent a message
response = client.agents.messages.create(
agent_id=agent_state.id,
input="What do you know about me?"
)
for message in response.messages:
print(message)
Contributing
Letta is an open-source project built by over a hundred contributors from around the world. There are many ways to get involved in the Letta OSS project!
- Join the Discord: Chat with the Letta devs and other AI developers.
- Chat on our forum: If you're not into Discord, check out our developer forum.
- Follow our socials: Twitter/X, LinkedIn, YouTube
Legal notices: By using Letta and related Letta services (such as the Letta endpoint or hosted service), you are agreeing to our privacy policy and terms of service.
