migrate fern

This commit is contained in:
Kian Jones
2025-09-09 09:31:59 -07:00
parent 0c5f5dadb8
commit 1881fcc89d
339 changed files with 50713 additions and 0 deletions


@@ -0,0 +1,25 @@
---
slug: python-reference/AgentState
---
<a id="letta.schemas.agent.AgentState"></a>
## AgentState
```python
class AgentState(BaseAgent)
```
Representation of an agent's state. This is the state of the agent at a given time, and is persisted in the DB backend. The state has all the information needed to recreate a persisted agent.
**Arguments**:
- `id` _str_ - The unique identifier of the agent.
- `name` _str_ - The name of the agent (must be unique to the user).
- `created_at` _datetime_ - The datetime the agent was created.
- `message_ids` _List[str]_ - The ids of the messages in the agent's in-context memory.
- `memory` _Memory_ - The in-context memory of the agent.
- `tools` _List[str]_ - The tools used by the agent. This includes any memory editing functions specified in `memory`.
- `system` _str_ - The system prompt used by the agent.
- `llm_config` _LLMConfig_ - The LLM configuration used by the agent.
- `embedding_config` _EmbeddingConfig_ - The embedding configuration used by the agent.


@@ -0,0 +1,24 @@
---
slug: python-reference/Block
---
<a id="letta.schemas.block.Block"></a>
## Block
```python
class Block(BaseBlock)
```
A Block represents a reserved section of the LLM's context window which is editable. `Block` objects are contained in the `Memory` object, which is able to edit the Block values.
**Arguments**:
- `name` _str_ - The name of the block.
- `value` _str_ - The value of the block. This is the string that is represented in the context window.
- `limit` _int_ - The character limit of the block.
- `template` _bool_ - Whether the block is a template (e.g. saved human/persona options). Non-template blocks are not stored in the database and are ephemeral, while templated blocks are stored in the database.
- `label` _str_ - The label of the block (e.g. 'human', 'persona'). This defines a category for the block.
- `description` _str_ - Description of the block.
- `metadata` _Dict_ - Metadata of the block.
- `user_id` _str_ - The unique identifier of the user associated with the block.
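Since a `Block` caps its `value` at `limit` characters, the behavior can be sketched with a small stand-in class (a hypothetical `BlockSketch` for illustration, not the real `letta.schemas.block.Block`):

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class BlockSketch:
    """Illustrative stand-in for a memory block (not Letta's actual class)."""
    label: str                      # category, e.g. 'human' or 'persona'
    value: str = ""                 # the text shown in the context window
    limit: int = 2000               # character limit for the value
    template: bool = False          # template blocks persist in the database
    description: Optional[str] = None
    metadata: Dict = field(default_factory=dict)

    def __post_init__(self):
        # Enforce the character budget the block reserves in the context window.
        if len(self.value) > self.limit:
            raise ValueError(
                f"value is {len(self.value)} chars, over the {self.limit}-char limit"
            )

persona = BlockSketch(label="persona", value="I am a helpful assistant.")
```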


@@ -0,0 +1,48 @@
---
slug: python-reference/DataConnector
---
<a id="letta.data_sources.connectors.DataConnector"></a>
## DataConnector
```python
class DataConnector()
```
Base class for data connectors that can be extended to generate documents and passages from a custom data source.
<a id="letta.data_sources.connectors.DataConnector.generate_documents"></a>
#### generate\_documents
```python
def generate_documents() -> Iterator[Tuple[str, Dict]]
```
Generate document text and metadata from a data source.
**Returns**:
- `documents` _Iterator[Tuple[str, Dict]]_ - Generate a tuple of string text and metadata dictionary for each document.
<a id="letta.data_sources.connectors.DataConnector.generate_passages"></a>
#### generate\_passages
```python
def generate_passages(documents: List[Document],
chunk_size: int = 1024) -> Iterator[Tuple[str, Dict]]
```
Generate passage text and metadata from a list of documents.
**Arguments**:
- `documents` _List[Document]_ - List of documents to generate passages from.
- `chunk_size` _int, optional_ - Chunk size for splitting passages. Defaults to 1024.
**Returns**:
- `passages` _Iterator[Tuple[str, Dict]]_ - Generate a tuple of string text and metadata dictionary for each passage.
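A custom connector only needs to implement the two generators above. As a minimal in-memory illustration (the `StringConnector` name and the naive character chunker are made up for this sketch, not part of Letta):

```python
from typing import Dict, Iterator, List, Tuple

class StringConnector:
    """Hypothetical connector that serves documents from an in-memory list."""

    def __init__(self, texts: List[str]):
        self.texts = texts

    def generate_documents(self) -> Iterator[Tuple[str, Dict]]:
        # One (text, metadata) pair per document.
        for i, text in enumerate(self.texts):
            yield text, {"source": f"string-{i}"}

    def generate_passages(self, documents: List[Tuple[str, Dict]],
                          chunk_size: int = 1024) -> Iterator[Tuple[str, Dict]]:
        # Naive fixed-width character chunking; real connectors may split
        # on token or sentence boundaries instead.
        for text, metadata in documents:
            for start in range(0, len(text), chunk_size):
                yield text[start:start + chunk_size], metadata
```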


@@ -0,0 +1,31 @@
---
slug: python-reference/DirectoryConnector
---
<a id="letta.data_sources.connectors.DirectoryConnector"></a>
## DirectoryConnector
```python
class DirectoryConnector(DataConnector)
```
<a id="letta.data_sources.connectors.DirectoryConnector.__init__"></a>
#### \_\_init\_\_
```python
def __init__(input_files: List[str] = None,
input_directory: str = None,
recursive: bool = False,
extensions: List[str] = None)
```
Connector for reading text data from a directory of files.
**Arguments**:
- `input_files` _List[str], optional_ - List of file paths to read. Defaults to None.
- `input_directory` _str, optional_ - Directory to read files from. Defaults to None.
- `recursive` _bool, optional_ - Whether to read files recursively from the input directory. Defaults to False.
- `extensions` _List[str], optional_ - List of file extensions to read. Defaults to None.
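The directory traversal implied by these arguments can be sketched with `pathlib` (an illustrative helper, not the connector's actual internals):

```python
from pathlib import Path
from typing import Iterator, List, Optional

def iter_files(input_directory: str,
               recursive: bool = False,
               extensions: Optional[List[str]] = None) -> Iterator[Path]:
    """Yield files under input_directory, optionally recursing into
    subdirectories and filtering by extension (e.g. ['.txt', '.md'])."""
    root = Path(input_directory)
    candidates = root.rglob("*") if recursive else root.glob("*")
    for path in sorted(candidates):
        if path.is_file() and (extensions is None or path.suffix in extensions):
            yield path
```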


@@ -0,0 +1,13 @@
---
slug: python-reference/Document
---
<a id="letta.schemas.document.Document"></a>
## Document
```python
class Document(DocumentBase)
```
Representation of a single document (broken up into `Passage` objects)


@@ -0,0 +1,24 @@
---
slug: python-reference/EmbeddingConfig
---
<a id="letta.schemas.embedding_config.EmbeddingConfig"></a>
## EmbeddingConfig
```python
class EmbeddingConfig(BaseModel)
```
Embedding model configuration. This object specifies all the information necessary to access an embedding model for use with Letta, except for secret keys.
**Attributes**:
- `embedding_endpoint_type` _str_ - The endpoint type for the model.
- `embedding_endpoint` _str_ - The endpoint for the model.
- `embedding_model` _str_ - The model for the embedding.
- `embedding_dim` _int_ - The dimension of the embedding.
- `embedding_chunk_size` _int_ - The chunk size of the embedding.
- `azure_endpoint` _str, optional_ - The Azure endpoint for the model (Azure only).
- `azure_version` _str_ - The Azure version for the model (Azure only).
- `azure_deployment` _str_ - The Azure deployment for the model (Azure only).


@@ -0,0 +1,21 @@
---
slug: python-reference/Job
---
<a id="letta.schemas.job.Job"></a>
## Job
```python
class Job(JobBase)
```
Representation of offline jobs, used for tracking status of data loading tasks (involving parsing and embedding documents).
**Arguments**:
- `id` _str_ - The unique identifier of the job.
- `status` _JobStatus_ - The status of the job.
- `created_at` _datetime_ - The datetime when the job was created.
- `completed_at` _datetime_ - The datetime when the job was completed.
- `user_id` _str_ - The unique identifier of the user associated with the job.


@@ -0,0 +1,21 @@
---
slug: python-reference/LLMConfig
---
<a id="letta.schemas.llm_config.LLMConfig"></a>
## LLMConfig
```python
class LLMConfig(BaseModel)
```
Configuration for a Language Model (LLM). This object specifies all the information necessary to access an LLM for use with Letta, except for secret keys.
**Attributes**:
- `model` _str_ - The name of the LLM model.
- `model_endpoint_type` _str_ - The endpoint type for the model.
- `model_endpoint` _str_ - The endpoint for the model.
- `model_wrapper` _str_ - The wrapper for the model.
- `context_window` _int_ - The context window size for the model.
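As a rough illustration, a configuration carrying these attributes might look like the following plain dictionary (the endpoint and model values here are made up for the example; consult your provider for real ones):

```python
# Hypothetical values for illustration only.
llm_config = {
    "model": "gpt-4",                               # name of the LLM model
    "model_endpoint_type": "openai",                # which provider API shape to use
    "model_endpoint": "https://api.openai.com/v1",  # where to send requests
    "model_wrapper": None,                          # prompt wrapper, mainly for local models
    "context_window": 8192,                         # max tokens the model can attend to
}
```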


@@ -0,0 +1,67 @@
---
slug: python-reference/LettaMessage
---
<a id="letta.schemas.letta_message.LettaMessage"></a>
## LettaMessage
```python
class LettaMessage(BaseModel)
```
Base class for simplified Letta message response type. This is intended to be used for developers who want the internal monologue, function calls, and function returns in a simplified format that does not include additional information other than the content and timestamp.
**Attributes**:
- `id` _str_ - The ID of the message
- `date` _datetime_ - The date the message was created in ISO format
<a id="letta.schemas.letta_message.InternalMonologue"></a>
## InternalMonologue
```python
class InternalMonologue(LettaMessage)
```
Representation of an agent's internal monologue.
**Attributes**:
- `internal_monologue` _str_ - The internal monologue of the agent
- `id` _str_ - The ID of the message
- `date` _datetime_ - The date the message was created in ISO format
<a id="letta.schemas.letta_message.FunctionCallMessage"></a>
## FunctionCallMessage
```python
class FunctionCallMessage(LettaMessage)
```
A message representing a request to call a function (generated by the LLM to trigger function execution).
**Attributes**:
- `function_call` _Union[FunctionCall, FunctionCallDelta]_ - The function call
- `id` _str_ - The ID of the message
- `date` _datetime_ - The date the message was created in ISO format
<a id="letta.schemas.letta_message.FunctionReturn"></a>
## FunctionReturn
```python
class FunctionReturn(LettaMessage)
```
A message representing the return value of a function call (generated by Letta executing the requested function).
**Attributes**:
- `function_return` _str_ - The return value of the function
- `status` _Literal["success", "error"]_ - The status of the function call
- `id` _str_ - The ID of the message
- `date` _datetime_ - The date the message was created in ISO format
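When consuming these messages, clients typically branch on the subtype. A schematic dispatcher, using minimal stand-in classes since the real ones live in `letta.schemas.letta_message`:

```python
from dataclasses import dataclass

# Minimal stand-ins for the real LettaMessage subtypes.
@dataclass
class InternalMonologue:
    internal_monologue: str

@dataclass
class FunctionCallMessage:
    function_call: object

@dataclass
class FunctionReturn:
    function_return: str
    status: str  # "success" or "error"

def describe(msg) -> str:
    """Branch on the message subtype, as a client rendering a response might."""
    if isinstance(msg, InternalMonologue):
        return f"[thought] {msg.internal_monologue}"
    if isinstance(msg, FunctionCallMessage):
        return f"[call] {msg.function_call}"
    if isinstance(msg, FunctionReturn):
        return f"[return:{msg.status}] {msg.function_return}"
    raise TypeError(f"unknown message type: {type(msg).__name__}")
```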


@@ -0,0 +1,19 @@
---
slug: python-reference/LettaResponse
---
<a id="letta.schemas.letta_response.LettaResponse"></a>
## LettaResponse
```python
class LettaResponse(BaseModel)
```
Response object from an agent interaction, consisting of the new messages generated by the agent and usage statistics.
The type of the returned messages can be either `Message` or `LettaMessage`, depending on what was specified in the request.
**Attributes**:
- `messages` _List[Union[Message, LettaMessage]]_ - The messages returned by the agent.
- `usage` _LettaUsageStatistics_ - The usage statistics


@@ -0,0 +1,20 @@
---
slug: python-reference/LettaUsageStatistics
---
<a id="letta.schemas.usage.LettaUsageStatistics"></a>
## LettaUsageStatistics
```python
class LettaUsageStatistics(BaseModel)
```
Usage statistics for the agent interaction.
**Attributes**:
- `completion_tokens` _int_ - The number of tokens generated by the agent.
- `prompt_tokens` _int_ - The number of tokens in the prompt.
- `total_tokens` _int_ - The total number of tokens processed by the agent.
- `step_count` _int_ - The number of steps taken by the agent.


@@ -0,0 +1,231 @@
---
slug: python-reference/Memory
---
<a id="letta.schemas.memory.Memory"></a>
## Memory
```python
class Memory(BaseModel)
```
Represents the in-context memory of the agent. This includes both the `Block` objects (labelled by sections), as well as tools to edit the blocks.
**Attributes**:
- `memory` _Dict[str, Block]_ - Mapping from memory block section to memory block.
<a id="letta.schemas.memory.Memory.get_prompt_template"></a>
#### get\_prompt\_template
```python
def get_prompt_template() -> str
```
Return the current Jinja2 template string.
<a id="letta.schemas.memory.Memory.set_prompt_template"></a>
#### set\_prompt\_template
```python
def set_prompt_template(prompt_template: str)
```
Set a new Jinja2 template string.
Validates the template syntax and compatibility with current memory structure.
<a id="letta.schemas.memory.Memory.load"></a>
#### load
```python
@classmethod
def load(cls, state: dict)
```
Load memory from dictionary object
<a id="letta.schemas.memory.Memory.compile"></a>
#### compile
```python
def compile() -> str
```
Generate a string representation of the memory in-context using the Jinja2 template
<a id="letta.schemas.memory.Memory.to_dict"></a>
#### to\_dict
```python
def to_dict()
```
Convert to dictionary representation
<a id="letta.schemas.memory.Memory.to_flat_dict"></a>
#### to\_flat\_dict
```python
def to_flat_dict()
```
Convert to a dictionary that maps directly from block names to values
<a id="letta.schemas.memory.Memory.list_block_names"></a>
#### list\_block\_names
```python
def list_block_names() -> List[str]
```
Return a list of the block names held inside the memory object
<a id="letta.schemas.memory.Memory.get_block"></a>
#### get\_block
```python
def get_block(name: str) -> Block
```
Correct way to index into the memory.memory field, returns a Block
<a id="letta.schemas.memory.Memory.get_blocks"></a>
#### get\_blocks
```python
def get_blocks() -> List[Block]
```
Return a list of the blocks held inside the memory object
<a id="letta.schemas.memory.Memory.link_block"></a>
#### link\_block
```python
def link_block(name: str, block: Block, override: Optional[bool] = False)
```
Link a new block to the memory object
<a id="letta.schemas.memory.Memory.update_block_value"></a>
#### update\_block\_value
```python
def update_block_value(name: str, value: str)
```
Update the value of a block
<a id="letta.schemas.memory.BasicBlockMemory"></a>
## BasicBlockMemory
```python
class BasicBlockMemory(Memory)
```
BasicBlockMemory is a basic implementation of the Memory class, which takes in a list of blocks and links them to the memory object. These are editable by the agent via the core memory functions.
**Attributes**:
- `memory` _Dict[str, Block]_ - Mapping from memory block section to memory block.
**Methods**:
- `core_memory_append` - Append to the contents of core memory.
- `core_memory_replace` - Replace the contents of core memory.
<a id="letta.schemas.memory.BasicBlockMemory.__init__"></a>
#### \_\_init\_\_
```python
def __init__(blocks: List[Block] = [])
```
Initialize the BasicBlockMemory object with a list of pre-defined blocks.
**Arguments**:
- `blocks` _List[Block]_ - List of blocks to be linked to the memory object.
<a id="letta.schemas.memory.BasicBlockMemory.core_memory_append"></a>
#### core\_memory\_append
```python
def core_memory_append(name: str, content: str) -> Optional[str]
```
Append to the contents of core memory.
**Arguments**:
- `name` _str_ - Section of the memory to be edited (persona or human).
- `content` _str_ - Content to write to the memory. All unicode (including emojis) is supported.
**Returns**:
- `Optional[str]` - None is always returned as this function does not produce a response.
<a id="letta.schemas.memory.BasicBlockMemory.core_memory_replace"></a>
#### core\_memory\_replace
```python
def core_memory_replace(name: str, old_content: str,
new_content: str) -> Optional[str]
```
Replace the contents of core memory. To delete memories, use an empty string for new_content.
**Arguments**:
- `name` _str_ - Section of the memory to be edited (persona or human).
- `old_content` _str_ - String to replace. Must be an exact match.
- `new_content` _str_ - Content to write to the memory. All unicode (including emojis) is supported.
**Returns**:
- `Optional[str]` - None is always returned as this function does not produce a response.
<a id="letta.schemas.memory.ChatMemory"></a>
## ChatMemory
```python
class ChatMemory(BasicBlockMemory)
```
ChatMemory initializes a BasicBlockMemory with two default blocks, `human` and `persona`.
<a id="letta.schemas.memory.ChatMemory.__init__"></a>
#### \_\_init\_\_
```python
def __init__(persona: str, human: str, limit: int = 2000)
```
Initialize the ChatMemory object with a persona and human string.
**Arguments**:
- `persona` _str_ - The starter value for the persona block.
- `human` _str_ - The starter value for the human block.
- `limit` _int_ - The character limit for each block.
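Putting the pieces together, the block-to-prompt rendering that `compile()` performs can be approximated with a small sketch (a plain f-string layout stands in for the real Jinja2 template, and the `<label>...</label>` format is illustrative only):

```python
from typing import Dict

def compile_memory(blocks: Dict[str, str]) -> str:
    """Render labelled memory blocks into a single context-window string.
    The tag layout is made up for this sketch, not Letta's actual template."""
    sections = []
    for label, value in blocks.items():
        sections.append(f"<{label}>\n{value}\n</{label}>")
    return "\n".join(sections)

# Analogous to ChatMemory's two default blocks:
prompt = compile_memory({
    "persona": "I am a concise assistant.",
    "human": "Name: Sam",
})
```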


@@ -0,0 +1,88 @@
---
slug: python-reference/Message
---
<a id="letta.schemas.message.Message"></a>
## Message
```python
class Message(BaseMessage)
```
Letta's internal representation of a message. Includes methods to convert to/from LLM provider formats.
**Attributes**:
- `id` _str_ - The unique identifier of the message.
- `role` _MessageRole_ - The role of the participant.
- `content` _List[MessageContent]_ - The content of the message.
- `user_id` _str_ - The unique identifier of the user.
- `agent_id` _str_ - The unique identifier of the agent.
- `model` _str_ - The model used to make the function call.
- `name` _str_ - The name of the participant.
- `created_at` _datetime_ - The time the message was created.
- `tool_calls` _List[ToolCall]_ - The list of tool calls requested.
- `tool_call_id` _str_ - The id of the tool call.
<a id="letta.schemas.message.Message.to_letta_message"></a>
#### to\_letta\_message
```python
def to_letta_message() -> List[LettaMessage]
```
Convert message object (in DB format) to the style used by the original Letta API
<a id="letta.schemas.message.Message.dict_to_message"></a>
#### dict\_to\_message
```python
@staticmethod
def dict_to_message(user_id: str,
agent_id: str,
openai_message_dict: dict,
model: Optional[str] = None,
allow_functions_style: bool = False,
created_at: Optional[datetime] = None,
id: Optional[str] = None)
```
Convert a ChatCompletion message object into a Message object (synced to DB)
<a id="letta.schemas.message.Message.to_openai_dict"></a>
#### to\_openai\_dict
```python
def to_openai_dict(max_tool_id_length: int = TOOL_CALL_ID_MAX_LEN,
put_inner_thoughts_in_kwargs: bool = False) -> dict
```
Go from Message class to ChatCompletion message object
<a id="letta.schemas.message.Message.to_anthropic_dict"></a>
#### to\_anthropic\_dict
```python
def to_anthropic_dict(inner_thoughts_xml_tag="thinking") -> dict
```
Convert to an Anthropic message dictionary
**Arguments**:
- `inner_thoughts_xml_tag` _str_ - The XML tag to wrap around inner thoughts
<a id="letta.schemas.message.Message.to_google_ai_dict"></a>
#### to\_google\_ai\_dict
```python
def to_google_ai_dict(put_inner_thoughts_in_kwargs: bool = True) -> dict
```
Go from Message class to Google AI REST message object
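The overall shape of the OpenAI-style conversion can be sketched as follows (the helper and its field handling are a simplified illustration drawn from the attributes above, not Letta's actual implementation):

```python
from typing import Dict, Optional

def to_openai_dict_sketch(role: str,
                          text: str,
                          name: Optional[str] = None,
                          tool_call_id: Optional[str] = None) -> Dict:
    """Map Letta message fields onto the OpenAI chat-completions message shape."""
    message: Dict = {"role": role, "content": text}
    if name is not None:
        message["name"] = name
    # Tool-result messages must echo the id of the call they answer.
    if role == "tool" and tool_call_id is not None:
        message["tool_call_id"] = tool_call_id
    return message
```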


@@ -0,0 +1,36 @@
---
slug: python-reference/Passage
---
<a id="letta.schemas.passage.Passage"></a>
## Passage
```python
class Passage(PassageBase)
```
Representation of a passage, which is stored in archival memory.
**Arguments**:
- `text` _str_ - The text of the passage.
- `embedding` _List[float]_ - The embedding of the passage.
- `embedding_config` _EmbeddingConfig_ - The embedding configuration used by the passage.
- `created_at` _datetime_ - The creation date of the passage.
- `user_id` _str_ - The unique identifier of the user associated with the passage.
- `agent_id` _str_ - The unique identifier of the agent associated with the passage.
- `source_id` _str_ - The data source of the passage.
- `doc_id` _str_ - The unique identifier of the document associated with the passage.
<a id="letta.schemas.passage.Passage.pad_embeddings"></a>
#### pad\_embeddings
```python
@field_validator("embedding")
@classmethod
def pad_embeddings(cls, embedding: List[float]) -> List[float]
```
Pad embeddings to `MAX_EMBEDDING_SIZE`. This is necessary to ensure all stored embeddings are the same size.
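The padding step can be sketched in plain Python (the `MAX_EMBEDDING_SIZE` value here is made up; Letta defines its own constant):

```python
from typing import List

MAX_EMBEDDING_SIZE = 4096  # illustrative value; Letta defines its own constant

def pad_embedding(embedding: List[float],
                  target: int = MAX_EMBEDDING_SIZE) -> List[float]:
    """Right-pad an embedding with zeros so all stored vectors share one size."""
    if len(embedding) > target:
        raise ValueError(f"embedding of size {len(embedding)} exceeds {target}")
    return embedding + [0.0] * (target - len(embedding))
```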


@@ -0,0 +1,71 @@
---
slug: python-reference/Tool
---
<a id="letta.schemas.tool.Tool"></a>
## Tool
```python
class Tool(BaseTool)
```
Representation of a tool, which is a function that can be called by the agent.
**Arguments**:
- `id` _str_ - The unique identifier of the tool.
- `name` _str_ - The name of the function.
- `tags` _List[str]_ - Metadata tags.
- `source_code` _str_ - The source code of the function.
- `json_schema` _Dict_ - The JSON schema of the function.
<a id="letta.schemas.tool.Tool.to_dict"></a>
#### to\_dict
```python
def to_dict()
```
Convert tool into OpenAI representation.
<a id="letta.schemas.tool.Tool.from_langchain"></a>
#### from\_langchain
```python
@classmethod
def from_langchain(cls, langchain_tool) -> "Tool"
```
Class method to create an instance of Tool from a Langchain tool (must be from langchain_community.tools).
**Arguments**:
- `langchain_tool` _LangchainTool_ - An instance of a LangChain BaseTool (BaseTool from langchain_community.tools)
**Returns**:
- `Tool` - A Letta Tool initialized with attributes derived from the provided LangChain BaseTool object.
<a id="letta.schemas.tool.Tool.from_crewai"></a>
#### from\_crewai
```python
@classmethod
def from_crewai(cls, crewai_tool) -> "Tool"
```
Class method to create an instance of Tool from a crewAI BaseTool object.
**Arguments**:
- `crewai_tool` _CrewAIBaseTool_ - An instance of a crewAI BaseTool (BaseTool from crewai)
**Returns**:
- `Tool` - A Letta Tool initialized with attributes derived from the provided crewAI BaseTool object.
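Since a `Tool` pairs `source_code` with a `json_schema`, the schema-derivation idea can be sketched with `inspect` (a deliberately tiny helper for illustration, not Letta's actual schema generator):

```python
import inspect
from typing import Dict

def function_to_schema(func) -> Dict:
    """Build a minimal JSON-schema-like description from a function signature.
    The type mapping here is intentionally small; real generators handle
    containers, docstring argument descriptions, and more."""
    type_map = {int: "integer", float: "number", str: "string", bool: "boolean"}
    properties = {}
    required = []
    for name, param in inspect.signature(func).parameters.items():
        properties[name] = {"type": type_map.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)  # no default means the caller must supply it
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func) or "",
        "parameters": {"type": "object", "properties": properties,
                       "required": required},
    }

def get_weather(city: str, units: str = "celsius") -> str:
    """Look up the weather for a city."""
    return f"Weather in {city} ({units})"

schema = function_to_schema(get_weather)
```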


@@ -0,0 +1,25 @@
---
slug: python-reference/User
---
<a id="letta.schemas.user.User"></a>
## User
```python
class User(UserBase)
```
Representation of a user.
**Arguments**:
- `id` _str_ - The unique identifier of the user.
- `name` _str_ - The name of the user.
- `created_at` _datetime_ - The creation date of the user.
<a id="letta.schemas.user.User.org_id"></a>
#### org\_id
TODO: don't make optional, and pass in default org ID