---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
> [!IMPORTANT]
> **🚨 Reporting a bug with Letta Code?**
>
> Please file your issue at **[letta-ai/letta-code](https://github.com/letta-ai/letta-code/issues)** instead!
>
> This repository is for the core Letta Docker server only. Issues related to the Letta Code CLI tool or agentic coding features should be reported in the Letta Code repository.
---

**Describe the bug**
A clear and concise description of what the bug is.

**Please describe your setup**
- [ ] How are you running Letta?
  - Docker
  - pip (legacy)
  - From source
  - Desktop
- [ ] Describe your setup
  - What's your OS (Windows/macOS/Linux)?
  - What is your `docker run ...` command (if applicable)?
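
If you're unsure what to paste for the `docker run ...` command, it is the full command you used to start the server. The sketch below is illustrative only — the image tag, port, volume path, and environment variable are placeholders, so substitute whatever your actual setup uses:

```shell
# Illustrative placeholder only -- replace the volume path, port mapping,
# environment variables, and image tag with the values from your own setup.
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY="your_api_key_here" \
  letta/letta:latest
```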

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Additional context**
Add any other context about the problem here.
- Which model you are using

**Agent File (optional)**
Please attach your `.af` file, as this helps with reproducing issues.

---

**Local LLM details**
If you are running Letta with a local LLM (i.e., not using OpenAI), please provide the following information:
- [ ] The exact model you're trying to use (e.g. `dolphin-2.1-mistral-7b.Q6_K.gguf`)
- [ ] The local LLM backend you are using (web UI? LM Studio?)
- [ ] Your hardware for the local LLM backend (local computer? operating system? remote RunPod?)