merge this (#4759)
* wait I forgot to commit locally * cp the entire core directory and then rm the .git subdir
.github/ISSUE_TEMPLATE/bug_report.md (vendored, new file, 44 lines)

@@ -0,0 +1,44 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**Please describe your setup**
- [ ] How are you running Letta?
  - Docker
  - pip (legacy)
  - From source
  - Desktop
- [ ] Describe your setup
  - What's your OS (Windows/MacOS/Linux)?
  - What is your `docker run ...` command (if applicable)

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Additional context**
Add any other context about the problem here.
- What model you are using

**Agent File (optional)**
Please attach your `.af` file, as this helps with reproducing issues.

---

If you're not using OpenAI, please provide additional information on your local LLM setup:

**Local LLM details**

If you are trying to run Letta with local LLMs, please provide the following information:

- [ ] The exact model you're trying to use (e.g. `dolphin-2.1-mistral-7b.Q6_K.gguf`)
- [ ] The local LLM backend you are using (web UI? LM Studio?)
- [ ] Your hardware for the local LLM backend (local computer? operating system? remote RunPod?)