From a2bf5a2b586ae21aeaf22cc4f075f21fa76ffa87 Mon Sep 17 00:00:00 2001
From: Charles Packer
Date: Fri, 1 Dec 2023 11:23:43 -0800
Subject: [PATCH] update docs (#547)

* update admonitions
* Update local_llm.md
* Update webui.md
* Update autogen.md
* Update storage.md
* Update example_chat.md
* Update example_data.md
* Update example_chat.md
* Update example_data.md
---
 docs/autogen.md      | 8 ++++----
 docs/data_sources.md | 2 +-
 docs/example_chat.md | 7 ++++---
 docs/example_data.md | 7 ++++---
 docs/local_llm.md    | 4 ++--
 docs/storage.md      | 3 ++-
 docs/webui.md        | 2 +-
 7 files changed, 18 insertions(+), 15 deletions(-)

diff --git a/docs/autogen.md b/docs/autogen.md
index ffe98c03..6239349a 100644
--- a/docs/autogen.md
+++ b/docs/autogen.md
@@ -1,4 +1,4 @@
-!!! warning "Need help?"
+!!! question "Need help?"
 
     If you need help visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel.
 
@@ -20,7 +20,7 @@ For the purposes of this example, we're going to serve (host) the LLMs using [oo
 
 Install web UI and get a model set up on a local web server. You can use [our instructions on setting up web UI](https://memgpt.readthedocs.io/en/latest/webui/).
 
-!!! warning "Choosing an LLM / model to use"
+!!! info "Choosing an LLM / model to use"
 
     You'll need to decide on an LLM / model to use with web UI.
 
@@ -36,7 +36,7 @@ Try setting up MemGPT with your local web UI backend [using the instructions her
 
 Once you've confirmed that you're able to chat with a MemGPT agent using `memgpt configure` and `memgpt run`, you're ready to move on to the next step.
 
-!!! warning "Using RunPod as an LLM backend"
+!!! info "Using RunPod as an LLM backend"
 
     If you're using RunPod to run web UI, make sure that you set your endpoint to the RunPod IP address, **not the default localhost address**.
 
@@ -134,7 +134,7 @@ config_list_memgpt = [
 ]
 ```
 
-!!! warning "Making internal monologue visible to AutoGen"
+!!! info "Making internal monologue visible to AutoGen"
 
     By default, MemGPT's inner monologue and function traces are hidden from other AutoGen agents.
 
diff --git a/docs/data_sources.md b/docs/data_sources.md
index bab0ca33..f119f98c 100644
--- a/docs/data_sources.md
+++ b/docs/data_sources.md
@@ -35,7 +35,7 @@
 memgpt attach --agent --data-source
 ```
 
-!!! note "Hint"
+!!! tip "Hint"
 
     To encourage your agent to reference its archival memory, we recommend adding phrases like "_search your archival memory..._" for the best results.
 
diff --git a/docs/example_chat.md b/docs/example_chat.md
index 1e5bacf7..d3ffa802 100644
--- a/docs/example_chat.md
+++ b/docs/example_chat.md
@@ -1,7 +1,8 @@
+!!! note "Note"
+
+    Before starting this example, make sure that you've [properly installed MemGPT](../quickstart)
+
 ## Using MemGPT to create a perpetual chatbot
-
-_Before starting this example, make sure that you've [properly installed MemGPT](../quickstart)._
-
 In this example, we're going to use MemGPT to create a chatbot with a custom persona.
 
 MemGPT chatbots are "perpetual chatbots", meaning that they can be run indefinitely without any context length limitations. MemGPT chatbots are self-aware that they have a "fixed context window", and will manually manage their own memories to get around this problem by moving information in and out of their small memory window and larger external storage. MemGPT chatbots always keep a reserved space in their "core" memory window to store their `persona` information (describes the bot's personality + basic functionality), and `human` information (which describes the human that the bot is chatting with). The MemGPT chatbot will update the `persona` and `human` core memory blocks over time as it learns more about the user (and itself).
diff --git a/docs/example_data.md b/docs/example_data.md
index bef5ab8f..c0663a18 100644
--- a/docs/example_data.md
+++ b/docs/example_data.md
@@ -1,7 +1,8 @@
+!!! note "Note"
+
+    Before starting this example, make sure that you've [properly installed MemGPT](../quickstart)
+
 ## Using MemGPT to chat with your own data
-
-_Before starting this example, make sure that you've [properly installed MemGPT](../quickstart)._
-
 In this example, we're going to use MemGPT to chat with a custom data source. Specifically, we'll try loading in the MemGPT research paper and ask MemGPT questions about it.
 
 ### Creating an external data source
diff --git a/docs/local_llm.md b/docs/local_llm.md
index 4f2fc61e..a4435fd1 100644
--- a/docs/local_llm.md
+++ b/docs/local_llm.md
@@ -1,4 +1,4 @@
-!!! warning "Need help?"
+!!! question "Need help?"
 
     If you need help visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel.
 
@@ -104,7 +104,7 @@ If you would like us to support a new backend, feel free to open an issue or pul
 
 ### Which model should I use?
 
-!!! warning "Recommended LLMs / models"
+!!! info "Recommended LLMs / models"
 
     To see a list of recommended LLMs to use with MemGPT, visit our [Discord server](https://discord.gg/9GEQrxmVyE) and check the #model-chat channel.
 
diff --git a/docs/storage.md b/docs/storage.md
index 72bfbac8..bfde4d97 100644
--- a/docs/storage.md
+++ b/docs/storage.md
@@ -1,10 +1,11 @@
 # Configuring Storage Backends
 
-MemGPT supports both local and database storage for archival memory. You can configure which storage backend to use via `memgpt configure`. For larger datasets, we recommend using a database backend.
 !!! warning "Switching storage backends"
 
     MemGPT can only use one storage backend at a time. If you switch from local to database storage, you will need to re-load data and start agents from scratch. We currently do not support migrating between storage backends.
 
+MemGPT supports both local and database storage for archival memory. You can configure which storage backend to use via `memgpt configure`. For larger datasets, we recommend using a database backend.
+
 ## Local
 
 MemGPT will default to using local storage (saved at `~/.memgpt/archival/` for loaded data sources, and `~/.memgpt/agents/` for agent storage).
diff --git a/docs/webui.md b/docs/webui.md
index 85fb2508..fe7594fa 100644
--- a/docs/webui.md
+++ b/docs/webui.md
@@ -1,4 +1,4 @@
-!!! warning "web UI troubleshooting"
+!!! question "web UI troubleshooting"
 
     If you have problems getting web UI set up, please use the [official web UI repo for support](https://github.com/oobabooga/text-generation-webui)! There will be more answered questions about web UI there vs here on the MemGPT repo.
 