Update README.md

Commit 246e1e955a (parent 18fecb8d89)
Author: Charles Packer
Date: 2023-10-25 14:09:36 -07:00
Committed by: GitHub

@@ -20,7 +20,7 @@ When using open LLMs with MemGPT, **the main failure case will be your LLM outpu
<details>
<summary><h2>🖥️ Serving your LLM from a web server (WebUI example)</h2></summary>
- ⁉️ Do **NOT** enable any extensions in web UI, including the openai extension! Just run web UI as-is
+ ⁉️ Do **NOT** enable any extensions in web UI, including the [openai extension](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai)! Just run web UI as-is
To get MemGPT to work with a local LLM, you need to have the LLM running on a server that takes API requests.
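As a sketch of what "a server that takes API requests" means here, the snippet below builds and sends a completion request over HTTP. The port, endpoint path, payload fields, and response shape are assumptions modeled on text-generation-webui's native API, not a confirmed MemGPT interface; check your server's documentation for the actual values.

```python
import json
import urllib.request

# Assumed default host/port/path for a local text-generation-webui server.
API_URL = "http://127.0.0.1:5000/api/v1/generate"

def build_payload(prompt: str, max_new_tokens: int = 256) -> dict:
    """Assemble a completion request body (field names are assumptions)."""
    return {"prompt": prompt, "max_new_tokens": max_new_tokens}

def complete(prompt: str) -> str:
    """POST the prompt to the local server and return the generated text.

    Requires the server to actually be running; the response shape
    ("results"[0]["text"]) is an assumption.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]["text"]
```

If a request like this succeeds from the command line, MemGPT's local-LLM backend can be pointed at the same address.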