From 246e1e955aa5c24e2cb3c0f960056d2fa672bd5d Mon Sep 17 00:00:00 2001
From: Charles Packer
Date: Wed, 25 Oct 2023 14:09:36 -0700
Subject: [PATCH] Update README.md

---
 memgpt/local_llm/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/memgpt/local_llm/README.md b/memgpt/local_llm/README.md
index a200c664..ff5d740e 100644
--- a/memgpt/local_llm/README.md
+++ b/memgpt/local_llm/README.md
@@ -20,7 +20,7 @@ When using open LLMs with MemGPT, **the main failure case will be your LLM outpu
 
 🖥️ Serving your LLM from a web server (WebUI example)
 
-⁉️ Do **NOT** enable any extensions in web UI, including the openai extension! Just run web UI as-is
+⁉️ Do **NOT** enable any extensions in web UI, including the [openai extension](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai)! Just run web UI as-is
 To get MemGPT to work with a local LLM, you need to have the LLM running on a server that takes API requests.