From da3ad8f9d251ecdabac5add31cbbccc81b5088cd Mon Sep 17 00:00:00 2001
From: Charles Packer
Date: Wed, 25 Oct 2023 00:39:54 -0700
Subject: [PATCH] Update README.md

---
 memgpt/local_llm/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/memgpt/local_llm/README.md b/memgpt/local_llm/README.md
index 7f8c4596..9738b903 100644
--- a/memgpt/local_llm/README.md
+++ b/memgpt/local_llm/README.md
@@ -6,7 +6,7 @@ If you have a hosted ChatCompletion-compatible endpoint that works with function

 # ⚡ Quick overview

-1. Put your own LLM behind a web server API of your choice
+1. Put your own LLM behind a web server API (e.g. [oobabooga web UI](https://github.com/oobabooga/text-generation-webui#starting-the-web-ui))
 2. Set `OPENAI_API_BASE=YOUR_API_IP_ADDRESS` and `BACKEND_TYPE=webui`
 3. Run MemGPT with `python3 main.py --no_verify`, it should now use your LLM instead of OpenAI GPT
 4. If things aren't working, read the full instructions below
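
The quick-overview steps in the patched README boil down to two environment variables and one command. A minimal shell sketch (the endpoint URL below is a placeholder for wherever your web UI is actually serving; `main.py` is run from the MemGPT repo root):

```shell
# Point MemGPT at your self-hosted ChatCompletion-compatible endpoint
# (placeholder URL -- substitute your server's actual address/port)
export OPENAI_API_BASE="http://127.0.0.1:5000"
export BACKEND_TYPE="webui"

# Then launch MemGPT against the local backend:
# python3 main.py --no_verify
```

With both variables exported, MemGPT routes completion requests to your LLM instead of OpenAI GPT.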