From c8b89e25d068d38e113c210c17ed9a00d13fb1e2 Mon Sep 17 00:00:00 2001
From: Charles Packer
Date: Sun, 22 Oct 2023 23:13:49 -0700
Subject: [PATCH] Update README.md

---
 memgpt/local_llm/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/memgpt/local_llm/README.md b/memgpt/local_llm/README.md
index a1f7b759..a8c2304a 100644
--- a/memgpt/local_llm/README.md
+++ b/memgpt/local_llm/README.md
@@ -1,4 +1,4 @@
-## tl;dr - how to connect MemGPT to non-OpenAI LLMs
+## How to connect MemGPT to non-OpenAI LLMs
 
 **If you have a hosted ChatCompletion-compatible endpoint that works with function calling**:
 - simply set `OPENAI_API_BASE` to the IP+port of your endpoint:
@@ -7,7 +7,7 @@
 export OPENAI_API_BASE=...
 ```
 
-Note: for this to work, the endpoint MUST support function calls. As of 10/22/2023, most ChatCompletion endpoints do NOT support function calls, so if you want to play with MemGPT and open models, follow the instructions below.
+Note: for this to work, the endpoint **MUST** support function calls. As of 10/22/2023, most ChatCompletion endpoints do **NOT** support function calls, so if you want to play with MemGPT and open models, you probably need to follow the instructions below.
 
 ## Integrating a function-call finetuned LLM with MemGPT
 
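As a sketch of the setup the patched README describes, the snippet below points the OpenAI client at a self-hosted, ChatCompletion-compatible endpoint. The address shown is a placeholder assumption, not a value from the patch; substitute your own server's IP and port.

```shell
# Point MemGPT's OpenAI-compatible client at a hosted endpoint.
# "http://localhost:8000/v1" is a hypothetical local server address --
# replace it with the IP+port of your own endpoint.
export OPENAI_API_BASE="http://localhost:8000/v1"

# Sanity check: confirm the variable is set in the current shell.
echo "OPENAI_API_BASE=${OPENAI_API_BASE}"
```

As the patch notes, this only works if the endpoint actually supports function calls; setting the variable alone does not add that capability.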