Add documentation for using Hugging Face models for embeddings (#549)
@@ -1,8 +1,6 @@
### LLM Backends
You can use MemGPT with various LLM backends, including the OpenAI API, Azure OpenAI, and various local (or self-hosted) LLM backends.
#### OpenAI
## OpenAI
To use MemGPT with an OpenAI API key, simply set the `OPENAI_API_KEY` variable:
```sh
export OPENAI_API_KEY=YOUR_API_KEY # on Linux/Mac
@@ -10,7 +8,11 @@ set OPENAI_API_KEY=YOUR_API_KEY # on Windows
$Env:OPENAI_API_KEY = "YOUR_API_KEY" # on Windows (PowerShell)
```
#### Azure
#### OpenAI Proxies
To use custom OpenAI endpoints, specify a proxy URL when running `memgpt configure` to set the custom endpoint as the default endpoint.
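As a sketch of that flow (the proxy URL is a hypothetical placeholder, and the exact prompt wording may differ across MemGPT versions):

```sh
# Run the interactive configurator; when asked for the model endpoint,
# supply your proxy URL (hypothetical placeholder shown):
memgpt configure
#   OpenAI endpoint: https://my-openai-proxy.example.com/v1
```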
## Azure
To use MemGPT with Azure, export the following variables and then re-run `memgpt configure`:
```sh
# see https://github.com/openai/openai-python#microsoft-azure-endpoints
@@ -27,8 +29,5 @@ Replace `export` with `set` or `$Env:` if you are on Windows (see the OpenAI exa
Note: **your Azure endpoint must support functions** or you will get an error. See [this GitHub issue](https://github.com/cpacker/MemGPT/issues/91) for more information.
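The diff elides the variable list itself; as a hedged sketch only — the variable names below are assumptions based on the linked openai-python Azure docs, so confirm them against the MemGPT docs for your version:

```sh
# Assumed variable names -- verify against your MemGPT version before relying on them
export AZURE_OPENAI_KEY="YOUR_AZURE_KEY"                        # placeholder key
export AZURE_OPENAI_ENDPOINT="https://example.openai.azure.com" # placeholder resource URL
export AZURE_OPENAI_VERSION="2023-08-01-preview"                # example API version
```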
#### Custom endpoints
To use custom OpenAI endpoints, run `export OPENAI_API_BASE=<MY_CUSTOM_URL>` and then re-run `memgpt configure` to set the custom endpoint as the default endpoint.
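For example (the URL below is a hypothetical placeholder for your own OpenAI-compatible deployment):

```sh
# Point the OpenAI client at a custom, OpenAI-compatible endpoint
export OPENAI_API_BASE="https://my-llm-host.example.com/v1"  # hypothetical URL
echo "$OPENAI_API_BASE"
# then re-run `memgpt configure` to save it as the default endpoint
```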
#### Local LLMs
Setting up MemGPT to run with local LLMs requires a bit more setup; follow [the instructions here](../local_llm).
## Local Models & Custom Endpoints
MemGPT supports open source models, whether run locally or served as a hosted service. Setting up MemGPT to run with open models requires a bit more setup; follow [the instructions here](../local_llm).
@@ -99,6 +99,7 @@ Currently, MemGPT supports the following backends:
* [LM Studio](../lmstudio) (Mac, Windows) (❌ does not support grammars)
* [koboldcpp](../koboldcpp) (Mac, Windows, Linux) (✔️ supports grammars)
* [llama.cpp](../llamacpp) (Mac, Windows, Linux) (✔️ supports grammars)
* [vllm](../vllm) (Mac, Windows, Linux) (❌ does not support grammars)
If you would like us to support a new backend, feel free to open an issue or pull request on [the MemGPT GitHub page](https://github.com/cpacker/MemGPT)!
@@ -3,3 +3,4 @@ furo
myst-parser
mkdocs
mkdocs-material
pymdown-extensions
@@ -13,12 +13,13 @@ nav:
- 'Example - chat with your data': example_data.md
- 'Configuration': config.md
- 'External data sources': data_sources.md
- 'Changing the LLM backend': endpoints.md
- 'Configuring LLMs': endpoints.md
- 'Configuring embeddings': embedding_endpoints.md
- 'FAQ': cli_faq.md
- 'Discord Bot':
- 'Chatting with MemGPT Bot': discord_bot.md
- 'Local LLM':
- 'MemGPT + local LLMs': local_llm.md
- 'LLM Backends':
- 'MemGPT + open models': local_llm.md
- 'oobabooga web UI': webui.md
# - 'oobabooga web UI (on RunPod)': webui_runpod.md
- 'LM Studio': lmstudio.md
@@ -37,7 +38,7 @@ nav:
- 'Adding support for new LLMs': adding_wrappers.md
- 'Contributing to the codebase': contributing.md
theme:
name: material
features:
- announce.dismiss
- content.action.edit