# MemGPT over LlamaIndex API Docs
MemGPT enables you to chat with your data -- try running this example to talk to the LlamaIndex API docs!
1. Obtain the embeddings and docs index, either by:

   a. Downloading them from HuggingFace:

      ```sh
      # Make sure you have git-lfs installed (https://git-lfs.com)
      git lfs install
      git clone https://huggingface.co/datasets/MemGPT/llamaindex-api-docs
      ```

      -- OR --

   b. Building the index yourself:

      1. Build the llama_index API docs with `make text` (instructions here). Copy the generated `_build/text` folder to this directory.
      2. Generate the embeddings and FAISS index:

         ```sh
         python3 scrape_docs.py
         python3 generate_embeddings_for_docs.py all_docs.jsonl
         python3 build_index.py --embedding_files all_docs.embeddings.jsonl --output_index_file all_docs.index
         ```
2. In the root `MemGPT` directory, run:

   ```sh
   python3 main.py --archival_storage_faiss_path=<ARCHIVAL_STORAGE_FAISS_PATH> --persona=memgpt_doc --human=basic
   ```

   where `ARCHIVAL_STORAGE_FAISS_PATH` is the directory where `all_docs.jsonl` and `all_docs.index` are located. If you downloaded from HuggingFace, it will be `memgpt/personas/docqa/llamaindex-api-docs`.
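Under the hood, the FAISS index maps a query embedding to its nearest document embeddings. As a rough illustration (not MemGPT's actual code), here is the nearest-neighbor lookup that a flat L2 index performs, with tiny made-up 4-dimensional vectors standing in for real document embeddings:

```python
import numpy as np

# Toy stand-ins for document embeddings; real ones come from
# generate_embeddings_for_docs.py and have many more dimensions.
doc_embeddings = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.7, 0.7, 0.0, 0.0],
])

def nearest_docs(query, k=2):
    """Return indices of the k documents closest to `query` by L2
    distance -- the same lookup a flat FAISS index accelerates."""
    dists = np.linalg.norm(doc_embeddings - query, axis=1)
    return np.argsort(dists)[:k]

print(nearest_docs(np.array([0.9, 0.1, 0.0, 0.0])))  # doc 0 closest, then doc 2
```

The FAISS index file built above serves the same role at scale, so that each archival-memory query only has to embed the question and search the index.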
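If you build the index yourself, the intermediate files (`all_docs.jsonl`, `all_docs.embeddings.jsonl`) are JSON-lines files. A minimal sketch for inspecting one, assuming only the one-JSON-object-per-line framing and no particular schema:

```python
import json

def iter_jsonl(path):
    """Yield one parsed object per non-empty line of a JSON-lines file."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# e.g. count the scraped passages before embedding them:
# n = sum(1 for _ in iter_jsonl("all_docs.jsonl"))
```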
