Commit Graph

6949 Commits

Author SHA1 Message Date
Vivian Fang
87fc69baae Better error message printing for function call failing (#291)
* Better error message printing for function call failing

* only one import traceback

* don't forward entire stack trace to memgpt
2023-11-06 15:32:58 -08:00
Vivian Fang
4edba17419 Better interface output for function calls (#296)
Co-authored-by: Charles Packer <packercharles@gmail.com>
2023-11-06 15:21:30 -08:00
Charles Packer
fe2d8b2b2f add ollama support (#314)
* untested

* patch

* updated

* clarified using tags in docs

* tested ollama, working

* fixed template issue by creating dummy template, also added missing context length indicator

* moved count_tokens to utils.py

* clean
2023-11-06 15:11:22 -08:00
Sarah Wooders
cb12f7043b Add memgpt version command and package version (#336) 2023-11-06 13:38:50 -08:00
Sarah Wooders
d9f435b405 Update quickstart.md to show flag list properly 2023-11-06 13:09:38 -08:00
Vivian Fang
733ad56a91 Add autogen+localllm docs (#335)
Co-authored-by: Jirito0 <jirito0@users.noreply.github.com>
2023-11-06 13:08:12 -08:00
Sarah Wooders
6217234942 Fix README local LLM link 2023-11-06 13:05:24 -08:00
Sarah Wooders
1595897fb8 Remove redundant docs from README (#334) 2023-11-06 13:01:32 -08:00
Hans Raaf
a9e91e120f Stop the app from repeating the user message in normal use. (#304)
- Removed repeating every user message as if in debug mode
- Re-added the "dump" flag for the user message, to make it look nicer.
  I may "reformat" other messages too when dumping, but that was what
  stuck out to me as unpleasant.
2023-11-06 12:58:28 -08:00
Charles Packer
5ac8635446 cleanup #326 (#333) 2023-11-06 12:57:19 -08:00
borewik
dbbb3fc14b Update chat_completion_proxy.py (#326)
grammar_name has to be defined; otherwise there's an issue with line 92
2023-11-06 12:53:17 -08:00
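The fix above is essentially about initializing a variable before its first use. A minimal sketch of the pattern, with hypothetical names (the actual chat_completion_proxy.py code differs):

```python
def build_request_payload(settings: dict) -> dict:
    """Define grammar_name up front so it always exists, even when no
    grammar was configured (hypothetical sketch of the fix's pattern)."""
    grammar_name = settings.get("grammar")  # None when no grammar is configured
    payload = {"prompt": settings.get("prompt", "")}
    if grammar_name is not None:
        payload["grammar"] = grammar_name
    return payload
```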
Charles Packer
f5e6497668 patch in-chat command info (#332) 2023-11-06 12:50:27 -08:00
Charles Packer
caba2f468c Create docs pages (#328)
* Create docs  (#323)

* Create .readthedocs.yaml

* Update mkdocs.yml

* update

* revise

* syntax

* syntax

* syntax

* syntax

* revise

* revise

* spacing

* Docs (#327)

* add stuff

* patch homepage

* more docs

* updated

* updated

* refresh

* refresh

* refresh

* update

* refresh

* refresh

* refresh

* refresh

* missing file

* refresh

* refresh

* refresh

* refresh

* fix black

* refresh

* refresh

* refresh

* refresh

* add readme for just the docs

* Update README.md

* add more data loading docs

* cleanup data sources

* refresh

* revised

* add search

* make prettier

* revised

* updated

* refresh

* favi

* updated

---------

Co-authored-by: Sarah Wooders <sarahwooders@gmail.com>
2023-11-06 12:38:49 -08:00
Charles Packer
cc1ce0ce33 Remove embeddings as argument in archival_memory.insert (#284) 2023-11-05 12:48:22 -08:00
Sarah Wooders
d9b9ad4860 Fix formatting in README.md 2023-11-05 11:18:38 -08:00
Sarah Wooders
3fd9f1b8e4 Fix: imported wrong storage connector (#320) 2023-11-05 10:19:33 -08:00
Dividor
b65972a1dc Aligned code with README that environment variable for Azure embeddings should be AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT (#308) 2023-11-05 10:01:02 -08:00
Sarah Wooders
d2afc1e86f Don't import postgres storage if not specified in config (#318) 2023-11-05 09:52:18 -08:00
Robin Goetz
ca0ad1ecc1 fix: import PostgresStorageConnector only if postgres is selected as storage type (#310) 2023-11-05 09:49:05 -08:00
Vivian Fang
f18429c416 Bump version to 0.1.18-alpha.1 2023-11-04 12:08:25 -07:00
Charles Packer
e90c00ad63 Add grammar-based sampling (for webui, llamacpp, and koboldcpp) (#293)
* add llamacpp server support

* use gbnf loader

* cleanup and warning about grammar when not using llama.cpp

* added memgpt-specific grammar file

* add grammar support to webui api calls

* black

* typo

* add koboldcpp support

* no more defaulting to webui, should error out instead

* fix grammar

* patch kobold (testing, now working) + cleanup log messages

Co-Authored-By: Drake-AI <drake-ai@users.noreply.github.com>
2023-11-04 12:02:44 -07:00
danx0r
2f56e0eaf5 FIx #261 (#300)
* should fix issue 261 - pickle fail on DotDict class

* black patch

---------

Co-authored-by: cpacker <packercharles@gmail.com>
2023-11-03 23:33:59 -07:00
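A common root cause of a dict-with-attribute-access class failing to pickle is `__getattr__` raising `KeyError` where pickle's protocol probes expect `AttributeError`. A hedged sketch of that fix (the project's actual DotDict may differ):

```python
import pickle

class DotDict(dict):
    """Dict subclass with attribute access. Raising AttributeError (not
    KeyError) on a missing key is what keeps pickle and copy happy: both
    probe for optional special methods such as __getstate__ and expect
    AttributeError when they are absent."""

    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        self[key] = value

d = DotDict(role="user", content="hi")
restored = pickle.loads(pickle.dumps(d))  # round-trips cleanly
```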
Charles Packer
2d57564c35 make timezone local by default (#298)
Co-authored-by: orderwat <github@oderwat.de>
2023-11-03 21:15:15 -07:00
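A minimal sketch of defaulting timestamps to the machine's local timezone (the format string is an assumption, not the real helper's):

```python
from datetime import datetime, timezone

def get_local_time() -> str:
    """Take a timezone-aware UTC timestamp and convert it to the
    machine's local timezone before formatting, instead of reporting UTC."""
    local_dt = datetime.now(timezone.utc).astimezone()
    return local_dt.strftime("%Y-%m-%d %I:%M:%S %p %Z%z")
```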
Hans Raaf
71d696dc9e I added a "/retry" command to retry for getting another answer. (#188)
* I added a "/retry" command to retry for getting another answer.

- Implemented to pop messages until hitting the last user message, then
  extract the user's last message and send it again. This will also
  work with state files and after manually popping messages.
- Updated the README to include /retry
- Updated the README for "pop" with parameter and changed the default to 3,
  as this will pop "function/assistant/user", which is the usual
  turnaround.

* disclaimer

---------

Co-authored-by: Charles Packer <packercharles@gmail.com>
2023-11-03 21:04:37 -07:00
Hans Raaf
dcdfa04fc0 I added commands to shape the conversation: (#218)
* I added commands to shape the conversation:

`/rethink <text>` will change the internal dialog of the last assistant message.
`/rewrite <text>` will change the last answer of the assistant.

Both commands can be used to change how the conversation continues in
some pretty drastic and powerful ways.

* remove magic numbers

* add disclaimer

---------

Co-authored-by: cpacker <packercharles@gmail.com>
2023-11-03 20:57:43 -07:00
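A hedged sketch of the shared mechanic behind both commands: walk backwards to the last assistant message and overwrite one field, with 'content' standing in for the internal dialog (/rethink) and 'answer' for the visible reply (/rewrite). Both field names are illustrative, not the project's actual schema.

```python
def edit_last_assistant_message(messages: list[dict], field: str, new_text: str) -> bool:
    """Find the most recent assistant message and overwrite one of its
    fields; returns False when no assistant message exists."""
    for msg in reversed(messages):
        if msg.get("role") == "assistant":
            msg[field] = new_text
            return True
    return False

chat = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "User seems friendly.", "answer": "Hello!"},
]
edit_last_assistant_message(chat, "content", "User may be testing me.")  # /rethink
```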
Hans Raaf
9189a7bf26 I made dump show more messages and added a count (the last x) (#204)
* I made dump show more messages and added a count (the last x)

There seem to have been some implementation changes, so the current
dump message helper functions do not show much useful info.

I changed it so that you can `dump 5` (last 5 messages) and it will
print user-readable output. This gives you a better understanding of
what is going on.

As some messages are still not shown, I also show the (reverse) index of each
printed message, so one can see how far to "pop" to reach a specific point
without getting into the raw dump.

* black

* patch

---------

Co-authored-by: Charles Packer <packercharles@gmail.com>
2023-11-03 20:47:23 -07:00
Charles Packer
94893b4bd5 try to patch hanging test (#295)
* try to patch hanging test

* add a timeout on the test
2023-11-03 19:11:29 -07:00
Vivian Fang
1871823c99 hotfix DummyArchivalMemoryWithFaiss 2023-11-03 16:41:06 -07:00
cpacker
e0ecd43d96 hotfix 2023-11-03 16:25:39 -07:00
Sarah Wooders
b9ce763fda VectorDB support (pgvector) for archival memory (#226) 2023-11-03 16:19:15 -07:00
Sarah Wooders
c1fd8d6df1 Make CLI agent flag errors clearer, and don't throw an error if flags don't contradict existing agent config (#290) 2023-11-03 14:13:44 -07:00
Charles Packer
25dd225d04 strip '/' and use osp.join (Windows support) (#283)
* strip '/' and use osp.join

* grepped for MEMGPT_DIR, found more places to replace '/'

* typo

* grep pass over filesep

---------

Co-authored-by: Vivian Fang <hi@vivi.sh>
2023-11-03 13:54:29 -07:00
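The portability fix boils down to composing paths with os.path.join rather than concatenating with a hard-coded '/'. A sketch with a hypothetical path layout:

```python
import os

MEMGPT_DIR = os.path.join(os.path.expanduser("~"), ".memgpt")

def agent_config_path(agent_name: str) -> str:
    """Join path components with os.path.join so the platform's
    separator (backslash on Windows) is used instead of '/'."""
    return os.path.join(MEMGPT_DIR, "agents", agent_name, "config.json")
```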
Sarah Wooders
62c1128252 Don't prompt for selecting existing agent if there is a --persona/human/model flag (#289) 2023-11-03 12:45:22 -07:00
Charles Packer
6b4008c72e more stop tokens (#288) 2023-11-03 12:25:37 -07:00
tractorjuice
908e6d2dcd Update openai_tools.py to use delay (#159)
* Update openai_tools.py

Updated to use the 'delay' value instead of a hard-coded wait

* also use delay instead of 62 in async completions
2023-11-03 09:46:40 -07:00
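A hedged sketch of using a growing delay rather than a fixed 62-second sleep between retries (parameter names are illustrative, not the actual openai_tools.py signature):

```python
import random
import time

def retry_with_backoff(func, max_retries: int = 5, initial_delay: float = 1.0):
    """Call func, retrying on failure with an exponentially growing,
    jittered delay instead of a fixed wait."""
    delay = initial_delay
    for attempt in range(max_retries + 1):
        try:
            return func()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(delay)
            delay *= 2 * (1 + random.random())  # exponential backoff with jitter
```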
Sarah Wooders
3fead07dc2 Fix --data-source in README 2023-11-03 08:55:17 -07:00
Vivian Fang
81f33f94a0 Bump version to 0.1.17 2023-11-03 00:21:08 -07:00
Charles Packer
437306388f Improvements to JSON handling for local LLMs (#269)
* some extra json hacks

* add 'smart' json loader to other wrappers

* added chatml related stop tokens by default
2023-11-03 00:18:31 -07:00
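The 'smart' loader idea — trying progressively repaired variants of a local model's truncated JSON — can be sketched as follows (the actual repair list in the project differs):

```python
import json

def clean_json(raw: str) -> dict:
    """Try the raw string first, then variants with a missing closing
    brace or closing quote appended, returning the first that parses."""
    candidates = [raw, raw + "}", raw + '"}', raw + '"}}']
    for candidate in candidates:
        try:
            return json.loads(candidate)
        except json.JSONDecodeError:
            continue
    raise ValueError(f"Could not parse JSON: {raw!r}")
```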
Charles Packer
fde0087a19 Patch summarize when running with local llms (#213)
* trying to patch summarize when running with local llms

* moved token magic numbers to constants, made special localllm exception class (TODO catch these for retry), fix summarize bug where it exits early if empty list

* missing file

* raise an exception on no-op summary

* changed summarization logic to walk forwards in list until fraction of tokens in buffer is reached

* added same diff to sync agent

* reverted default max tokens to 8k, cleanup + more error wrapping for better error messages that get caught on retry

* patch for web UI context limit error propagation, using best guess for what the web UI error message is

* add webui token length exception

* remove print

* make no wrapper warning only pop up once

* cleanup

* Add errors to other wrappers

---------

Co-authored-by: Vivian Fang <hi@vivi.sh>
2023-11-02 23:44:02 -07:00
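The walk-forward selection described above can be sketched with a hypothetical signature: accumulate tokens from the oldest message until a fraction of the context window is covered, and raise on a no-op summary.

```python
def messages_to_summarize(messages: list[str], count_tokens, context_window: int,
                          trim_fraction: float = 0.75) -> list[str]:
    """Walk forwards from the oldest message, accumulating tokens until
    trim_fraction of the context window is reached; that prefix gets
    summarized and the remainder stays in context."""
    token_budget = int(context_window * trim_fraction)
    cutoff, running = 0, 0
    for i, msg in enumerate(messages):
        running += count_tokens(msg)
        cutoff = i + 1
        if running >= token_budget:
            break
    if cutoff >= len(messages):
        raise ValueError("no-op summary: would consume the entire buffer")
    return messages[:cutoff]
```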
Bozhao
4789027c06 Fix typo in system base prompt (#189) 2023-11-02 23:37:52 -07:00
cpacker
c2c4b87342 cleanup 2023-11-02 22:37:17 -07:00
Charles Packer
ddc510306d typos (#268) 2023-11-02 22:33:25 -07:00
Vivian Fang
6f188cffc4 Allow loading in a directory non-recursively (#246) 2023-11-02 10:04:01 -07:00
Robin Goetz
30bb866142 fix: LocalArchivalMemory prints ref_doc_info only if not using EmptyIndex (#240)
Currently, if you run the /memory command, the application breaks if the LocalArchivalMemory
has no existing archival storage and defaults to the EmptyIndex. This is caused by EmptyIndex
not having a ref_doc_info implementation and throwing an exception when it is used to print
the memory information to the console. This hotfix simply makes sure that we do not call
the function when using EmptyIndex, and instead prints a message to the console indicating
an EmptyIndex is in use.
2023-11-01 18:45:04 -07:00
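A hedged sketch of the guard (EmptyIndex here is a stand-in class, not the real llama-index type):

```python
class EmptyIndex:
    """Stand-in for an index type with no ref_doc_info implementation."""

def describe_archival_memory(index) -> str:
    """Only touch ref_doc_info when the index actually provides it;
    otherwise report that archival storage is empty instead of crashing."""
    if isinstance(index, EmptyIndex) or not hasattr(index, "ref_doc_info"):
        return "Archival memory is empty (no documents loaded)."
    return f"Archival memory holds {len(index.ref_doc_info)} document(s)."
```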
Vivian Fang
94fbb76596 Update README.md 2023-11-01 18:37:36 -07:00
Vivian Fang
b9c229de35 Update README.md 2023-11-01 18:32:19 -07:00
Vivian Fang
467ec5537e Typo in interface 2023-11-01 18:12:49 -07:00
Charles Packer
77fd987f2a Add basic tests that are run on PR/main (#228)
* make tests dummy to make sure github workflow is fine

* black test

* strip circular import

* further dummy-fy the test

* use pexpect

* need y

* Update tests.yml

* Update tests.yml

* added prints

* sleep before decode print

* updated test to match legacy flow

* revising test where it fails

* comment out enter your message check for now, pexpect seems to be stuck on only setting the bootup message

* weird now it's not showing Bootup sequence complete?

* added debug

* handle none

* allow more time

* loosen string check

* add enter after commands

* modify saved component snippet

* add try again check

* more sendlines

* more excepts

* test passing locally

* Update tests.yml

* dont clearline

* add EOF catch that seems to only happen on GitHub Actions (ubuntu) but not macos

* more eof

* try flushing

* add strip_ui flag

* fix archival_memory_search and memory print output

* Don't use questionary for input if strip_ui

* Run black

* Always strip UI if TEST is set

* Add another flush

* expect Enter your message

* more debug prints

* one more shot at printing debug info

* stray fore color in stripped ui

* tests pass locally

* cleanup

---------

Co-authored-by: Vivian Fang <hi@vivi.sh>
2023-11-01 17:01:45 -07:00
Charles Packer
438d8e8cee Update README.md 2023-11-01 16:46:08 -07:00
Vivian Fang
d7d6fbc42a Run tests on PRs to main 2023-11-01 13:57:21 -07:00