import memgpt.local_llm.llm_chat_completion_wrappers.airoboros as airoboros

DEFAULT_ENDPOINTS = {
    "koboldcpp": "http://localhost:5001",
    "llamacpp": "http://localhost:8080",
    "lmstudio": "http://localhost:1234",
    "ollama": "http://localhost:11434",
    "webui": "http://localhost:5000",
}

DEFAULT_OLLAMA_MODEL = "dolphin2.2-mistral:7b-q6_K"

DEFAULT_WRAPPER = airoboros.Airoboros21InnerMonologueWrapper
DEFAULT_WRAPPER_NAME = "airoboros-l2-70b-2.1"
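A minimal, self-contained sketch of how these defaults might be consumed by a configure step: look up the backend's default endpoint unless the user supplies an override. The `resolve_endpoint` helper is a hypothetical illustration, not part of the MemGPT API; the endpoint table is copied inline so the snippet runs on its own.

```python
# Default local-LLM endpoints, mirroring the constants defined above.
DEFAULT_ENDPOINTS = {
    "koboldcpp": "http://localhost:5001",
    "llamacpp": "http://localhost:8080",
    "lmstudio": "http://localhost:1234",
    "ollama": "http://localhost:11434",
    "webui": "http://localhost:5000",
}


def resolve_endpoint(backend, override=None):
    """Return the user-supplied endpoint if given, else the backend's default."""
    if override:
        return override
    try:
        return DEFAULT_ENDPOINTS[backend]
    except KeyError:
        raise ValueError(
            f"Unknown backend {backend!r}; expected one of {sorted(DEFAULT_ENDPOINTS)}"
        )


print(resolve_endpoint("ollama"))  # -> http://localhost:11434
print(resolve_endpoint("webui", "http://remote:5000"))  # override wins
```

Raising on an unknown backend (rather than silently returning `None`) matches the PR history's emphasis on clearer CLI error messages.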