From 262b1b4f7da5a96bdb3f8fee177ccfb8edb5d4d3 Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Mon, 1 Jan 2024 00:44:48 -0600
Subject: [PATCH 1/6] Include steps for Local LLMs

Added install instructions if running LLMs locally.
---
 docs/quickstart.md | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/docs/quickstart.md b/docs/quickstart.md
index dad5bde808..f64377cc37 100644
--- a/docs/quickstart.md
+++ b/docs/quickstart.md
@@ -11,6 +11,11 @@ To install MemGPT, make sure you have Python installed on your computer, then ru
 pip install pymemgpt
 ```
 
+If you are running LLMs locally, you will want to install MemGPT with the local dependencies by running:
+```sh
+pip install pymemgpt[local]
+```
+
 If you already have MemGPT installed, you can update to the latest version with:
 ```sh
 pip install pymemgpt -U

From 8a0b9c0d2f35f2099b41a300e301b8e56e27ba6d Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Mon, 1 Jan 2024 01:46:41 -0600
Subject: [PATCH 2/6] Add Windows warning

---
 docs/local_llm.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/local_llm.md b/docs/local_llm.md
index e72287cfa3..bbf41ae936 100644
--- a/docs/local_llm.md
+++ b/docs/local_llm.md
@@ -6,15 +6,15 @@ category: 6580da9a40bb410016b8b0c3
 
 > 📘 Need help?
 >
-> If you need help visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel.
+> Visit our [Discord server](https://discord.gg/9GEQrxmVyE) and post in the #support channel. Make sure to check the [local LLM troubleshooting page](local_llm_faq) to see common issues before raising a new issue or posting on Discord.
+
+> 📘 Using Windows?
 >
-> You can also check the [GitHub discussion page](https://github.com/cpacker/MemGPT/discussions/67), but the Discord server is the official support channel and is monitored more actively.
+> If you're using Windows and are trying to set up MemGPT with local LLMs, we recommend using Anaconda Shell, or WSL (for more advanced users).
See more Windows installation tips [here](local_llm_faq).
 
 > ⚠️ MemGPT + open LLM failure cases
 >
-> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (eg `edit_core_memory(...)` and interact with the user (`send_message(...)`), so your LLM needs generate outputs that can be parsed into MemGPT function calls.
->
-> Make sure to check the [local LLM troubleshooting page](local_llm_faq) to see common issues before raising a new issue or posting on Discord.
+> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g., `edit_core_memory(...)`) and interact with the user (`send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls. See [LINK TO FAQ] for more information.
 
 ### Installing dependencies
 To install dependencies required for running local models, run:

From 6f5bc9020fcde443c933e2c6c1e6c364a5b401bd Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Mon, 1 Jan 2024 01:53:55 -0600
Subject: [PATCH 3/6] Update installation warning for Local LLMs

Remove exact install instructions to keep the QuickStart page clean and avoid duplicating knowledge.
---
 docs/quickstart.md | 8 +++-----
 1 file changed, 3 insertions(+), 5 deletions(-)

diff --git a/docs/quickstart.md b/docs/quickstart.md
index f64377cc37..9cc10d4051 100644
--- a/docs/quickstart.md
+++ b/docs/quickstart.md
@@ -5,17 +5,15 @@ category: 6580d34ee5e4d00068bf2a1d
 ---
 
 ### Installation
+> 📘 Using Local LLMs?
+>
+> If you're using local LLMs, refer to the MemGPT + open models page [here](local_llm) for additional installation requirements.
 To install MemGPT, make sure you have Python installed on your computer, then run:
 ```sh
 pip install pymemgpt
 ```
 
-If you are running LLMs locally, you will want to install MemGPT with the local dependencies by running:
-```sh
-pip install pymemgpt[local]
-```
-
 If you already have MemGPT installed, you can update to the latest version with:
 ```sh
 pip install pymemgpt -U

From cb5925d3fab6ecbdf2d29dee248042e99cd2a351 Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Mon, 1 Jan 2024 02:05:09 -0600
Subject: [PATCH 4/6] Update local_llm_faq.md

Added WSL troubleshooting section.
---
 docs/local_llm_faq.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)

diff --git a/docs/local_llm_faq.md b/docs/local_llm_faq.md
index d77c1cd32a..c5aaa8e9b9 100644
--- a/docs/local_llm_faq.md
+++ b/docs/local_llm_faq.md
@@ -71,3 +71,19 @@ This string is not correct JSON - it is missing closing brackets and has a stray
 ### "Got back an empty response string from ..."
 
 MemGPT asked the server to run the LLM, but got back an empty response. Double-check that your server is running properly and has context length set correctly (it should be set to 8k if using Mistral 7B models).
+
+### "Unable to connect to endpoint" using Windows + WSL
+
+>⚠️ We recommend using Anaconda Shell, as WSL has been known to have issues passing network traffic between WSL and the windows host.
+
+If you still would like to try WSL, you must be on WSL version 2.0.5 or later, installed from the Microsoft Store app.
+
+You will need to verify that your WSL networking mode is set to "mirrored".
+
+You can do this by checking the `.wslconfig` file in `%USERPROFILE%`.
+
+Add the following if the file does not already contain it (if a `[wsl2]` section already exists, just add the `networkingMode=mirrored` line to it):
+```
+[wsl2]
+networkingMode=mirrored
+```

From eeec52754539ca4b48584f89fbf6fba7a0e0c6b9 Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Tue, 2 Jan 2024 09:01:02 -0600
Subject: [PATCH 5/6] Update local_llm.md

Update FAQ Link wording
---
 docs/local_llm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/local_llm.md b/docs/local_llm.md
index bbf41ae936..655fba204c 100644
--- a/docs/local_llm.md
+++ b/docs/local_llm.md
@@ -14,7 +14,7 @@ category: 6580da9a40bb410016b8b0c3
 
 > ⚠️ MemGPT + open LLM failure cases
 >
-> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g., `edit_core_memory(...)`) and interact with the user (`send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls. See [LINK TO FAQ] for more information.
+> When using open LLMs with MemGPT, **the main failure case will be your LLM outputting a string that cannot be understood by MemGPT**. MemGPT uses function calling to manage memory (e.g., `edit_core_memory(...)`) and interact with the user (`send_message(...)`), so your LLM needs to generate outputs that can be parsed into MemGPT function calls. See [the local LLM troubleshooting page](local_llm_faq) for more information.
 ### Installing dependencies
 To install dependencies required for running local models, run:

From 404b0582715c569cfacff0ef02754ef8d26c1606 Mon Sep 17 00:00:00 2001
From: SaneGaming
Date: Tue, 2 Jan 2024 09:04:44 -0600
Subject: [PATCH 6/6] Update local_llm_faq.md

Improve punctuation and add link to WSL Issue thread
---
 docs/local_llm_faq.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/local_llm_faq.md b/docs/local_llm_faq.md
index c5aaa8e9b9..7a474b0725 100644
--- a/docs/local_llm_faq.md
+++ b/docs/local_llm_faq.md
@@ -74,7 +74,8 @@ MemGPT asked the server to run the LLM, but got back an empty response. Double-c
 
 ### "Unable to connect to endpoint" using Windows + WSL
 
->⚠️ We recommend using Anaconda Shell, as WSL has been known to have issues passing network traffic between WSL and the windows host.
+>⚠️ We recommend using Anaconda Shell, as WSL has been known to have issues passing network traffic between WSL and the Windows host.
+> Check the [WSL Issue Thread](https://github.com/microsoft/WSL/issues/5211) for more information.
 
 If you still would like to try WSL, you must be on WSL version 2.0.5 or later, installed from the Microsoft Store app.
 
 You will need to verify that your WSL networking mode is set to "mirrored".
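For reference, the `.wslconfig` edit described in patches 4 and 6 can also be scripted. The sketch below is illustrative only (POSIX `sh`, run from WSL or Git Bash): `WSLCONFIG_PATH` is a hypothetical override for wherever your Windows profile's `.wslconfig` is reachable, and the script appends the mirrored-networking setting only when it is missing, so re-running it is harmless:

```shell
#!/bin/sh
# Minimal sketch of the .wslconfig check from the FAQ section above.
# WSLCONFIG_PATH is a hypothetical override; by default we assume the
# file lives under $HOME (adjust for your own %USERPROFILE% mount).
CONFIG="${WSLCONFIG_PATH:-$HOME/.wslconfig}"

# Create the file if it does not exist yet.
touch "$CONFIG"

# Append the [wsl2] section only when the setting is missing.
if ! grep -q '^networkingMode=mirrored' "$CONFIG"; then
    printf '[wsl2]\nnetworkingMode=mirrored\n' >>"$CONFIG"
fi

# Show the resulting setting.
grep 'networkingMode' "$CONFIG"
```

After saving the change, run `wsl --shutdown` from a Windows prompt so the new networking mode is picked up the next time WSL starts.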