47 changes: 47 additions & 0 deletions docs/source/deployment/frameworks/anything-llm.md
@@ -0,0 +1,47 @@
(deployment-anything-llm)=

# Anything LLM

[Anything LLM](https://github.com/Mintplex-Labs/anything-llm) is a full-stack application that lets you turn any document, resource, or piece of content into context that any LLM can use as a reference while chatting.

You can pair it with vLLM as the large language model (LLM) backend, since vLLM exposes OpenAI-compatible endpoints that Anything LLM can connect to.

## Prerequisites

- Set up the vLLM environment, e.g. by installing vLLM as sketched below:
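
As a minimal sketch of this step, assuming a machine with a supported GPU and a recent Python already available, vLLM can be installed from PyPI:

```console
# Install vLLM into the current Python environment (a virtual environment is recommended)
pip install vllm
```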

## Deploy

- Start the vLLM server with a supported chat completion model, e.g.:

```console
vllm serve Qwen/Qwen1.5-32B-Chat-AWQ --max-model-len 4096
```
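
Once the server is running, you can optionally confirm that the OpenAI-compatible API is reachable before configuring Anything LLM. A quick check, assuming the default host and port (`localhost:8000`):

```console
curl http://localhost:8000/v1/models
```

The returned model `id` should match the name entered as the Chat Model Name below.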

- Download and install [Anything LLM desktop](https://anythingllm.com/desktop).

- Open the settings at the bottom left, go to AI Providers --> LLM, and configure the provider (a sanity check for these values is sketched after the screenshot):
  - LLM Provider: Generic OpenAI
  - Base URL: `http://{vllm server host}:{vllm server port}/v1`
  - Chat Model Name: `Qwen/Qwen1.5-32B-Chat-AWQ`

:::{image} /assets/deployment/anything-llm-provider.png
:::
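
If chatting in the workspace later fails, the same Base URL and model name can be exercised directly with a minimal chat completion request. This is only a sanity-check sketch, assuming the server runs on `localhost:8000`:

```console
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "Qwen/Qwen1.5-32B-Chat-AWQ",
        "messages": [{"role": "user", "content": "Hello!"}]
    }'
```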

- Go back to the home page, click New Workspace to create a `vllm` workspace, and start chatting:

:::{image} /assets/deployment/anything-llm-chat-without-doc.png
:::

- Click the upload button:
  - Upload the document.
  - Select the document and move it to the workspace.
  - Save and embed it.

:::{image} /assets/deployment/anything-llm-upload-doc.png
:::

- Chat again, this time with the uploaded document as context:

:::{image} /assets/deployment/anything-llm-chat-with-doc.png
:::
1 change: 1 addition & 0 deletions docs/source/deployment/frameworks/index.md
@@ -3,6 +3,7 @@
:::{toctree}
:maxdepth: 1

anything-llm
bentoml
cerebrium
dstack