Commit: Move some gifs to the assets s3 bucket and add instructions for Ollama, shareable conversations
Showing 9 changed files with 49 additions and 4 deletions.
# Shareable Chat

You can share any of your conversations by going to the three-dot menu on the conversation and selecting 'Share'. This creates a **public** link that you can share with anyone. The link opens the conversation in the state it was in when you shared it, so your later messages will not be visible to the person you shared it with.

This means you can easily share a conversation to show someone how you solved a problem, or to get help with something you're working on.

![demo of sharing a conversation](https://assets.khoj.dev/shareable_conversations.gif)
# Ollama / Khoj

You can run your own open-source models locally with Ollama and use them with Khoj.

:::info[Ollama Integration]
This integration is only relevant for self-hosted users. If you're using [Khoj Cloud](https://app.khoj.dev), you're limited to our first-party models.
:::

Khoj supports any OpenAI-API compatible server, which includes [Ollama](http://ollama.ai/). Ollama allows you to start a local server with [several popular open-source LLMs](https://ollama.com/library) directly on your own computer. Combined with Khoj, you can chat with these LLMs and use them to search your notes and documents.

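Because the server is OpenAI-API compatible, the requests Khoj sends to Ollama follow the standard chat-completions convention. A minimal sketch of that request shape, assuming Ollama's default local address (the `build_chat_request` helper below is illustrative, not part of Khoj or Ollama):

```python
import json
from urllib.parse import urljoin

# Ollama's default OpenAI-compatible base URL.
API_BASE = "http://localhost:11434/v1/"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build the JSON body for a POST to the chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

body = build_chat_request("llama3", "Summarize my notes on Ollama.")
endpoint = urljoin(API_BASE, "chat/completions")
print(endpoint)  # http://localhost:11434/v1/chat/completions
print(json.dumps(body))
```

Any client that speaks this protocol, not just Khoj, can talk to the same Ollama server, which is what makes the integration below a matter of configuration rather than code.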
While Khoj also supports locally hosted LLMs downloaded from Hugging Face, the Ollama integration is particularly useful for its ease of setup and multi-model support, especially if you're already using Ollama.

## Setup

1. Set up Ollama: https://ollama.com/
2. Start your preferred model with Ollama. For example:
   ```bash
   ollama run llama3
   ```
3. Go to the Khoj admin settings at [OpenAI Processor Conversation Config](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/).
4. Create a new config.
   - Name: `ollama`
   - Api Key: `any string`
   - Api Base Url: `http://localhost:11434/v1/` (the default for Ollama)
5. Go to [Chat Model Options](http://localhost:42110/server/admin/database/chatmodeloptions/).
6. Create a new config.
   - Name: `llama3` (replace with the name of your local model)
   - Model Type: `Openai`
   - Openai Config: `<the ollama config you created in step 4>`
   - Max prompt size: `1000` (replace with the max prompt size of your model)
7. Go to [your config](http://localhost:42110/config) and select the model you just created in the chat model dropdown.
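A common misconfiguration is an Api Base Url that isn't a valid OpenAI-compatible endpoint. A small sketch for double-checking the value from step 4, assuming Ollama's defaults (the `validate_api_base_url` helper is hypothetical, not part of Khoj or Ollama):

```python
from urllib.parse import urlparse

def validate_api_base_url(url: str) -> bool:
    """Check that the Api Base Url looks like an OpenAI-compatible endpoint:
    an http(s) URL whose path ends with /v1/ (Ollama's default)."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and parsed.path.endswith("/v1/")

print(validate_api_base_url("http://localhost:11434/v1/"))  # True
print(validate_api_base_url("localhost:11434"))             # False
```

If the URL checks out but chat still fails, you can also verify the server is reachable with `curl http://localhost:11434/v1/models`, which should list your pulled models (assuming Ollama's OpenAI-compatible API is enabled, as it is by default).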

That's it! You should now be able to chat with your Ollama model from Khoj. To add more models running on Ollama, repeat step 6 for each model.