
Commit
Update raycast-ollama extension
- Merge pull request raycast#9 from MassimilianoPasquini97/dev
- Updated README.md and CHANGELOG.md files.
- Load PDF and Text based files on prompt for query chains.
- Code split into multiple files
- Merge pull request raycast#6 from MassimilianoPasquini97/OllamaResponseApiFix
- Updated changelog.md
- TextField appears if ModelsOnRegistry is undefined
- Fixed ModelsOnRegistry.lengh undefined
- Deleted 'error' message on event emitter
- Deleted field no longer used by Ollama Generate Response API
- CHANGELOG.md update
- Merge pull request raycast#5 from MassimilianoPasquini97/clipboard_fallback
- New Preference 'Enable Input Source Fallback'.
- New Preference 'Input Source'
- Removed unused embedding from codebase
- Merge branch 'contributions/merge-1697691587951609000'
- Pull contributions
- Updated CHANGELOG.md
- Updated README.md with new model name.
- Metrics metadata now available on Chat Command.
- Ollama Host is now configurable through Preferences.
- New Action.Open to quickly go to 'Manage Models'.
- Moved Model preferences to LocalStorage.
- Updated Models Library link on README.md
- Reduced re-rendering on models downloading.
- Deleted navigationTitle from Form.
- Last fixes before publish.
- Improvement on \'Chat With Ollama\' ActionPanel
- New command \'Manage Models\'
- OllamaApiTags() function returns Promise<OllamaApiTagsResponse>
- Error Handling for \'ollama-custom-create\'
- Implemented new command \'Create Custom Command\'
- Multiple chat saving feature
- Conversation is now saved only when inference is done.
- Chat is now saved on LocalStorage
- First implementation of a chat command.
- [Improvement and BugFix] - 2023-08-12
- Merge pull request raycast#2 from suhaildawood/main
- Changed CHANGELOG.md file.
- feat: support for llama2:70b
- Import optimized images
- Pull contributions
- CHANGELOG Update
- Updated README and minor fix
- Fixed CHANGELOG and README
- [Improvement] - 2023-07-31
- [Improvement] - 2023-07-30 v2
- [Improvement] - 2023-07-30
- [Code Improvement and BugFix] - 2023-07-29
- Ran `ray lint --fix`
- Added git repository
- Initial commit
- Initial commit
MassimilianoPasquini97 committed Nov 30, 2023
1 parent d59a98e commit 31859ab
Showing 32 changed files with 4,567 additions and 1,044 deletions.
5 changes: 5 additions & 0 deletions extensions/raycast-ollama/CHANGELOG.md
@@ -1,5 +1,10 @@
# raycast-ollama Changelog

## [Improvement] - 2023-11-30

- [Improvement] Query your PDF or text-based files with Ollama. More information on how to use this is in README.md.
- [Improvement] The 'Manage Models' Command can now show all Modelfile parameters. If a specific parameter isn't set in the Modelfile, the default value is displayed.

## [BugFix] - 2023-11-5

- [BugFix] Fixed error `ModelsOnRegistry.lengh undefined`.
26 changes: 15 additions & 11 deletions extensions/raycast-ollama/README.md
@@ -1,21 +1,25 @@
# Raycast Ollama
<p align="center">
<img src="assets/icon.png" height="128">
<h1 align="center">Raycast Ollama</h1>
</p>

Use [Ollama](https://ollama.ai) for local llama inference on Raycast.

## Requirements

1. Ollama installed and running.
2. At least one model installed. Use the 'Manage Models' command or the ollama CLI to pull models.
[Ollama](https://ollama.ai) installed and running on your Mac. At least one model needs to be installed through the Ollama CLI or with the 'Manage Models' Command. You can find all available models [here](https://ollama.ai/library).

```bash
ollama pull orca-mini
ollama pull llama2
```
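The extension discovers installed models through Ollama's REST API (`GET http://localhost:11434/api/tags`), which is what the 'Manage Models' Command relies on. A minimal sketch of parsing that response, assuming a trimmed-down response shape; the sample payload below is illustrative, not real output:

```typescript
// Shape of Ollama's GET /api/tags response, reduced to the fields used here.
interface OllamaTagsResponse {
  models: { name: string; size: number }[];
}

// Hypothetical sample payload, standing in for a live call to
// http://localhost:11434/api/tags on a machine with two models pulled.
const sample: OllamaTagsResponse = {
  models: [
    { name: "llama2:latest", size: 3825819519 },
    { name: "orca-mini:latest", size: 1928446602 },
  ],
};

// Extract just the model names, e.g. for a model-picker dropdown.
const installed = sample.models.map((m) => m.name);
console.log(installed); // → ["llama2:latest", "orca-mini:latest"]
```

If `installed` comes back empty, no model has been pulled yet and inference commands cannot run.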
## How to Use

## Use a different model
### Command: Chat With Ollama

This plugin allows you to select a different model for each command. Keep in mind that you need the corresponding model installed on your machine. You can find all available models [here](https://ollama.ai/library).
Chat with your preferred model from Raycast, with the following features:

## Create your own custom commands
- Save a conversation with the `CMD+S` keyboard shortcut. You can access saved conversations with the `CMD+P` keyboard shortcut.
- Change model with the `CMD+M` keyboard shortcut. For embedding, a model with fewer parameters is recommended for better performance.
- Copy your Question, Answer, or even the entire Chat to the clipboard.
- You can now ask the model about one or more files. Select the files using `CMD+F`; at this stage only text-based files and PDFs are supported. Then use the `/file` tag to ask questions about the selected files. By default the 'Stuff' chain is used; you can change the chain type from the 'Document Loader' submenu. This feature is currently experimental.
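A 'Stuff' chain is the simplest document-question strategy: the selected files' contents are concatenated ("stuffed") into a single prompt that is sent to the model. A minimal sketch of the idea, with illustrative names rather than the extension's actual code:

```typescript
// Sketch of a "Stuff" document chain: every loaded document is placed
// verbatim into one prompt. Names here are hypothetical; the real
// extension loads PDFs/text files and sends the prompt to Ollama.
interface LoadedDocument {
  source: string;  // path the text was loaded from
  content: string; // extracted text
}

function buildStuffPrompt(question: string, docs: LoadedDocument[]): string {
  // Join all documents, each prefixed with its source for attribution.
  const context = docs
    .map((d) => `--- ${d.source} ---\n${d.content}`)
    .join("\n\n");
  return `Use the following documents to answer the question.\n\n${context}\n\nQuestion: ${question}\nAnswer:`;
}

// Example usage:
const prompt = buildStuffPrompt("What is the project license?", [
  { source: "LICENSE.txt", content: "MIT License..." },
]);
```

Stuffing works as long as the combined documents fit in the model's context window; other chain types split or summarize documents first, which is why the extension lets you switch chain type.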

With '***Create Custom Command***' you can create your own custom command or chatbot using whatever model you want.
### Command: Create Custom Commands

With '***Create Custom Command***' you can create your own custom Command using whatever model you want.
