
Added Ollama integration #22

Merged 17 commits into voideditor:main on Oct 2, 2024
Conversation

@w1gs (Contributor) commented Sep 18, 2024

This PR adds the Ollama integration. Two new settings were added for Ollama (endpoint and model). Instead of using the Ollama node library, a fetch request is made directly to the provided endpoint. A local instance of the Ollama API can be started with the command ollama serve. The OLLAMA_ORIGINS=* environment variable needs to be set so the extension is allowed to make requests to Ollama.
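
Roughly, the fetch-based streaming looks like this (a minimal sketch, not the exact code in this PR; it assumes Ollama's /api/chat endpoint, which streams newline-delimited JSON objects):

async function streamOllamaChat(
	endpoint: string,
	model: string,
	messages: { role: string, content: string }[],
	onText: (newText: string, fullText: string) => void,
) {
	const res = await fetch(`${endpoint}/api/chat`, {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		body: JSON.stringify({ model, messages, stream: true }),
	});
	const reader = res.body!.getReader();
	const decoder = new TextDecoder();
	let buffered = '';
	let fullText = '';
	while (true) {
		const { done, value } = await reader.read();
		if (done) break;
		buffered += decoder.decode(value, { stream: true });
		// each complete line is one JSON object carrying a partial message
		const lines = buffered.split('\n');
		buffered = lines.pop() ?? '';
		for (const line of lines) {
			if (!line.trim()) continue;
			const part = JSON.parse(line);
			if (part.message?.content) {
				fullText += part.message.content;
				onText(part.message.content, fullText);
			}
		}
	}
	return fullText;
}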

@andrewpareles (Contributor) commented:

Nice! Will test this in a bit. It would be great to add setup instructions for this, just added #26.

@w1gs (Contributor, Author) commented Sep 18, 2024

> Nice! Will test this in a bit. It would be great to add setup instructions for this, just added #26.

Awesome. Where would be the best place to add the instructions to set that up?

@mathewpareles (Contributor) commented Sep 19, 2024

It should probably appear in a new window when VS Code starts up, similar to the "Welcome" page. We're also open to alternatives.
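
One way to do that (a minimal sketch; the void.seenSetupPage flag and the voidSetup view type are hypothetical names, not the extension's actual code):

import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
	// show the setup page once, on first activation
	if (!context.globalState.get<boolean>('void.seenSetupPage')) {
		const panel = vscode.window.createWebviewPanel(
			'voidSetup',          // hypothetical view type
			'Void: Ollama Setup', // tab title
			vscode.ViewColumn.One,
			{}
		);
		panel.webview.html = `<h1>Ollama setup</h1>
			<p>Run <code>OLLAMA_ORIGINS="vscode-webview://*" ollama serve</code> to allow the extension to reach Ollama.</p>`;
		context.globalState.update('void.seenSetupPage', true);
	}
}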

@okxiaoliang4 mentioned this pull request on Sep 20, 2024
@BruceMacD (Contributor) left a comment:

Thanks for doing this, I was just trying a similar approach myself.

Some notes for anyone who is looking to try this:

  • Ollama doesn't allow CORS requests from VS Code extensions by default, so you'll have to stop any running Ollama server and manually set it to allow the extension origin, as w1gs documented:
    OLLAMA_ORIGINS="vscode-webview://*" ollama serve

@andrewpareles (Contributor) commented Sep 21, 2024

  1. @BruceMacD Great call. @w1gs, we should definitely use ollama/browser instead of calling fetch(). This should be easy to swap out using Bruce's code. My only change would be to make it use .then().catch() to match the convention of the rest of the file.
import { Ollama } from 'ollama/browser';

const sendOllamaMsg: SendLLMMessageFnTypeInternal = ({ messages, onText, onFinalMessage, apiConfig }) => {
	const ollama = new Ollama({ host: apiConfig.ollama.host });

	let did_abort = false
	let fullText = ''

	// if abort is called, onFinalMessage is NOT called, and no later onTexts are called either
	let abort: () => void = () => { did_abort = true }

	ollama.chat({ model: apiConfig.ollama.model, messages: messages, stream: true })
		.then(async response => {

			abort = () => {
				ollama.abort();
				did_abort = true;
			}

			try {
				for await (const part of response) {
					if (did_abort) return
					let newText = part.message.content
					fullText += newText
					onText(newText, fullText)
				}
			}
			// when the stream errors or is aborted mid-flight
			catch (e) {
				if (!did_abort) onFinalMessage(fullText)
				return
			}

			// when we get the final message on this stream
			onFinalMessage(fullText)
		})
		// when the request itself fails (e.g. the server is unreachable)
		.catch(() => {
			if (!did_abort) onFinalMessage(fullText)
		})

	return { abort };
}
  2. I'm not sure if there's a way to specify the Ollama endpoint in ollama/browser. If there isn't, I'd be fine getting rid of the apiConfig.ollama.endpoint variable and assuming users want to host on Ollama's default URL.
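
For reference, the snippet above already passes a host option to the client constructor; a minimal sketch of using that for the endpoint setting (apiConfig.ollama.endpoint is a hypothetical name):

import { Ollama } from 'ollama/browser';

// hypothetical config shape, matching the snippet above
declare const apiConfig: { ollama: { endpoint?: string } };

// 11434 is Ollama's default port
const client = new Ollama({ host: apiConfig.ollama.endpoint ?? 'http://127.0.0.1:11434' });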

@w1gs (Contributor, Author) commented Sep 21, 2024

@andrewpareles I was able to get it set up with ollama/browser and it works great. I also made some changes that detect when no API keys are set and disable the extension. When disabled, it is blurred out and not interactable. If you press cmd+l when no keys are set, a warning box appears. The extension updates as the configuration changes. I agree that the Ollama setup instructions should live somewhere, but I figured a simple enable/disable mechanism would be good for now.
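
Roughly, the detection works like this (a minimal sketch; the void configuration section, the setting keys, and the void.sendMessage command id are all hypothetical stand-ins for the real names):

import * as vscode from 'vscode';

function hasAnyApiKey(): boolean {
	const cfg = vscode.workspace.getConfiguration('void'); // hypothetical section name
	return ['anthropicApiKey', 'openAIApiKey', 'ollamaEndpoint'] // hypothetical keys
		.some(key => !!cfg.get<string>(key));
}

export function activate(context: vscode.ExtensionContext) {
	let enabled = hasAnyApiKey();
	context.subscriptions.push(
		// re-check whenever the user edits settings, so the extension state tracks the config
		vscode.workspace.onDidChangeConfiguration(e => {
			if (e.affectsConfiguration('void')) enabled = hasAnyApiKey();
		}),
		// the cmd+l entry point shows a warning instead of sending when nothing is configured
		vscode.commands.registerCommand('void.sendMessage', () => {
			if (!enabled) {
				vscode.window.showWarningMessage('No API keys are set. Add a provider key or an Ollama endpoint in the settings.');
				return;
			}
			// ...send the message as usual
		})
	);
}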

Disabled extension: (screenshot)

Warning when trying to add text when no keys are set: (screenshot)

Extension enabled: (screenshot)

@andrewpareles (Contributor) commented:

I'll review this in a bit!

@andrewpareles (Contributor) commented:

Before closing this, see #74

@w1gs (Contributor, Author) commented Oct 1, 2024

Fixed the merge conflicts.

@andrewpareles merged commit 653d5e9 into voideditor:main on Oct 2, 2024