Added Ollama integration #22
Conversation
Nice! Will test this in a bit. It would be great to add setup instructions for this; I just added #26. |
Awesome. Where would be the best place to add the instructions to set that up? |
Thanks for doing this, I was just trying a similar approach myself.
Some notes for anyone who is looking to try this:
- Ollama doesn't allow CORS requests from VSCode extensions by default, so you'll have to stop any running Ollama server and manually set it to allow the extension origin, as w1gs documented:
OLLAMA_ORIGINS="vscode-webview://*" ollama serve
Co-authored-by: Bruce MacDonald <brucewmacdonald@gmail.com>
|
@andrewpareles I was able to get it set up with ollama/browser and it works great. I also made some changes that detect when no API keys are set and disable the extension. When disabled, it is blurred out and not interactable, and if you try to send cmd+l with no keys set, a warning box appears. As the configuration updates, so does the extension. I agree the Ollama setup instructions should live somewhere, but I figured a simple enable/disable mechanism would be good for now. |
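[Editor's note: a minimal sketch of what that enable/disable check could look like in the extension's activation code. The setting names ("void.*") and the command id are hypothetical placeholders, not necessarily what this PR uses; only the VS Code configuration, command, and warning APIs shown are standard.]

```typescript
import * as vscode from 'vscode';

// Hypothetical setting keys -- the real keys in this PR may differ.
const PROVIDER_SETTINGS = ['anthropicApiKey', 'openAIApiKey', 'ollamaEndpoint'];

function hasAnyProviderConfigured(): boolean {
	const config = vscode.workspace.getConfiguration('void');
	return PROVIDER_SETTINGS.some((key) => !!config.get<string>(key));
}

export function activate(context: vscode.ExtensionContext) {
	let enabled = hasAnyProviderConfigured();

	// Re-evaluate whenever the user edits settings, so the extension
	// enables/disables itself as the configuration changes.
	context.subscriptions.push(
		vscode.workspace.onDidChangeConfiguration((e) => {
			if (e.affectsConfiguration('void')) {
				enabled = hasAnyProviderConfigured();
			}
		})
	);

	// Hypothetical command id; the PR binds something similar to cmd+l.
	context.subscriptions.push(
		vscode.commands.registerCommand('void.sendSelection', () => {
			if (!enabled) {
				vscode.window.showWarningMessage(
					'No API keys or Ollama endpoint configured. Add one in Settings to enable the extension.'
				);
				return;
			}
			// ...forward the selection to the configured provider...
		})
	);
}
```

The blurred, non-interactable state described above would presumably be handled on the webview side; this sketch only covers the configuration check and the warning on cmd+l.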
I'll review this in a bit! |
Before closing this, see #74 |
Fixed the merge conflicts. |
This PR adds the Ollama integration. Two new settings were added for Ollama (endpoint and model). Instead of using the Ollama node library, a fetch request is made directly to the provided endpoint. A local instance of the Ollama API can be started with the command ollama serve. The OLLAMA_ORIGINS=* environment variable needs to be set to allow the extension to make requests to Ollama.
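[Editor's note: for reference, a direct fetch to the Ollama endpoint could look roughly like the sketch below. The endpoint and model would come from the two new settings; the /api/generate route and the newline-delimited JSON streaming format follow Ollama's documented API, but the exact request shape used by this PR may differ.]

```typescript
// Minimal sketch: stream a completion from a local Ollama server.
// `endpoint` and `model` correspond to the two new settings added in this PR.
async function streamFromOllama(
	endpoint: string,          // e.g. "http://localhost:11434"
	model: string,             // e.g. "llama3"
	prompt: string,
	onToken: (token: string) => void
): Promise<void> {
	const res = await fetch(`${endpoint}/api/generate`, {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		body: JSON.stringify({ model, prompt, stream: true }),
	});
	if (!res.ok || !res.body) {
		throw new Error(`Ollama request failed: ${res.status}`);
	}

	// Ollama streams newline-delimited JSON objects; each chunk carries a
	// partial "response" string and a "done" flag on the final object.
	const reader = res.body.getReader();
	const decoder = new TextDecoder();
	let buffered = '';
	while (true) {
		const { value, done } = await reader.read();
		if (done) break;
		buffered += decoder.decode(value, { stream: true });
		const lines = buffered.split('\n');
		buffered = lines.pop() ?? '';
		for (const line of lines) {
			if (!line.trim()) continue;
			const chunk = JSON.parse(line);
			if (chunk.response) onToken(chunk.response);
			if (chunk.done) return;
		}
	}
}
```

Note that the server itself still has to be started with OLLAMA_ORIGINS set (as in the comment above), or the request will be rejected as a cross-origin call from the vscode-webview origin.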