How to use this API -- Ollama #74
Seems similar to #68.
@Cynric92 It doesn't seem like they have implemented it yet; see the code here. It may be possible to fix their limited demo code to get it working with a local OpenAI API-compatible endpoint such as `llama-server` or `aphrodite-engine`. I'll try to make a small patch to the extension, compile it, run it with normal VSCode, and see if I can get it to do anything. Right now it is basically broken for me as a local LLM user, AFAICT.
I got it working locally; here is the basic idea:

```sh
# 1. check out the project code
git clone git@github.com:voideditor/void.git

# 2. install deps
cd void/extensions/void
npm install

# 3. apply patches to get it to work with a local-LLM OpenAI endpoint,
#    e.g. `llama-server` or `aphrodite-engine`; just two files need a
#    small patch to point them at the local API endpoint:
```
```diff
diff --git a/extensions/void/src/SidebarWebviewProvider.ts b/extensions/void/src/SidebarWebviewProvider.ts
index fe01c80..e5aeabf 100644
--- a/extensions/void/src/SidebarWebviewProvider.ts
+++ b/extensions/void/src/SidebarWebviewProvider.ts
@@ -55,7 +55,7 @@ export class SidebarWebviewProvider implements vscode.WebviewViewProvider {
 		const nonce = getNonce(); // only scripts with the nonce are allowed to run, this is a recommended security measure
-		const allowed_urls = ['https://api.anthropic.com', 'https://api.openai.com', 'https://api.greptile.com']
+		const allowed_urls = ['http://127.0.0.1:8080', 'https://api.anthropic.com', 'https://api.openai.com', 'https://api.greptile.com']
 		webview.html = `<!DOCTYPE html>
 	<html lang="en">
 	<head>
diff --git a/extensions/void/src/common/sendLLMMessage.ts b/extensions/void/src/common/sendLLMMessage.ts
index ca3c34b..57d83b1 100644
--- a/extensions/void/src/common/sendLLMMessage.ts
+++ b/extensions/void/src/common/sendLLMMessage.ts
@@ -108,10 +108,16 @@ const sendOpenAIMsg: SendLLMMessageFnTypeInternal = ({ messages, onText, onFinal
 	// if abort is called, onFinalMessage is NOT called, and no later onTexts are called either
 	let abort: () => void = () => { did_abort = true }
-	const openai = new OpenAI({ apiKey: apiConfig.openai.apikey, dangerouslyAllowBrowser: true });
+	const openai = new OpenAI(
+		{
+			baseURL: 'http://127.0.0.1:8080/v1',
+			apiKey: apiConfig.openai.apikey,
+			dangerouslyAllowBrowser: true
+		});
 	openai.chat.completions.create({
-		model: 'gpt-4o-2024-08-06',
+		//model: 'gpt-4o-2024-08-06',
+		model: 'Qwen/Qwen2.5-32B-Instruct-AWQ',
 		messages: messages,
 		stream: true,
 	})
diff --git a/scripts/code.sh b/scripts/code.sh
old mode 100644
new mode 100755
```
```sh
# 4. now build it
npm run build

# 5. now run it with normal vscode (I couldn't get their fork to build);
#    pwd is `void/extensions/void`
code .
```

Okay, now configure it in normal VSCode: open the settings (Ctrl/Cmd+,) and search for the extension's settings. Finally, open the Void chat on the left, type something, and make sure it generates. Now that I have it working, I have no idea how it is "supposed" to work, as tab completion seems to do nothing and the advertised keybindings don't seem useful? https://voideditor.com/ Anyway, I'm more of a ...
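As a quick sanity check outside the extension, a short standalone script can confirm the local endpoint actually streams completions before you wire it into the build. This is a sketch, assuming the `openai` npm package and the same `llama-server`-style endpoint as in the patch above; adjust `baseURL` and `model` to match your server:

```ts
import OpenAI from 'openai';

// Smoke test for a local OpenAI-compatible endpoint (sketch).
const client = new OpenAI({
	baseURL: 'http://127.0.0.1:8080/v1',
	apiKey: 'sk-local', // placeholder; most local servers don't validate the key
});

async function main() {
	const stream = await client.chat.completions.create({
		model: 'Qwen/Qwen2.5-32B-Instruct-AWQ', // the model your server loaded
		messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
		stream: true,
	});
	// Print the streamed tokens as they arrive.
	for await (const chunk of stream) {
		process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
	}
	process.stdout.write('\n');
}

main().catch(console.error);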
I don't know how to configure this; I want to use my Ollama.
