
How to use this Api -- Ollama #74

Closed
Cynric92 opened this issue Sep 27, 2024 · 3 comments

Comments

@Cynric92

I don't know how to configure this part; I want to use my local Ollama.

(screenshot of the settings in question)

@ubergarm

Seems similar to #68

@ubergarm

ubergarm commented Sep 27, 2024

@Cynric92 doesn't seem like they have implemented it yet see the code here:

https://github.com/voideditor/void/blob/main/extensions/void/src/common/sendLLMMessage.ts#L22-L24

It may be possible to patch their limited demo code to get it working with a local OpenAI API-compatible endpoint such as llama-server or aphrodite-engine.

I'll try to make a small patch to the extension, compile it, run it with normal VSCode, and see if I can get it to do anything. Right now it is basically broken for me as a local LLM user, afaict.
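Before patching anything, it's worth confirming the local server actually speaks the OpenAI chat-completions API. A minimal sketch, assuming llama-server (or aphrodite-engine) is listening on 127.0.0.1:8080 and serving the Qwen model named below -- both are assumptions, substitute your own:

```shell
# Build the request body once so it can be inspected and reused.
BODY='{"model":"Qwen/Qwen2.5-32B-Instruct-AWQ","stream":false,"messages":[{"role":"user","content":"Say hi"}]}'

# Confirm the request body is well-formed JSON before sending anything:
echo "$BODY" | python3 -c 'import json,sys; json.load(sys.stdin); print("valid request body")'

# Then, with the local server running, send it (uncomment to use):
# curl -s http://127.0.0.1:8080/v1/chat/completions \
#   -H "Content-Type: application/json" -d "$BODY"
```

If the curl call returns a JSON completion here, the only remaining work is pointing the extension at the same base URL.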

@ubergarm

I got it working locally, here is the basic idea:

# 1. checkout the project code
git clone git@github.com:voideditor/void.git

# 2. install deps
cd void/extensions/void
npm install

# 3. apply patches to get it working with a local LLM OpenAI endpoint,
#    e.g. `llama-server` or `aphrodite-engine`. Just two files need a
#    small patch to point at the local API endpoint:

diff --git a/extensions/void/src/SidebarWebviewProvider.ts b/extensions/void/src/SidebarWebviewProvider.ts
index fe01c80..e5aeabf 100644
--- a/extensions/void/src/SidebarWebviewProvider.ts
+++ b/extensions/void/src/SidebarWebviewProvider.ts
@@ -55,7 +55,7 @@ export class SidebarWebviewProvider implements vscode.WebviewViewProvider {
 		const nonce = getNonce(); // only scripts with the nonce are allowed to run, this is a recommended security measure
 
 
-		const allowed_urls = ['https://api.anthropic.com', 'https://api.openai.com', 'https://api.greptile.com']
+		const allowed_urls = ['http://127.0.0.1:8080', 'https://api.anthropic.com', 'https://api.openai.com', 'https://api.greptile.com']
 		webview.html = `<!DOCTYPE html>
       <html lang="en">
       <head>
diff --git a/extensions/void/src/common/sendLLMMessage.ts b/extensions/void/src/common/sendLLMMessage.ts
index ca3c34b..57d83b1 100644
--- a/extensions/void/src/common/sendLLMMessage.ts
+++ b/extensions/void/src/common/sendLLMMessage.ts
@@ -108,10 +108,16 @@ const sendOpenAIMsg: SendLLMMessageFnTypeInternal = ({ messages, onText, onFinal
 	// if abort is called, onFinalMessage is NOT called, and no later onTexts are called either
 	let abort: () => void = () => { did_abort = true }
 
-	const openai = new OpenAI({ apiKey: apiConfig.openai.apikey, dangerouslyAllowBrowser: true });
+	const openai = new OpenAI(
+		{
+			baseURL: 'http://127.0.0.1:8080/v1',
+			apiKey: apiConfig.openai.apikey,
+			dangerouslyAllowBrowser: true
+		});
 
 	openai.chat.completions.create({
-		model: 'gpt-4o-2024-08-06',
+		//model: 'gpt-4o-2024-08-06',
+		model: 'Qwen/Qwen2.5-32B-Instruct-AWQ',
 		messages: messages,
 		stream: true,
 	})
diff --git a/scripts/code.sh b/scripts/code.sh
old mode 100644
new mode 100755

# 4. now build it
npm run build

# 5. now run it with normal VSCode (I couldn't get their fork to build)
# pwd is `void/extensions/void`
code .

Okay, now use normal VSCode to run the extension; it will open a new window that looks like VSCode but has a little robot chat icon on the left side. You can close the original VSCode window and leave the new one with the void extension running.

Now configure it by opening VSCode settings with Ctrl+, (Cmd+, on macOS) and typing in "void". Make it look like this to match your llama-server or aphrodite-engine settings:

(screenshot: void settings configured for a local LLM)
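For reference, a plausible llama-server (llama.cpp) invocation matching the base URL patched in above -- the GGUF path is a placeholder, and aphrodite-engine exposes the same OpenAI-compatible API if you prefer it:

```shell
# Start the local server before launching the extension; the port must
# match the baseURL patched into sendLLMMessage.ts (http://127.0.0.1:8080/v1).
# Model path is a placeholder -- substitute your own GGUF file:
#
#   llama-server -m ./qwen2.5-32b-instruct-q4_k_m.gguf --host 127.0.0.1 --port 8080
```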

Finally, open the void chat on the left, type something, and make sure it generates a response.

Now that I have it working, I have no idea how it is "supposed" to work, as tab completion seems to do nothing and the advertised keybindings don't seem useful? https://voideditor.com/

Anyway, I'm more of a vim-plus-LSPs person... hah... Have fun!
