
Completion on available Ollama models support for quarkus.langchain4j.ollama.chat-model.model-id property value #985

Merged (1 commit) on Sep 13, 2024

Conversation

@angelozerr (Contributor) commented Aug 20, 2024

feat: Available Ollama models support for quarkus.langchain4j.ollama.chat-model.model-id property value

This PR requires eclipse/lsp4mp#460

These models are retrieved by consuming the Ollama API ${base-url}/v1/models, where base-url is:

  • the value of the quarkus.langchain4j.ollama.base-url property declared in application.properties, or
  • http://localhost:11434 otherwise
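As a sketch, the base-url fallback described above could look like the following (class and method names here are hypothetical, not the actual extension code):

```java
import java.net.URI;

public class OllamaEndpoint {
    static final String DEFAULT_BASE_URL = "http://localhost:11434";

    // Resolve the models endpoint: use the configured
    // quarkus.langchain4j.ollama.base-url value when present,
    // otherwise fall back to the default local Ollama endpoint.
    static URI modelsEndpoint(String configuredBaseUrl) {
        String base = (configuredBaseUrl == null || configuredBaseUrl.isBlank())
                ? DEFAULT_BASE_URL
                : configuredBaseUrl;
        // Strip a trailing slash so the path concatenation stays clean.
        if (base.endsWith("/")) {
            base = base.substring(0, base.length() - 1);
        }
        return URI.create(base + "/v1/models");
    }

    public static void main(String[] args) {
        System.out.println(modelsEndpoint(null));
        // → http://localhost:11434/v1/models
        System.out.println(modelsEndpoint("http://myhost:1234/"));
        // → http://myhost:1234/v1/models
    }
}
```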

Here is the result in vscode with completion:

(screenshot)

Here is the result with validation:

(screenshot)

As the error message is generic, it is not very helpful; LSP4MP must be improved to support contributing a custom validation error message.

The Ollama models are cached, so if you add a new model in Ollama, you need to close and reopen the file to refresh the cache (not tested, but I think it should work).
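As a rough sketch of that caching behavior (hypothetical names, not the actual lsp4mp cache), the model list is loaded once and only refreshed on an explicit invalidation, which is analogous to closing and reopening the file:

```java
import java.util.List;
import java.util.function.Supplier;

public class ModelCache {
    private final Supplier<List<String>> loader;
    private List<String> cached; // null until first load

    ModelCache(Supplier<List<String>> loader) {
        this.loader = loader;
    }

    // Return the cached model list, loading it on first access only.
    synchronized List<String> get() {
        if (cached == null) {
            cached = loader.get();
        }
        return cached;
    }

    // Drop the cached list so the next get() reloads it.
    synchronized void invalidate() {
        cached = null;
    }

    public static void main(String[] args) {
        int[] calls = {0};
        ModelCache cache = new ModelCache(() -> {
            calls[0]++;
            return List.of("llama3:latest");
        });
        cache.get();
        cache.get();
        System.out.println(calls[0]); // 1: loader runs once until invalidated
        cache.invalidate();
        cache.get();
        System.out.println(calls[0]); // 2: invalidation forces a reload
    }
}
```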

@datho7561 (Contributor)

If ollama is not running, then the list of models cannot be collected. There is an error logged to the lsp4mp output. I think it's okay to not list the models, but I think we should prevent the error from being logged:

SEVERE: Error while collecting Ollama Models with 'http://localhost:11434/v1/models'.
java.net.ConnectException
	at java.net.http/jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:956)
	at java.net.http/jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:133)
	at com.redhat.quarkus.extensions.ollama.OllamaItemMetadataProvider.collectOllamaModels(OllamaItemMetadataProvider.java:129)
	at com.redhat.quarkus.extensions.ollama.OllamaItemMetadataProvider.update(OllamaItemMetadataProvider.java:105)
	at org.eclipse.lsp4mp.extensions.ExtendedMicroProfileProjectInfo.synchUpdateCustomProperties(ExtendedMicroProfileProjectInfo.java:210)
	at org.eclipse.lsp4mp.extensions.ExtendedMicroProfileProjectInfo.updateCustomProperties(ExtendedMicroProfileProjectInfo.java:195)
	at org.eclipse.lsp4mp.services.properties.PropertiesFileLanguageService.updateProperties(PropertiesFileLanguageService.java:271)
	at org.eclipse.lsp4mp.services.properties.PropertiesFileLanguageService.doDiagnostics(PropertiesFileLanguageService.java:227)
	at org.eclipse.lsp4mp.ls.properties.PropertiesFileTextDocumentService.triggerValidationFor(PropertiesFileTextDocumentService.java:347)
	at org.eclipse.lsp4mp.ls.properties.PropertiesFileTextDocumentService.lambda$triggerValidationFor$17(PropertiesFileTextDocumentService.java:332)
	at java.base/java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:1194)
	at java.base/java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:527)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:507)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1458)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:2034)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:189)
Caused by: java.net.ConnectException
	at java.net.http/jdk.internal.net.http.common.Utils.toConnectException(Utils.java:1069)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:227)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.checkRetryConnect(PlainHttpConnection.java:280)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$2(PlainHttpConnection.java:238)
	at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:978)
	at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:955)
	at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:554)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1817)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1575)
Caused by: java.nio.channels.ClosedChannelException
	at java.base/sun.nio.ch.SocketChannelImpl.ensureOpen(SocketChannelImpl.java:204)
	at java.base/sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:958)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$1(PlainHttpConnection.java:210)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:212)
	... 9 more

@datho7561 (Contributor)

Gotta update my ollama; I'm on 0.1.20 and it looks like the /v1/models endpoint doesn't exist in that version.

@angelozerr (Contributor, Author)

> If ollama is not running, then the list of models cannot be collected. [...] I think we should prevent the error from being logged.

(stack trace omitted; see @datho7561's comment above)

I did that to understand some trouble with ollama. Perhaps a better thing to do is to report this error as an LSP diagnostic to notify the user that it cannot connect to Ollama? @fbricon @datho7561 what do you think about this idea?

@angelozerr (Contributor, Author)

Gotta update my ollama; I'm on 0.1.20 and it looks like the /v1/models endpoint doesn't exist in that version.

That's too bad :-( Perhaps we should use the API (https://github.com/ollama/ollama/blob/main/docs/api.md) instead? @fbricon what do you think about that?

@fbricon (Collaborator) commented Sep 10, 2024

> Perhaps we should use the API (https://github.com/ollama/ollama/blob/main/docs/api.md) instead?

Yeah, /api/tags will cover more ollama versions.
/v1/models is a recent addition for OpenAI compatibility.
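For illustration, /api/tags returns a JSON payload of the form {"models":[{"name":"llama3:latest", ...}, ...]} per the Ollama API docs. A minimal, dependency-free sketch of extracting the model names could look like this (a real implementation would use a proper JSON parser; the class name and regex approach are illustrative only):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class OllamaTagsParser {
    // /api/tags responds with {"models":[{"name":"llama3:latest",...},...]}.
    // Pull out every "name" value with a regex so this sketch stays
    // self-contained, without a JSON library dependency.
    static List<String> modelNames(String json) {
        List<String> names = new ArrayList<>();
        Matcher m = Pattern.compile("\"name\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        while (m.find()) {
            names.add(m.group(1));
        }
        return names;
    }

    public static void main(String[] args) {
        String sample = "{\"models\":[{\"name\":\"llama3:latest\"},{\"name\":\"qwen2:7b\"}]}";
        System.out.println(modelNames(sample)); // [llama3:latest, qwen2:7b]
    }
}
```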

@fbricon (Collaborator) commented Sep 10, 2024

> Perhaps a better thing to do is to report this error as an LSP diagnostic to notify the user that it cannot connect to Ollama?

I'd rather avoid adding annoying diags. I'd say we ignore this problem for the time being. We might revisit this decision over time, depending on user feedback.

@datho7561 (Contributor)

I updated to the latest ollama. The PR works very well! The default ollama installation method on Linux sets ollama up as a background service. As a result, the user is unlikely to run into the error I ran into.

(oops, meant to send this a while ago)

@angelozerr (Contributor, Author) commented Sep 10, 2024

> Yeah, /api/tags will cover more ollama versions. /v1/models is a recent addition for OpenAI compatibility.

It means that I need to switch to /api/tags, right?

@angelozerr (Contributor, Author)

> I'd rather avoid adding annoying diags. I'd say we ignore this problem for the time being.

Ok, let's give up on diags, but should we hide the error stack trace that @datho7561 reported?

@angelozerr (Contributor, Author)

I updated to the latest ollama. The PR works very well!

Thanks!

The default ollama installation method on Linux sets ollama up as a background service. As a result, the user is unlikely to run into the error I ran into.

You mean using v1 instead of api/tags?


@datho7561 (Contributor)

You mean using v1 instead of api/tags?

No, I just needed to reinstall ollama.

In order to install ollama, I ran the script curl -fsSL https://ollama.com/install.sh | sh. This sets ollama up so that the background job it needs is automatically started when your computer starts.

The first time I installed ollama, I didn't follow their instructions; I just downloaded the binary.

@angelozerr (Contributor, Author)

@datho7561 I updated my PR:

  • now, when there is an error, it is logged as a message of this kind:
    SEVERE: Could not connect to ollama 'http://localhost:8080/api/tags': null

    Here, null is the result of e.getMessage().

  • I'm now using /api/tags, so it should work with any ollama version.
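A minimal sketch of building such a one-line message, with a null-safe fallback for the case where e.getMessage() is null (class and method names are hypothetical, not the actual provider code; ConnectException often carries no message, which is why the log above ends with ": null"):

```java
import java.net.ConnectException;

public class OllamaErrorLog {
    // Build a one-line log message for a failed connection, substituting the
    // exception class name when getMessage() returns null so the log never
    // ends with a bare ": null".
    static String connectionError(String endpoint, Throwable e) {
        String detail = (e.getMessage() != null)
                ? e.getMessage()
                : e.getClass().getSimpleName();
        return "Could not connect to ollama '" + endpoint + "': " + detail;
    }

    public static void main(String[] args) {
        System.out.println(connectionError("http://localhost:8080/api/tags", new ConnectException()));
        // → Could not connect to ollama 'http://localhost:8080/api/tags': ConnectException
    }
}
```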

@fbricon (Collaborator) commented Sep 11, 2024

please add missing copyright headers

@angelozerr (Contributor, Author)

@datho7561 I did several refactorings, please try it again.

@angelozerr (Contributor, Author)

please add missing copyright headers

fixed

quarkus.langchain4j.ollama.chat-model.model-id property value

Signed-off-by: azerr <azerr@redhat.com>
@datho7561 (Contributor) left a review comment:

Works well for me, and the code looks good. Thanks, Angelo!

@datho7561 (Contributor)

test failures seem to be because lsp4mp hasn't built a snapshot yet

@angelozerr angelozerr merged commit bd268fe into redhat-developer:master Sep 13, 2024
1 check passed
@angelozerr (Contributor, Author)

Thanks so much @fbricon @datho7561 for your great review!

@angelozerr angelozerr changed the title feat: Available Ollama models support for quarkus.langchain4j.ollama.chat-model.model-id property value Completion on available Ollama models support for quarkus.langchain4j.ollama.chat-model.model-id property value Oct 22, 2024
Labels
enhancement New feature or request
3 participants