[BUG] Errors out with "Protocol "" is unknown" when using the AI feature from an ollama instance #3052
update: I tried with llama3; the same error occurs T_T
Hm, what you did looks OK. Can you please post the output from the Debug section of the settings dialog? You just need to paste it here.
Hi, so here's the debug info. I just realised that some of the words in it are in Korean (because my laptop is set to Korean); it's mostly just dates or values like "default", and I have no idea how to make it not do that, but it shouldn't really be an issue. I've enabled debug logging, and this is the useful output (I cleared the log right before). At the very start, this is the log of the scripts being loaded:
After this, it just lists my file names and does more of the same. And this is where the keyboard trigger is registered:
This doesn't seem all that useful, because it doesn't say what exactly it means by the protocol being unknown... Here's my debug info (I have no idea how much of it is actually relevant, so here's the whole thing):
I've never tested what happens on macOS; Windows and Linux seem to work fine. 🤔
At least the request was made... Could you maybe try signing up for Groq (which is free, as in beer) and sending a request there? There is a link in the AI settings.
Hmm, this is interesting. Groq seems to work, so I think this is a problem with my ollama instance. I'll do some debugging and update you. Also, when I try it with ollama turned off, the debug logs are exactly the same: the request is made and nothing happens after that.
Maybe your ollama instance is old? OpenAI API support was only added recently.
I don't think that's the issue; I'm using version 0.1.48, which is the latest stable release from last week. May I know which version you're using to test this? Also, I did some further testing, and here's what I found out:
What do you think about this?
I was also (and am still) using 0.1.48. Did QOwnNotes detect what models you have installed? I implemented that ollama API call into the script (and that also works for me).
You are not supposed to add the path, it's just the baseUrl! For me that's
The path is predefined by OpenAI...
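To illustrate the point about the base URL, here is a minimal Python sketch (not QOwnNotes code; the host and port are Ollama's defaults, assumed here): the script appends the OpenAI-style path itself, so the configured base URL should contain only the scheme, host and port.

```python
# Minimal sketch (not QOwnNotes code): the client appends the
# OpenAI-compatible path itself, so the configured base URL should
# contain only scheme, host and port. 127.0.0.1:11434 is Ollama's
# default listen address (an assumption here).
api_base_url = "http://127.0.0.1:11434"

# Path predefined by the OpenAI-compatible API:
endpoint = api_base_url.rstrip("/") + "/v1/chat/completions"
print(endpoint)  # http://127.0.0.1:11434/v1/chat/completions
```

Putting the path into the base URL would make the client build something like `.../v1/chat/completions/v1/chat/completions`.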
It's really strange. Do you have any other network errors? For example, if you insert a link with
Yes, you could try that, please.
I came across this issue as well; I added
It looks like the
That's strange. Have you tried setting it manually?
Yes, when I replace
This was tried on Windows; previously it was on Linux (AppImage). The log shows the following:
The two blank lines after the functions are
I have tried changing the values to plain-text strings to see if that would change anything in the logs, without success.
Is the apiBaseUrl actually stored in your settings? I never saw a settings dump... And what operating system are you using? Can you please post the output from the Debug section of the settings dialog in QOwnNotes? You can copy it from there.
Yes, the settings are stored. I'll post a dump if this last bit of information doesn't help. For some reason the Ollama script uses
The relevant code: QOwnNotes/src/services/scriptingservice.cpp, lines 139 to 148 (commit de1cfcb)
QOwnNotes Debug Information
General Info
Current Date:
Server Info
serverUrl: empty
Spellchecking
Enabled:
Note folders
currentNoteFolderId:
Note folder
That would be my bad, then! I just looked at other scripts, and they also use no
I published a version 0.1.3!
But I have no idea why it even worked out for me and others. 🤔 |
Looks good, thank you! Yes, it's very odd that it was working for some and not others. |
Signed-off-by: Patrizio Bekerle <patrizio@bekerle.com>
Thank you for pointing the
Whenever I try to use the AI feature, it just responds with
Protocol "" is unknown
I've tried this with both Llama3 and dolphin-phi (which works better on my craptop), and it errors out either way.
Here's a video of what I mean:
2024-07-06.12.42.30.mov
I've installed Ollama AI backend integration, AI text tool and AI autocompletion tool. These are the settings that I've put in from the scripting panel:

I have an instance of ollama running with ollama serve, and it can respond to queries that I send it via curl, which means that this shouldn't be an issue on ollama's side. For example, I can ask it why the sky is blue and it will respond:
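The curl check described above can be mirrored in Python; this is a sketch, assuming Ollama's default port 11434 and its /api/generate endpoint (actually sending it needs a running instance, so only the request is built and inspected here):

```python
import json
from urllib import request

def build_generate_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return request.Request(
        base_url.rstrip("/") + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("http://127.0.0.1:11434", "llama3", "Why is the sky blue?")
print(req.full_url)  # http://127.0.0.1:11434/api/generate

# To actually send it (requires a running "ollama serve"):
#   with request.urlopen(req) as resp:
#       print(json.load(resp)["response"])
```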