[ERROR] Trilium requires javascript to be enabled when trying to use this plugin #21
@Greatz08 , the "Trilium requires JavaScript to be enabled" is not actually an error, it's just part of the Trilium HTML document structure. What you are looking for is the "Console" tab (see the tabs on the right side of the screen with "Elements", "Console", "Sources", etc.). There you should find the errors.
@eliandoran, thanks for responding to my issue. I know it's not a typical error, but it is the kind of issue I'm getting, so I thought I'd report it as an error. I don't know why I'm getting it, since I did a fresh install using the Docker Compose method. This is my docker compose file:
Since you tested with a ChatGPT API key and it worked for you, maybe it is an Ollama-related issue? Could you please run a small Ollama model like phi3 (or any other quantized 1B or 3B model, which will run even without a powerful GPU) and then try to connect and tell me whether it works for you? I tried with Ollama using the localhost:11434 path and also with 127.0.0.1. If you can try it, we can tell whether this issue affects only me (which technically shouldn't happen, because I'm using the same Docker image as you), in which case it must either be a configuration issue, or this plugin might not work with a recent Ollama version, or something else that we need to understand and fix.
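
For anyone trying to reproduce this, a minimal sketch of how to check that a local Ollama instance is reachable and that a small model responds, assuming Ollama's default port (11434) and that a model such as phi3 has already been pulled with `ollama pull phi3` (the model name here is just an example):

```python
# Quick check that a local Ollama instance is up and a small model responds.
# Assumes the default Ollama port (11434) and that "phi3" has been pulled
# beforehand with `ollama pull phi3` -- the model name is just an example.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "phi3",
        "messages": [{"role": "user", "content": "Reply with a single word."}],
        "stream": False,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

If this prints a reply, the Ollama side is working and any remaining problem is in the plugin configuration or the connection between Trilium and Ollama.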
@Greatz08 , in order to understand the issue it would be important to see the list of errors. A screenshot of the "Console" tab, as mentioned previously, would be helpful.
@eliandoran My bad, I was confused; during my experiments I kept getting those errors because I hadn't restarted. By the way, if anyone else gets this error, you can tell them to use this as the completion endpoint: "completion": "http://localhost:11434/api/chat". By default it is set to https://ollama.internal.network/api/chat, which obviously won't work. What I generally do to connect OpenWebUI and other chat apps is to give them http://localhost:11434/ directly as the endpoint, and they know how to use the API, but here we have to specify the exact path to make it work, and that was the cause of the issue. I had actually tested with this too, but since I had tried so many wrong combinations, I forgot to restart and check back at the time, and I was quite frustrated, so I thought I'd ask about it. Anyway, thank you very much for making me realize my mistake.
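
As a reference, here is a small sketch showing why the exact /api/chat path matters, assuming a local Ollama on the default port (this is only an illustration, not part of the plugin itself):

```python
# Illustrates why the plugin's "completion" setting needs the full /api/chat
# path: the bare base URL only serves a plain health message, while chat
# requests must go to /api/chat. Assumes a local Ollama on the default port.
import requests

BASE = "http://localhost:11434"

# Base URL: returns only a plain-text "Ollama is running" health response.
print(requests.get(BASE, timeout=5).text)

# Full chat endpoint: this is the URL to put into the plugin's "completion"
# setting (instead of the unreachable default
# https://ollama.internal.network/api/chat).
payload = {
    "model": "phi3",  # example model name
    "messages": [{"role": "user", "content": "ping"}],
    "stream": False,
}
r = requests.post(f"{BASE}/api/chat", json=payload, timeout=60)
print(r.status_code, r.json()["message"]["content"])
```

Remember to restart Trilium after changing the endpoint so the new setting is picked up.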
@eliandoran Sorry to disturb you again :-) New issue here; I'm attaching the console error. I get it whenever I ask anything, and I've tried repeating it multiple times with an application restart, server restart, and browser restart. The model is loaded perfectly, so no issues from that side, and it works normally if I chat in the CLI, but I get that console error every time I send a message, whether it's related to notes or anything else. I don't know what could be the cause, but please look into this issue and help me if possible :-)
@soulsands
Please advise on what could be the cause and how we can fix it. I am currently using the latest TriliumNext version, and this was my first time using a plugin in Trilium.