the word "ollama" should be removed from the README.md #6
You have to set up the API for Ollama. I used LM Studio and text-gen-webui.
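For reference, a sketch of the local endpoints these backends typically expose. The hosts and ports below are the common defaults for each tool, not values taken from this project, so adjust them to your own setup:

```shell
# Assumed default local endpoints (verify against your installation):
#   Ollama:          http://localhost:11434/api/generate
#   LM Studio:       http://localhost:1234/v1/chat/completions
#   text-gen-webui:  http://localhost:5000/v1/chat/completions

# Quick smoke test against a locally running Ollama (requires a live server):
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "ping", "stream": false}'
```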
In `model`, change the model to the exact name in ollama, and remove any `data` params that are not relevant to your model.
I have set up the Ollama API and specified the llama3 model (already in my ollama) so that I can get a response from Ollama. I also needed to comment out several items in the 'data' object, like this:
but the response is meaningless.
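The commenter's trimmed snippet was not preserved in the thread. As a hedged sketch of what such a pared-down request body might look like: the field names below follow Ollama's /api/generate format, while the commented-out extras are hypothetical examples of backend-specific parameters that this project's `llm_request` might pass and that would need removing, not the project's actual fields:

```python
import json


def build_ollama_payload(prompt: str) -> dict:
    """Build a minimal request body for Ollama's /api/generate endpoint.

    The commented-out entries stand in for the kind of backend-specific
    'data' items the thread describes commenting out; the exact fields in
    this project's llm_request function are assumptions here.
    """
    return {
        "model": "llama3",     # must match a model already pulled in ollama
        "prompt": prompt,
        "stream": False,       # one JSON object instead of a token stream
        # "temperature": 0.7,  # hypothetical extras another backend expects
        # "top_p": 0.9,
        # "max_tokens": 512,
    }


payload = build_ollama_payload("Hello")
print(json.dumps(payload))
```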
What do you mean by meaningless?
I use the default llama3 model pulled with the ollama command. From the file size, I "guess" it is q4. I got a message like this. I used Ctrl+C to stop it because it repeatedly produced similar messages.
Did it send you the message on Telegram, though?
Even with @ketsapiwiq's modification #2, I still need to comment out several items in the 'data' object within the llm_request function in order to receive a response from Ollama, and even then the response is clearly useless. My suggestion would be to remove the word 'Ollama' from the README.md, as the project has not been able to use it as the LLM backend.