Model runs but doesn't produce any output #204
My computer doesn't get stuck. Its running time is as follows (see attached screenshot):
Thank you for using llama.cpp and thank you for bringing this to our attention. If the program isn't doing anything for you, then there's not a whole lot for us to go on right now, since I'm not able to reproduce this issue. In order for us to fix the issue you've encountered, you need to debug things just enough so that you're able to tell us what we need to do. For example, you could run the program inside GDB to provide us with a backtrace of where it's getting stuck. That could hopefully provide some clue.
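In case it helps, here is a rough sketch of how such a GDB session might look. The binary name, model path, and prompt below are only placeholders for whatever command you normally run:

```
# Start the program under GDB (adjust binary, model path, and flags to your setup)
gdb --args ./main -m ./models/7B/ggml-model-q4_0.bin -p "Hello"
(gdb) run
# Once it appears stuck, interrupt with Ctrl+C, then capture a backtrace:
(gdb) thread apply all bt
```

Posting the output of that backtrace here would give us somewhere to start looking.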
It is hard to see from your screenshots. Properly formatted markdown of your complete run would make it easier to see what you're doing. It does, however, look like you are running in interactive mode?
I was just having this issue; what resolved it was pulling the latest to get a ggml.c change and re-running make.
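For anyone hitting the same thing, a rough sketch of those steps, assuming a plain `make` build from the repository root (branch name and example command are assumptions, adjust to your setup):

```
# Update to the latest source and do a clean rebuild
git pull origin master
make clean
make
# Then re-run your usual command, e.g.:
# ./main -m ./models/7B/ggml-model-q4_0.bin -p "Hello"
```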
Thanks, it worked for me, even with old models. I don't know what went wrong the first time, but now everything is working great.
I checked everything several times and quantized the models, but neither model outputs anything, no matter which mode I run them in. The processor is under load, but there is no output, no matter how long I wait.
Typing into the console also does nothing.
This is on Ubuntu 22.04 with 8 GB of RAM plus 15 GB of swap (everything fits).