cannot execute file: binary Format error for exec() #1180

Closed
anonymouso5310 opened this issue Apr 25, 2023 · 3 comments
Comments

@anonymouso5310
anonymouso5310 commented Apr 25, 2023

Hi, I have an error when I try to install LLaMA.
When I quantize the model to 4 bits,
I first got this error: Permission not granted.
So I allowed the file to be executed, and now I get this error:
cannot execute file: binary Format error for exec()

Please help.

@SlyEcho
Collaborator

SlyEcho commented Apr 25, 2023

How did you compile llama.cpp? What compiler? Did you get any errors?

@anonymouso5310
Author

I used CMake and there were no errors.
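An "Exec format error" from exec() usually means the file handed to the kernel is not a runnable binary for the host, for example an executable built for a different architecture, or a non-binary file marked executable. A quick way to check (not suggested in the thread itself; the path assumes a CMake build dir named build) is to compare the binary's reported format against the host architecture:

```shell
# Compare the compiled binary's format with the host architecture.
# The path to quantize is an assumption (CMake build dir named "build").
file ./build/bin/quantize   # e.g. "ELF 64-bit LSB executable, x86-64 ..."
uname -m                    # host architecture, e.g. "x86_64"
```

If the two do not match (say, an aarch64 binary on an x86_64 host, or vice versa), the build was cross-compiled or copied from another machine and must be rebuilt for the host.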

@SlyEcho
Collaborator

SlyEcho commented Apr 25, 2023

What was the command that you used to quantize?

It should be ./build/bin/quantize ./models/ggml-model-f16.bin ./models/ggml-model-q4_0.bin 2 or similar, assuming you are in the llama.cpp root and that your CMake build dir is build.
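Assuming a fresh checkout of llama.cpp and a CMake build dir named build (the model paths are illustrative; adjust them to where your converted f16 model actually lives), the full build-and-quantize sequence sketched above would be:

```shell
# From the llama.cpp root; "build" is the CMake build directory.
cmake -B build
cmake --build build --config Release

# Quantize the f16 model down to 4-bit. The trailing "2" selects the
# q4_0 quantization type in the quantize tool of that era; model file
# names are assumptions based on the convert script's default output.
./build/bin/quantize ./models/ggml-model-f16.bin ./models/ggml-model-q4_0.bin 2
```

Running the binary through the shell like this (rather than via a file manager or a foreign-architecture copy) also rules out the permission and format problems described earlier in the thread.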
