LocalAI version:
v2.19.2 and v2.19.3 (v2.19.1 is fine)

Environment, CPU architecture, OS, and Version:
Ubuntu 22.04, amd64

Describe the bug
Error below:
DBG GRPC(Meta-Llama-3-8B-Instruct.Q4_K_M.gguf-127.0.0.1:44439): stderr /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2: error while loading shared libraries: /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2: object file has no dynamic section

To Reproduce
./local-ai --debug --config-file test.yaml
curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{ "model": "llama-3-8b-instruct", "messages": [{"role": "user", "content": "hello"}] }'

test.yaml:
- name: llama-3-8b-instruct
  backend: llama
  parameters:
    model: Meta-Llama-3-8B-Instruct.Q4_K_M.gguf
  context_size: 8192
  # cut for brevity

Expected behavior
The request should complete successfully, as it does in v2.19.1.

Logs
See above.

Additional context
Hmm, that sounds weird. Is that with the binary only? Might be glibc, or more likely caused by #2947.
Probably related: https://bugzilla.redhat.com/show_bug.cgi?id=1381315
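The Red Hat bug above describes UPX-packed binaries failing in the glibc loader with exactly this "object file has no dynamic section" message. For anyone who wants to confirm that the extracted backend binary is the culprit, a quick check (assuming the strings and upx tools are installed; the path is taken from the log above):

strings -a /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2 | grep -m1 UPX
upx -d /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-avx2

The first command prints a match if the binary carries the UPX signature; the second decompresses it in place, which may work as a stopgap (the asset can be re-extracted on the next start, so it is not a permanent fix).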
fix(llama-cpp): do not compress with UPX
27acb81
Fixes: #3041
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
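For context, a hypothetical sketch of the kind of packaging step this commit removes (not the actual LocalAI build script): UPX rewrites an ELF binary into a self-extracting stub, which on some glibc versions fails to load with the "object file has no dynamic section" error seen above.

# before the fix, backend binaries were shrunk after the build with something like:
upx -9 backend-assets/grpc/llama-cpp-avx2
# after the fix they ship uncompressed, so the loader sees a normal dynamic ELF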
fix(llama-cpp): do not compress with UPX (#3084)
274487c