Bug: Unable to build on Alpine Linux due to dependency on execinfo.h #8762

@acon96

Description

What happened?

Attempting to build the library on Alpine Linux based systems fails because these systems use musl instead of glibc, and musl does not implement execinfo.h.

This dependency should be properly detected in the CMake build scripts, and the backtrace functionality should be disabled on Linux systems whose libc does not provide it.
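A minimal sketch of the proposed detection, using CMake's standard CheckIncludeFile module (the `GGML_HAVE_EXECINFO` macro name is a hypothetical choice for illustration, not an existing definition in the project):

```cmake
# Sketch: detect execinfo.h at configure time instead of assuming
# every Linux libc (e.g. musl on Alpine) provides it.
include(CheckIncludeFile)
check_include_file(execinfo.h HAVE_EXECINFO_H)
if (HAVE_EXECINFO_H)
    # Hypothetical macro name; the source would then guard the
    # include and the backtrace()/backtrace_symbols() calls with it.
    target_compile_definitions(ggml PRIVATE GGML_HAVE_EXECINFO)
endif()
```

In ggml.c, the `#include <execinfo.h>` and the backtrace code paths would then be wrapped in `#ifdef GGML_HAVE_EXECINFO`, with a no-op fallback when the header is absent.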

Name and Version

The dependency was re-introduced in commit 2b1f616 for all Linux builds.

What operating system are you seeing the problem on?

Linux

Relevant log output

[1/24] /usr/bin/gcc -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_LLAMAFILE -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_EXPORTS -I/tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/../include -I/tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -fopenmp -MD -MT vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/ggml.c
  FAILED: vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o 
  /usr/bin/gcc -DGGML_BUILD -DGGML_SCHED_MAX_COPIES=4 -DGGML_SHARED -DGGML_USE_LLAMAFILE -DGGML_USE_OPENMP -D_GNU_SOURCE -D_XOPEN_SOURCE=600 -Dggml_EXPORTS -I/tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/../include -I/tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/. -O3 -DNDEBUG -std=gnu11 -fPIC -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -Werror=implicit-int -Werror=implicit-function-declaration -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wdouble-promotion -fopenmp -MD -MT vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -MF vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o.d -o vendor/llama.cpp/ggml/src/CMakeFiles/ggml.dir/ggml.c.o -c /tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/ggml.c
  /tmp/llama-cpp-python/vendor/llama.cpp/ggml/src/ggml.c:145:10: fatal error: execinfo.h: No such file or directory
    145 | #include <execinfo.h>
        |          ^~~~~~~~~~~~
  compilation terminated.

Labels

bug-unconfirmed, medium severity
