
Feature Request: Ship llama.cpp binaries in AppImage format #11579

Description

@rgerganov

Prerequisites

  • I am running the latest code. Mention the version if possible as well.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new and useful enhancement to share.

Feature Description

Shipping prebuilt binaries for Linux that "just work" is a non-trivial task, and the AppImage format seems to be the best shot at this.

I don't know how feasible it is to package a llama.cpp binary with all of the available backends, but it should be doable for a single backend. For example, llama-server-vulkan-x86_64.AppImage would package all of the dependencies needed for running llama-server with the Vulkan backend. The user downloads this file and it "just works".

I think this can be combined with other solutions for Windows and macOS into a single landing page where the user selects an OS, an application (e.g. llama-cli, llama-server, etc.) and a backend, and gets a downloadable file that is ready to run.

Thoughts?

Motivation

There should be an easy way to download and run llama.cpp binaries on various Linux distributions.

Possible Implementation

Ship llama.cpp binaries along with their dependencies in AppImage format.
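
Below is a rough sketch of what such a packaging step could look like, assuming cmake, ldd and the appimagetool utility are available on the build machine. The -DGGML_VULKAN=ON option name, the build/output paths and the library-bundling heuristic are assumptions for illustration, not the project's actual packaging setup.

```python
#!/usr/bin/env python3
"""Sketch: build llama-server with the Vulkan backend and pack it as
llama-server-vulkan-x86_64.AppImage. Assumes cmake, ldd and appimagetool
are on PATH; flag names, paths and the bundling heuristic are assumptions."""
import os
import shutil
import subprocess
from pathlib import Path

REPO = Path("llama.cpp")              # checkout of the repository
BUILD = REPO / "build-vulkan"         # out-of-tree build directory
APPDIR = Path("llama-server.AppDir")  # staging directory for appimagetool


def run(*cmd, **kwargs):
    print("+", " ".join(map(str, cmd)))
    subprocess.run([str(c) for c in cmd], check=True, **kwargs)


def build():
    # Configure and build only the server target with the Vulkan backend
    # (the -DGGML_VULKAN=ON option name is an assumption).
    run("cmake", "-S", REPO, "-B", BUILD,
        "-DGGML_VULKAN=ON", "-DCMAKE_BUILD_TYPE=Release")
    run("cmake", "--build", BUILD, "--target", "llama-server", "-j")


def make_appdir():
    bin_dir = APPDIR / "usr" / "bin"
    lib_dir = APPDIR / "usr" / "lib"
    bin_dir.mkdir(parents=True, exist_ok=True)
    lib_dir.mkdir(parents=True, exist_ok=True)

    server = BUILD / "bin" / "llama-server"
    shutil.copy2(server, bin_dir / "llama-server")

    # Bundle shared libraries reported by ldd that are unlikely to be
    # present on every distribution (very crude heuristic).
    ldd = subprocess.run(["ldd", str(server)], check=True,
                         capture_output=True, text=True).stdout
    for line in ldd.splitlines():
        if "=>" in line and "/" in line:
            path = Path(line.split("=>")[1].split()[0])
            if "vulkan" in path.name or not str(path).startswith(("/lib", "/usr/lib")):
                shutil.copy2(path, lib_dir / path.name)

    # Minimal AppRun and desktop entry, both required by appimagetool.
    apprun = APPDIR / "AppRun"
    apprun.write_text(
        '#!/bin/sh\n'
        'HERE="$(dirname "$(readlink -f "$0")")"\n'
        'export LD_LIBRARY_PATH="$HERE/usr/lib:$LD_LIBRARY_PATH"\n'
        'exec "$HERE/usr/bin/llama-server" "$@"\n')
    apprun.chmod(0o755)
    (APPDIR / "llama-server.desktop").write_text(
        "[Desktop Entry]\nName=llama-server\nExec=llama-server\n"
        "Icon=llama-server\nType=Application\nTerminal=true\nCategories=Utility;\n")
    (APPDIR / "llama-server.png").touch()  # placeholder icon


def pack():
    run("appimagetool", APPDIR, "llama-server-vulkan-x86_64.AppImage",
        env={**os.environ, "ARCH": "x86_64"})


if __name__ == "__main__":
    build()
    make_appdir()
    pack()
```

The same recipe could then be repeated per backend in CI to produce the per-backend downloads described above.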
