
Conversation

@SomeoneSerge
Collaborator

@SomeoneSerge SomeoneSerge commented Jan 21, 2024

Exposes a few attributes demonstrating how to build Singularity/Apptainer and Docker images by reusing llama.cpp's Nix expression.

(cherry-picks from #4917, needs rebasing later)
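For context, here is a minimal sketch of how such an image attribute can be defined on top of an existing package output. This is an illustration using nixpkgs' `dockerTools`, not the PR's actual expression; the attribute names and the entrypoint binary are assumptions.

```nix
# Hypothetical sketch: wrap an already-built llama-cpp derivation in an
# OCI/Docker image so the package build is reused rather than duplicated.
{ pkgs, llama-cpp }:

pkgs.dockerTools.buildLayeredImage {
  name = "llama-cpp";
  tag = "latest";
  # Reuse the existing package closure as the image contents.
  contents = [ llama-cpp ];
  # Entrypoint path is illustrative; the actual binary name depends on
  # what the llama-cpp derivation installs.
  config.Entrypoint = [ "${llama-cpp}/bin/llama-server" ];
}
```

The result of `nix build` on such an attribute is a tarball that can be loaded with `docker load < result`; a Singularity/Apptainer image would analogously be produced via `singularity-tools.buildImage`.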

@SomeoneSerge SomeoneSerge added the nix Issues specific to consuming flake.nix, or generally concerned with ❄ Nix-based llama.cpp deployment label Jan 21, 2024
@ggerganov
Member

@SomeoneSerge Is this PR still relevant, or can we close it?

@SomeoneSerge
Collaborator Author

SomeoneSerge commented Feb 19, 2024

@ggerganov sorry, I got distracted and forgot to finish this. I'd still like to merge it; I just need to allocate an evening for polishing.

EDIT: I skimmed through the changes and can't recall what it was I wanted to add before merging.
EDIT2: I'll rebase and force-push to trigger the CI.

@SomeoneSerge SomeoneSerge marked this pull request as ready for review February 22, 2024 16:36
Collaborator

@philiptaron philiptaron left a comment


Built locally on `x86_64-linux` with `nix build github:someoneserge/llama.cpp/feat/nix/images#llamaPackages.{docker,docker-min,sif,llama-cpp}` and it's fast and effective.

@philiptaron
Collaborator

I intend to merge when CI completes. @ggerganov, perhaps you could enable auto-merge after CI in the repo settings? Then again, the repo rules may not be set up to make that effective: it says I can merge right now, so CI might not be a blocker.

@ggerganov
Member

Yes, merging can be done at any time once at least one approval is provided. But for now I'll keep auto-merge disabled.

@philiptaron philiptaron merged commit 201294a into ggml-org:master Feb 22, 2024
jordankanter pushed a commit to jordankanter/llama.cpp that referenced this pull request Mar 13, 2024
Exposes a few attributes demonstrating how to build [singularity](https://docs.sylabs.io/guides/latest/user-guide/)/[apptainer](https://apptainer.org/) and Docker images re-using llama.cpp's Nix expression.

Built locally on `x86_64-linux` with `nix build github:someoneserge/llama.cpp/feat/nix/images#llamaPackages.{docker,docker-min,sif,llama-cpp}` and it's fast and effective.
