Discoveries making docker work with AMD GPU
Prerequisites: You need amdgpu-dkms on the host. (https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/quick-start.html#rocm-install-quick)
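Before touching Docker you can confirm the host side is ready; a quick sanity check (the device nodes /dev/kfd and /dev/dri are what the ROCm container setup below expects):

```bash
# Confirm the amdgpu kernel module is loaded on the host
lsmod | grep amdgpu

# Confirm the device nodes that will be passed into the container exist
ls -l /dev/kfd /dev/dri
```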
Add this to docker-compose.yaml (https://rocm.docs.amd.com/projects/install-on-linux/en/latest/how-to/docker.html#docker-compose):

```yaml
devices:
  - /dev/kfd:/dev/kfd
  - /dev/dri:/dev/dri
environment:
  - PATH=/opt/rocm/bin:/opt/rocm/hip/bin:/opt/rocm/opencl/bin:/opt/rocm/hsa/bin:/opt/rocm/llvm/bin:$PATH
  - HIP_PATH=/opt/rocm/
  - HIP_CLANG_PATH=/opt/rocm/llvm/bin
  - HIP_DEVICE_LIB_PATH=/opt/rocm/amdgcn/bitcode
```
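For context, those keys belong under the service definition. Here is a minimal sketch of a complete compose file; the service name matches the container used later in this issue, but the image name/tag are assumptions, so substitute whatever image you actually run:

```yaml
services:
  pixelarch-os:
    image: lunamidori5/pixelarch:latest   # assumed image name, use your own
    tty: true                             # keep the container alive so you can exec into it
    devices:
      - /dev/kfd:/dev/kfd
      - /dev/dri:/dev/dri
    environment:
      - PATH=/opt/rocm/bin:/opt/rocm/hip/bin:/opt/rocm/opencl/bin:/opt/rocm/hsa/bin:/opt/rocm/llvm/bin:$PATH
      - HIP_PATH=/opt/rocm/
      - HIP_CLANG_PATH=/opt/rocm/llvm/bin
      - HIP_DEVICE_LIB_PATH=/opt/rocm/amdgcn/bitcode
```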
Run:

```bash
sudo docker compose up -d
```

and

```bash
sudo docker exec -it pixelarch_os-pixelarch-os-1 /bin/bash
```
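If the exec command can't find the container, the generated name may differ on your machine; you can list the running services for the project first:

```bash
# Show the containers docker compose created for this project, with their names
sudo docker compose ps
```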
Installing the AMD stuff:

```bash
yay -Syu rocm-hip-sdk rocminfo
```
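Once that finishes, you can check from inside the container that ROCm actually sees the GPU (output varies by card; on an RDNA3 part a gfx11xx agent should show up next to the CPU):

```bash
# List ROCm agents and their ISA targets; the discrete GPU should appear here
sudo rocminfo | grep -E "Name|gfx"
```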
Note:
Running things on the GPU requires sudo (ROCm/ROCm-docker#90 (comment))

Optional: compile llama.cpp:
```bash
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
HIPCXX="$(hipconfig -l)/clang" HIP_PATH="$(hipconfig -R)" \
    cmake -S . -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1100 -DCMAKE_BUILD_TYPE=Release \
    && cmake --build build --config Release -- -j 16
```
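The `-DAMDGPU_TARGETS=gfx1100` value is specific to RDNA3 cards (RX 7900 series); adjust it to the target rocminfo reports for your GPU. Once the build finishes, a quick way to confirm the GPU is actually being used (recent llama.cpp builds name the binary `llama-cli`; the model path here is just a placeholder):

```bash
# Offload all layers to the GPU and run a short prompt; sudo per the note above
sudo ./build/bin/llama-cli -m /models/your-model.gguf -ngl 99 -p "Hello"
```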
```
┌─[midori-ai][b15ade670966][~]
└─➞ ls -l /dev/kfd /dev/dri
crw-rw---- 1 root  44 235,   0 Feb 24 20:44 /dev/kfd

/dev/dri:
total 0
crw-rw---- 1 root  44 226,   1 Feb 24 20:44 card1
crw-rw---- 1 root 110 226, 128 Feb 24 20:44 renderD128
```
@Notnaton cleaned up your post for you