23 commits
9144c32
Add build arg for ROCm version
sredman Sep 23, 2025
9849512
Break ROCM_VERSION into ROCM_{MAJOR,MINOR}_VERSION
sredman Sep 24, 2025
22352b4
Use correct ROCm package names
sredman Sep 24, 2025
edd8c55
Add rocm package for runtime libs
sredman Sep 25, 2025
e709a73
Remove hipblas-dev and rocblas-dev. I think they are not needed, and …
sredman Sep 25, 2025
ab3bc75
Migrate ROCM build flags to llama-cpp docker file
sredman Dec 16, 2025
216ae36
Change base-level Dockerfile back to same runtime dependency packages…
sredman Dec 18, 2025
a4ef661
Use default ROCm version of 5.5.1 to match Ubuntu package
sredman Dec 18, 2025
54c5bb8
Remove ROCm runtime package from llamacpp dockerfile
sredman Dec 18, 2025
9a11a07
Annotate ROCm runtime dependencies
sredman Dec 18, 2025
7d95eab
Remove comments in runtime dependencies install step
sredman Dec 19, 2025
517c7e6
Change default ROCM version to 6.4.3
sredman Dec 20, 2025
54ed3e6
Rename ROCm6 build in image CI output to -gpu-amd-rocm-6
sredman Dec 20, 2025
20117c4
Add ROCm 7 image builds
sredman Dec 20, 2025
6e7fd30
Add rocm-*-version arguments to backend.yml header
sredman Dec 21, 2025
94a1620
Change 'amd' capability to 'amd-rocm-'
sredman Dec 21, 2025
88a3ced
Translate all backend index.yaml entries to use 'amd-rocm-6' instead …
sredman Dec 21, 2025
9297568
Add backend/index.yaml entries for llama-cpp on rocm7
sredman Dec 21, 2025
1a1af07
Update docker tags for previously mis-named backends
sredman Dec 21, 2025
7ea4b73
Bulk update documentation with new image tag names
sredman Dec 21, 2025
2043272
Add rocm version inputs to backend_build.yml
sredman Jan 11, 2026
4aad63f
Align the default version in backend_build with the rest of the universe
sredman Jan 11, 2026
63449eb
Fix merge conflict'd base-image
sredman Jan 11, 2026
49 changes: 32 additions & 17 deletions .github/workflows/backend.yml
@@ -21,6 +21,8 @@ jobs:
build-type: ${{ matrix.build-type }}
cuda-major-version: ${{ matrix.cuda-major-version }}
cuda-minor-version: ${{ matrix.cuda-minor-version }}
rocm-major-version: ${{ matrix.rocm-major-version }}
rocm-minor-version: ${{ matrix.rocm-minor-version }}
platforms: ${{ matrix.platforms }}
runs-on: ${{ matrix.runs-on }}
base-image: ${{ matrix.base-image }}
@@ -554,7 +556,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-rerankers'
tag-suffix: '-gpu-amd-rocm-6-rerankers'
runs-on: 'ubuntu-latest'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -563,24 +565,37 @@
context: "./"
ubuntu-version: '2404'
- build-type: 'hipblas'
cuda-major-version: ""
Contributor Author:

I am not sure I understand this comment correctly. Do you mean simply add rocm-*-version around L15 of backend_build.yml? I have done that. If you mean something else, please guide me 😄

Owner:

If you made that change, it seems it has not been committed: this changeset contains no changes to that file (backend_build.yml), which is needed, otherwise the CI won't pick up the options defined here.

Contributor Author:

Now this has truly been done 😄

cuda-minor-version: ""
rocm-major-version: "6"
rocm-minor-version: "4.3"
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-llama-cpp'
tag-suffix: '-gpu-amd-rocm-6-llama-cpp'
runs-on: 'ubuntu-latest'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
base-image: "ubuntu:24.04"
skip-drivers: 'false'
backend: "llama-cpp"
dockerfile: "./backend/Dockerfile.llama-cpp"
context: "./"
ubuntu-version: '2404'
- build-type: 'hipblas'
rocm-major-version: "7"
rocm-minor-version: "1.1"
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-amd-rocm-7-llama-cpp'
runs-on: 'ubuntu-latest'
base-image: "ubuntu:22.04"
skip-drivers: 'false'
backend: "llama-cpp"
dockerfile: "./backend/Dockerfile.llama-cpp"
context: "./"
ubuntu-version: '2204'
- build-type: 'hipblas'
cuda-major-version: ""
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-vllm'
tag-suffix: '-gpu-amd-rocm-6-vllm'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -593,7 +608,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-transformers'
tag-suffix: '-gpu-amd-rocm-6-transformers'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -606,7 +621,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-diffusers'
tag-suffix: '-gpu-amd-rocm-6-diffusers'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -620,7 +635,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-kokoro'
tag-suffix: '-gpu-amd-rocm-6-kokoro'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -633,7 +648,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-vibevoice'
tag-suffix: '-gpu-amd-rocm-6-vibevoice'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -646,7 +661,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-faster-whisper'
tag-suffix: '-gpu-amd-rocm-6-faster-whisper'
runs-on: 'ubuntu-latest'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -659,7 +674,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-coqui'
tag-suffix: '-gpu-amd-rocm-6-coqui'
runs-on: 'ubuntu-latest'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -672,7 +687,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-bark'
tag-suffix: '-gpu-amd-rocm-6-bark'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
@@ -1055,7 +1070,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-whisper'
tag-suffix: '-gpu-amd-rocm-6-whisper'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
runs-on: 'ubuntu-latest'
skip-drivers: 'false'
@@ -1178,7 +1193,7 @@ jobs:
platforms: 'linux/amd64'
skip-drivers: 'true'
tag-latest: 'auto'
tag-suffix: '-gpu-hipblas-exllama2'
tag-suffix: '-gpu-amd-rocm-6-exllama2'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
runs-on: 'ubuntu-latest'
backend: "exllama2"
@@ -1204,7 +1219,7 @@ jobs:
# cuda-minor-version: ""
# platforms: 'linux/amd64'
# tag-latest: 'auto'
# tag-suffix: '-gpu-hipblas-rfdetr'
# tag-suffix: '-gpu-amd-rocm-6-rfdetr'
# base-image: "rocm/dev-ubuntu-24.04:6.4.4"
# runs-on: 'ubuntu-latest'
# skip-drivers: 'false'
@@ -1244,7 +1259,7 @@ jobs:
cuda-minor-version: ""
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-rocm-hipblas-neutts'
tag-suffix: '-gpu-amd-rocm-6-neutts'
runs-on: 'arc-runner-set'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
skip-drivers: 'false'
8 changes: 8 additions & 0 deletions .github/workflows/backend_build.yml
@@ -20,6 +20,14 @@ on:
description: 'CUDA minor version'
default: "1"
type: string
rocm-major-version:
description: 'ROCm major version'
default: "6"
type: string
rocm-minor-version:
description: 'ROCm minor version'
default: "4.3"
type: string
platforms:
description: 'Platforms'
default: ''
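The new inputs mirror the existing CUDA pair: callers pass the ROCm version split into major and minor parts, and the Dockerfiles in this changeset recombine them into a full version string. A minimal sketch of that recombination, using the default values declared above:

```shell
# Recombine the split workflow inputs the same way the Dockerfiles do.
ROCM_MAJOR_VERSION="6"   # default from backend_build.yml
ROCM_MINOR_VERSION="4.3" # default from backend_build.yml
ROCM_VERSION="${ROCM_MAJOR_VERSION}.${ROCM_MINOR_VERSION}"
echo "${ROCM_VERSION}"
```

The split lets the image tag carry only the major version (`-gpu-amd-rocm-6`) while the apt repository URL needs the full one (`6.4.3`).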
28 changes: 20 additions & 8 deletions .github/workflows/image.yml
@@ -1,17 +1,17 @@
---
name: 'build container images'

on:
push:
branches:
- master
tags:
- '*'

concurrency:
group: ci-${{ github.head_ref || github.ref }}-${{ github.repository }}
cancel-in-progress: true

jobs:
hipblas-jobs:
uses: ./.github/workflows/image_build.yml
@@ -38,17 +38,30 @@
matrix:
include:
- build-type: 'hipblas'
rocm-major-version: "6"
rocm-minor-version: "4.3"
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-hipblas'
base-image: "rocm/dev-ubuntu-24.04:6.4.4"
tag-suffix: '-gpu-amd-rocm-6'
base-image: "ubuntu:24.04"
grpc-base-image: "ubuntu:24.04"
runs-on: 'ubuntu-latest'
makeflags: "--jobs=3 --output-sync=target"
aio: "-aio-gpu-hipblas"
ubuntu-version: '2404'
ubuntu-codename: 'noble'

- build-type: 'hipblas'
rocm-major-version: "7"
rocm-minor-version: "1.1"
platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-amd-rocm-7'
base-image: "ubuntu:24.04"
runs-on: 'ubuntu-latest'
makeflags: "--jobs=3 --output-sync=target"
aio: "-aio-gpu-amd-rocm-7"
ubuntu-version: '2204'

core-image-build:
uses: ./.github/workflows/image_build.yml
with:
@@ -134,7 +147,7 @@
aio: "-aio-gpu-intel"
ubuntu-version: '2404'
ubuntu-codename: 'noble'

gh-runner:
uses: ./.github/workflows/image_build.yml
with:
@@ -184,4 +197,3 @@
skip-drivers: 'false'
ubuntu-version: '2404'
ubuntu-codename: 'noble'

24 changes: 21 additions & 3 deletions Dockerfile
@@ -20,6 +20,8 @@ FROM requirements AS requirements-drivers
ARG BUILD_TYPE
ARG CUDA_MAJOR_VERSION=12
ARG CUDA_MINOR_VERSION=0
ARG ROCM_MAJOR_VERSION=6
ARG ROCM_MINOR_VERSION=4.3 # ROCm version to append to the major version, in the format of their apt repo (https://repo.radeon.com/rocm/apt/). Like `0_alpha` or `3.4`.
ARG SKIP_DRIVERS=false
ARG TARGETARCH
ARG TARGETVARIANT
@@ -146,13 +148,29 @@ RUN if [ "${BUILD_TYPE}" = "clblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then \
; fi

RUN if [ "${BUILD_TYPE}" = "hipblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then \
# Setup for specific ROCm version as described here: https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/install-methods/package-manager/package-manager-ubuntu.html
ROCM_VERSION="${ROCM_MAJOR_VERSION}.${ROCM_MINOR_VERSION}" && \
apt-get update && \
apt-get install -y --no-install-recommends \
hipblas-dev \
rocblas-dev && \
gpg wget && \
mkdir --parents --mode=0755 /etc/apt/keyrings && \
wget -qO - https://repo.radeon.com/rocm/rocm.gpg.key | gpg --yes --dearmor --output /etc/apt/keyrings/rocm.gpg && \
echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/rocm/apt/${ROCM_VERSION} jammy main" >> /etc/apt/sources.list.d/rocm.list && \
if [ "${ROCM_MAJOR_VERSION}" -ge 7 ]; then \
echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/graphics/${ROCM_VERSION}/ubuntu jammy main" >> /etc/apt/sources.list.d/rocm.list \
; fi && \
echo "Package: *" >> /etc/apt/preferences.d/rocm-pin-600 && \
echo "Pin: release o=repo.radeon.com" >> /etc/apt/preferences.d/rocm-pin-600 && \
echo "Pin-Priority: 600" >> /etc/apt/preferences.d/rocm-pin-600 && \
# End setup steps for specific ROCm version - the packages below will be installed from the configured repositories
apt-get update && \
apt-get install -y --no-install-recommends \
rocm-hip-runtime \
rocblas-dev \
hipblas-dev && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* && \
echo "amd" > /run/localai/capability && \
echo "amd-rocm-${ROCM_MAJOR_VERSION}" > /run/localai/capability && \
# I have no idea why, but the ROCM lib packages don't trigger ldconfig after they install, which results in local-ai and others not being able
# to locate the libraries. We run ldconfig ourselves to work around this packaging deficiency
ldconfig \
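The repository setup in the hunk above can be sketched in isolation. This is a simplified dry run, assuming the same repo.radeon.com layout the Dockerfile targets; it writes the generated apt source lines to a temporary file instead of /etc/apt/sources.list.d/rocm.list:

```shell
# Generate the apt source lines the Dockerfile writes, without touching /etc/apt.
ROCM_MAJOR_VERSION=7
ROCM_MINOR_VERSION=1.1
ROCM_VERSION="${ROCM_MAJOR_VERSION}.${ROCM_MINOR_VERSION}"
rocm_list="$(mktemp)"
echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/rocm/apt/${ROCM_VERSION} jammy main" >> "${rocm_list}"
# ROCm 7 additionally pulls packages from the graphics repository.
if [ "${ROCM_MAJOR_VERSION}" -ge 7 ]; then
    echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/graphics/${ROCM_VERSION}/ubuntu jammy main" >> "${rocm_list}"
fi
cat "${rocm_list}"
```

For ROCm 6 the conditional is skipped and only the single rocm/apt line is emitted, which is why the ROCm 6 images can stay on a plain `ubuntu:24.04` base instead of `rocm/dev-ubuntu`.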
14 changes: 7 additions & 7 deletions README.md
@@ -43,7 +43,7 @@

> :bulb: Get help - [❓FAQ](https://localai.io/faq/) [💭Discussions](https://github.com/go-skynet/LocalAI/discussions) [:speech_balloon: Discord](https://discord.gg/uJAeKSAGDy) [:book: Documentation website](https://localai.io/)
>
> [💻 Quickstart](https://localai.io/basics/getting_started/) [🖼️ Models](https://models.localai.io/) [🚀 Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap) [🛫 Examples](https://github.com/mudler/LocalAI-examples) Try on
> [💻 Quickstart](https://localai.io/basics/getting_started/) [🖼️ Models](https://models.localai.io/) [🚀 Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap) [🛫 Examples](https://github.com/mudler/LocalAI-examples) Try on
[![Telegram](https://img.shields.io/badge/Telegram-2CA5E0?style=for-the-badge&logo=telegram&logoColor=white)](https://t.me/localaiofficial_bot)

[![tests](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml)[![Build and Release](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/release.yaml)[![build container images](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)[![Bump dependencies](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml/badge.svg)](https://github.com/go-skynet/LocalAI/actions/workflows/bump_deps.yaml)[![Artifact Hub](https://img.shields.io/endpoint?url=https://artifacthub.io/badge/repository/localai)](https://artifacthub.io/packages/search?repo=localai)
@@ -131,10 +131,10 @@ For more installation options, see [Installer Options](https://localai.io/instal
Or run with docker:

> **💡 Docker Run vs Docker Start**
>
>
> - `docker run` creates and starts a new container. If a container with the same name already exists, this command will fail.
> - `docker start` starts an existing container that was previously created with `docker run`.
>
>
> If you've already run LocalAI before and want to start it again, use: `docker start -i local-ai`

### CPU only image:
@@ -163,7 +163,7 @@ docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-nv
### AMD GPU Images (ROCm):

```bash
docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-gpu-hipblas
docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-gpu-amd-rocm-6
```

### Intel GPU Images (oneAPI):
@@ -194,7 +194,7 @@ docker run -ti --name local-ai -p 8080:8080 --gpus all localai/localai:latest-ai
docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-gpu-intel

# AMD GPU version
docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-aio-gpu-hipblas
docker run -ti --name local-ai -p 8080:8080 --device=/dev/kfd --device=/dev/dri --group-add=video localai/localai:latest-aio-gpu-amd-rocm-6
```

For more information about the AIO images and pre-downloaded models, see [Container Documentation](https://localai.io/basics/container/).
@@ -254,7 +254,7 @@ Roadmap items: [List of issues](https://github.com/mudler/LocalAI/issues?q=is%3A
- 🗣 [Text to Audio](https://localai.io/features/text-to-audio/)
- 🔈 [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
- 🎨 [Image generation](https://localai.io/features/image-generation)
- 🔥 [OpenAI-alike tools API](https://localai.io/features/openai-functions/)
- 🔥 [OpenAI-alike tools API](https://localai.io/features/openai-functions/)
- 🧠 [Embeddings generation for vector databases](https://localai.io/features/embeddings/)
- ✍️ [Constrained grammars](https://localai.io/features/constrained_grammars/)
- 🖼️ [Download Models directly from Huggingface ](https://localai.io/models/)
@@ -362,7 +362,7 @@ Other:
- Github bot which answer on issues, with code and documentation as context https://github.com/JackBekket/GitHelper
- Github Actions: https://github.com/marketplace/actions/start-localai
- Examples: https://github.com/mudler/LocalAI/tree/master/examples/


### 🔗 Resources

25 changes: 23 additions & 2 deletions backend/Dockerfile.llama-cpp
@@ -63,9 +63,13 @@ ARG BUILD_TYPE
ENV BUILD_TYPE=${BUILD_TYPE}
ARG CUDA_MAJOR_VERSION
ARG CUDA_MINOR_VERSION
ARG ROCM_MAJOR_VERSION
ARG ROCM_MINOR_VERSION
ARG SKIP_DRIVERS=false
ENV CUDA_MAJOR_VERSION=${CUDA_MAJOR_VERSION}
ENV CUDA_MINOR_VERSION=${CUDA_MINOR_VERSION}
ENV ROCM_MAJOR_VERSION=${ROCM_MAJOR_VERSION}
ENV ROCM_MINOR_VERSION=${ROCM_MINOR_VERSION}
ENV DEBIAN_FRONTEND=noninteractive
ARG TARGETARCH
ARG TARGETVARIANT
@@ -201,10 +205,27 @@ RUN if [ "${BUILD_TYPE}" = "clblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then \
; fi

RUN if [ "${BUILD_TYPE}" = "hipblas" ] && [ "${SKIP_DRIVERS}" = "false" ]; then \
# Setup for specific ROCm version as described here: https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/install-methods/package-manager/package-manager-ubuntu.html
ROCM_VERSION="${ROCM_MAJOR_VERSION}.${ROCM_MINOR_VERSION}" && \
apt-get update && \
apt-get install -y --no-install-recommends \
hipblas-dev \
rocblas-dev && \
gpg wget && \
mkdir --parents --mode=0755 /etc/apt/keyrings && \
wget -qO - https://repo.radeon.com/rocm/rocm.gpg.key | gpg --yes --dearmor --output /etc/apt/keyrings/rocm.gpg && \
echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/rocm/apt/${ROCM_VERSION} jammy main" >> /etc/apt/sources.list.d/rocm.list && \
if [ "${ROCM_MAJOR_VERSION}" -ge 7 ]; then \
echo "deb [arch=amd64 signed-by=/etc/apt/keyrings/rocm.gpg] https://repo.radeon.com/graphics/${ROCM_VERSION}/ubuntu jammy main" >> /etc/apt/sources.list.d/rocm.list \
; fi && \
echo "Package: *" >> /etc/apt/preferences.d/rocm-pin-600 && \
echo "Pin: release o=repo.radeon.com" >> /etc/apt/preferences.d/rocm-pin-600 && \
echo "Pin-Priority: 600" >> /etc/apt/preferences.d/rocm-pin-600 && \
# End setup steps for specific ROCm version
apt-get update && \
apt-get install -y --no-install-recommends \
# Build dependencies
rocm-developer-tools \
rocm-hip-runtime-dev \
rocm-hip-sdk && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* && \
# I have no idea why, but the ROCM lib packages don't trigger ldconfig after they install, which results in local-ai and others not being able
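With the build args plumbed through, the backend image can be built locally for either ROCm line. A hypothetical invocation sketch: the image tag is illustrative, while the build args and Dockerfile path match this changeset:

```shell
# Compose a local build command for the ROCm 7 llama-cpp backend image.
# The -t tag name is an illustrative assumption; the --build-arg names
# match the ARG declarations in backend/Dockerfile.llama-cpp above.
build_cmd="docker build -f backend/Dockerfile.llama-cpp \
  --build-arg BUILD_TYPE=hipblas \
  --build-arg ROCM_MAJOR_VERSION=7 \
  --build-arg ROCM_MINOR_VERSION=1.1 \
  -t local-ai-llama-cpp:gpu-amd-rocm-7 ."
echo "${build_cmd}"
```

Swapping the two ROCm args to `6` and `4.3` reproduces the ROCm 6 variant built by the CI matrix.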