ci: 1.20.1 release (#70)
* ci: 1.20.1 release
kibae authored Nov 24, 2024
1 parent e1b8546 commit cdd01c8
Showing 9 changed files with 27 additions and 21 deletions.
20 changes: 13 additions & 7 deletions README.md
@@ -1,6 +1,6 @@
# ONNX Runtime Server

-[![ONNX Runtime](https://img.shields.io/github/v/release/microsoft/onnxruntime?filter=v1.20.0&label=ONNX%20Runtime)](https://github.com/microsoft/onnxruntime)
+[![ONNX Runtime](https://img.shields.io/github/v/release/microsoft/onnxruntime?filter=v1.20.1&label=ONNX%20Runtime)](https://github.com/microsoft/onnxruntime)
[![CMake on Linux](https://github.com/kibae/onnxruntime-server/actions/workflows/cmake-linux.yml/badge.svg)](https://github.com/kibae/onnxruntime-server/actions/workflows/cmake-linux.yml)
[![CMake on MacOS](https://github.com/kibae/onnxruntime-server/actions/workflows/cmake-macos.yml/badge.svg)](https://github.com/kibae/onnxruntime-server/actions/workflows/cmake-macos.yml)
[![License](https://img.shields.io/github/license/kibae/onnxruntime-server)](https://github.com/kibae/onnxruntime-server/blob/main/LICENSE)
@@ -68,9 +68,15 @@ brew install onnxruntime
#### Ubuntu/Debian

```shell
-sudo apt install cmake pkg-config libboost-all-dev libssl-dev
-# optional, for Nvidia GPU support
-sudo apt install nvidia-cuda-toolkit nvidia-cudnn
+sudo apt install cmake pkg-config libboost-all-dev libssl-dev
+```
+
+##### (optional) CUDA support (CUDA 12.x, cuDNN 9.x)
+- Follow the instructions below to install the CUDA Toolkit and cuDNN.
+- [CUDA Toolkit Installation Guide](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html)
+- [CUDA Download for Ubuntu](https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=Ubuntu&target_version=22.04&target_type=deb_network)
+```shell
+sudo apt install cuda-toolkit-12 libcudnn9-dev-cuda-12
# optional, for Nvidia GPU support with Docker
sudo apt install nvidia-container-toolkit
```
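After installing the CUDA packages above, a quick sanity check can confirm the toolkit and cuDNN are visible. This is a sketch, not part of the commit; exact package and driver names vary by Ubuntu release.

```shell
nvcc --version            # CUDA compiler shipped with cuda-toolkit-12
nvidia-smi                # confirms the NVIDIA driver can see the GPU
dpkg -l | grep -i cudnn   # lists the installed libcudnn9 packages
```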
@@ -158,11 +164,11 @@ sudo cmake --install build --prefix /usr/local/onnxruntime-server
# Docker

- Docker hub: [kibaes/onnxruntime-server](https://hub.docker.com/r/kibaes/onnxruntime-server)
-  - [`1.20.0-linux-cuda12`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cuda12.dockerfile) amd64(CUDA 12.x, cuDNN 9.x)
-  - [`1.20.0-linux-cpu`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cpu.dockerfile) amd64, arm64
+  - [`1.20.1-linux-cuda12`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cuda12.dockerfile) amd64(CUDA 12.x, cuDNN 9.x)
+  - [`1.20.1-linux-cpu`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cpu.dockerfile) amd64, arm64

```shell
-DOCKER_IMAGE=kibaes/onnxruntime-server:1.20.0-linux-cuda12 # or kibaes/onnxruntime-server:1.20.0-linux-cpu
+DOCKER_IMAGE=kibaes/onnxruntime-server:1.20.1-linux-cuda12 # or kibaes/onnxruntime-server:1.20.1-linux-cpu

docker pull ${DOCKER_IMAGE}

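The rest of this shell block (the `docker run` command) is truncated in the diff. Below is a minimal run sketch, not the project's documented command: it assumes the container serves HTTP on port 80 as in the docker-compose examples later in this commit, while the host port and the `/app/models` mount path are illustrative, and `--gpus all` applies only to the `*-linux-cuda12` image.

```shell
# Hypothetical invocation: host port 8080 -> container port 80 (HTTP backend),
# with an assumed model directory mounted into the container.
docker run -d --name onnxruntime-server \
  --gpus all \
  -p 8080:80 \
  -v /path/to/onnx/models:/app/models \
  ${DOCKER_IMAGE}
```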
2 changes: 1 addition & 1 deletion deploy/build-docker/README.md
@@ -2,7 +2,7 @@

## x64 with CUDA

-- [ONNX Runtime Binary](https://github.com/microsoft/onnxruntime/releases) v1.20.0(latest) requires CUDA 11/12, cudnn 8/9.
+- [ONNX Runtime Binary](https://github.com/microsoft/onnxruntime/releases) v1.20.1(latest) requires CUDA 11/12, cudnn 8/9.
```
$ ldd libonnxruntime_providers_cuda.so
linux-vdso.so.1 (0x00007fffa4bf8000)
2 changes: 1 addition & 1 deletion deploy/build-docker/VERSION
@@ -1,2 +1,2 @@
-export VERSION=1.20.0
+export VERSION=1.20.1
export IMAGE_PREFIX=kibaes/onnxruntime-server
4 changes: 2 additions & 2 deletions deploy/build-docker/docker-compose.yaml
@@ -5,7 +5,7 @@ services:
  onnxruntime_server_simple:
    # After the docker container is up, you can use the REST API (http://localhost:8080).
    # API documentation will be available at http://localhost:8080/api-docs.
-    image: kibaes/onnxruntime-server:1.20.0-linux-cuda12
+    image: kibaes/onnxruntime-server:1.20.1-linux-cuda12
    ports:
      - "8080:80" # for http backend
    volumes:
@@ -29,7 +29,7 @@ services:
  onnxruntime_server_advanced:
    # After the docker container is up, you can use the REST API (http://localhost, https://localhost).
    # API documentation will be available at http://localhost/api-docs.
-    image: kibaes/onnxruntime-server:1.20.0-linux-cuda12
+    image: kibaes/onnxruntime-server:1.20.1-linux-cuda12
    ports:
      - "80:80" # for http backend
      - "443:443" # for https backend
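Either service defined above can be started with the standard Compose workflow; a minimal sketch, run from the directory containing this docker-compose.yaml:

```shell
docker compose up -d onnxruntime_server_simple    # or onnxruntime_server_advanced
docker compose logs -f onnxruntime_server_simple  # follow the server logs
```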
2 changes: 1 addition & 1 deletion deploy/build-docker/linux-cpu.dockerfile
@@ -17,7 +17,7 @@ RUN case ${TARGETPLATFORM} in \
esac

RUN cmake -DBoost_USE_STATIC_LIBS=ON -DOPENSSL_USE_STATIC_LIBS=ON -B build -S . -DCMAKE_BUILD_TYPE=Release
-RUN cmake --build build --parallel 4 --target onnxruntime_server_standalone
+RUN cmake --build build --parallel 8 --target onnxruntime_server_standalone
RUN cmake --install build --prefix /app/onnxruntime-server

# target
2 changes: 1 addition & 1 deletion deploy/build-docker/linux-cuda12.dockerfile
@@ -16,7 +16,7 @@ RUN case ${TARGETPLATFORM} in \
esac

RUN cmake -DCUDA_SDK_ROOT_DIR=/usr/local/cuda-12 -DBoost_USE_STATIC_LIBS=ON -DOPENSSL_USE_STATIC_LIBS=ON -B build -S . -DCMAKE_BUILD_TYPE=Release
-RUN cmake --build build --parallel 4 --target onnxruntime_server_standalone
+RUN cmake --build build --parallel 8 --target onnxruntime_server_standalone
RUN cmake --install build --prefix /app/onnxruntime-server

# target
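To build these images locally rather than pulling them from Docker Hub, a sketch using the dockerfile paths shown above (run from the repository root; the image tag is illustrative):

```shell
# CPU-only image (amd64 or arm64); swap in linux-cuda12.dockerfile for the CUDA build.
docker buildx build \
  --platform linux/amd64 \
  -f deploy/build-docker/linux-cpu.dockerfile \
  -t onnxruntime-server:local \
  .
```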
12 changes: 6 additions & 6 deletions docs/docker.md
@@ -5,8 +5,8 @@

# Supported tags and respective Dockerfile links

-- [`1.20.0-linux-cuda12`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cuda12.dockerfile) amd64(CUDA 12.x, cuDNN 9.x)
-- [`1.20.0-linux-cpu`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cpu.dockerfile) amd64, arm64
+- [`1.20.1-linux-cuda12`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cuda12.dockerfile) amd64(CUDA 12.x, cuDNN 9.x)
+- [`1.20.1-linux-cpu`](https://github.com/kibae/onnxruntime-server/blob/main/deploy/build-docker/linux-cpu.dockerfile) amd64, arm64

# How to use this image

@@ -28,7 +28,7 @@
- API documentation will be available at http://localhost/api-docs.

```shell
-DOCKER_IMAGE=kibaes/onnxruntime-server:1.20.0-linux-cuda12 # or kibaes/onnxruntime-server:1.20.0-linux-cpu
+DOCKER_IMAGE=kibaes/onnxruntime-server:1.20.1-linux-cuda12 # or kibaes/onnxruntime-server:1.20.1-linux-cpu

docker pull ${DOCKER_IMAGE}

@@ -69,7 +69,7 @@ services:
  onnxruntime_server_simple:
    # After the docker container is up, you can use the REST API (http://localhost:8080).
    # API documentation will be available at http://localhost:8080/api-docs.
-    image: kibaes/onnxruntime-server:1.20.0-linux-cuda12
+    image: kibaes/onnxruntime-server:1.20.1-linux-cuda12
    ports:
      - "8080:80" # for http backend
    volumes:
@@ -100,8 +100,8 @@ services:

  onnxruntime_server_advanced:
    # After the docker container is up, you can use the REST API (http://localhost, https://localhost).
    # API documentation will be available at http://localhost/api-docs.
-    image: kibaes/onnxruntime-server:1.20.0-linux-cuda12
+    image: kibaes/onnxruntime-server:1.20.1-linux-cuda12
    ports:
      - "80:80" # for http backend
      - "443:443" # for https backend
2 changes: 1 addition & 1 deletion docs/swagger/openapi.yaml
@@ -2,7 +2,7 @@ openapi: 3.0.3
info:
  title: ONNX Runtime Server
  description: |-
-  version: 1.20.0
+  version: 1.20.1
externalDocs:
  description: ONNX Runtime Server
  url: https://github.com/kibae/onnxruntime-server
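Once a container from this image is running, the API documentation endpoint mentioned in the compose comments can serve as a quick smoke test for the release. A sketch, assuming the HTTP backend is mapped to host port 8080 as in the earlier example:

```shell
# Expect a 200 status from the Swagger/API documentation endpoint.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/api-docs
```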
2 changes: 1 addition & 1 deletion src/test/test_lib_version.cpp
@@ -6,5 +6,5 @@
#include "./test_common.hpp"

TEST(test_lib_version, LibVersion) {
-	EXPECT_EQ(onnxruntime_server::onnx::version(), "1.20.0");
+	EXPECT_EQ(onnxruntime_server::onnx::version(), "1.20.1");
}
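A sketch of how this version check might be exercised locally, assuming the standard CMake/CTest workflow used by the build commands above; the test name filter and build options are assumptions and may differ from the project's actual test setup.

```shell
cmake -B build -S . -DCMAKE_BUILD_TYPE=Debug
cmake --build build --parallel 8
ctest --test-dir build --output-on-failure -R test_lib_version
```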
