An ncnn implementation of YOLOv5 for ARM devices, capable of using the GPU to accelerate inference.
- Ubuntu 18.04 (x86_64)
- Ubuntu 16.04 (aarch64)
- OpenCV 3.2.0
- CMake 3.10.0
-
The project itself should be compiled on the ARM device.
-
Install OpenCV.
sudo apt-get install libopencv-dev
-
cd YOLOv5ncnn
-
Edit "CMakeLists.txt" to configure the include and library paths correctly.
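As a reference, the following is a minimal sketch of what typically has to be adjusted in "CMakeLists.txt". The install path and source file names here are assumptions for illustration; they must match your own ncnn build and project layout:

```cmake
cmake_minimum_required(VERSION 3.10)
project(YOLOv5ncnn)

# Hypothetical path: point this at your own cross-compiled ncnn install
set(NCNN_INSTALL_DIR /home/username/ncnn/build-aarch64-linux/install)

find_package(OpenCV REQUIRED)

include_directories(${NCNN_INSTALL_DIR}/include/ncnn)
link_directories(${NCNN_INSTALL_DIR}/lib)

# Hypothetical source file name
add_executable(YOLOv5ncnn src/main.cpp)
target_link_libraries(YOLOv5ncnn ncnn ${OpenCV_LIBS})
```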
-
Compile and run.
cd build
cmake ..
make
./../bin/YOLOv5ncnn
-
ncnn should be cross-compiled on the x86 device.
-
Install OpenCV.
sudo apt-get install libopencv-dev
-
Install protobuf.
sudo apt install protobuf-compiler libprotobuf-dev
-
Download the ncnn source code from https://github.com/Tencent/ncnn/releases.
unzip ncnn-master.zip
-
Download the gcc-arm toolchain and add it to the environment variables.
tar -xvf gcc-arm-8.2-2018.11-x86_64-aarch64-linux-gnu.tar.xz
gedit ~/.bashrc
export PATH=$PATH:/home/username/gcc-arm-8.2-2018.11-x86_64-aarch64-linux-gnu/bin
source ~/.bashrc
-
Compile ncnn.
cd ncnn
mkdir -p build-aarch64-linux
cd build-aarch64-linux
cmake -DCMAKE_TOOLCHAIN_FILE=../toolchains/aarch64-linux-gnu.toolchain.cmake -DANDROID=ON ..
make -j8
make install
-
The project itself should be compiled on the ARM device.
-
Install OpenCV.
sudo apt-get install libopencv-dev
-
cd YOLOv5ncnn-vulkan
-
Edit "CMakeLists.txt" to configure the include and library paths correctly.
-
Compile and run.
cd build
cmake ..
make
./../bin/YOLOv5ncnn-vulkan
-
ncnn-vulkan should be cross-compiled on the x86 device.
-
Install protobuf.
sudo apt install protobuf-compiler libprotobuf-dev
-
Install OpenCV.
sudo apt-get install libopencv-dev
-
Download the Vulkan SDK from https://vulkan.lunarg.com/sdk/home#sdk/downloadConfirm/1.2.148.0/linux/vulkansdk-linux-x86_64-1.2.148.0.tar.gz and add it to the environment variables (a reboot may be needed).
export VULKAN_SDK=~/vulkan-sdk-1.2.148.0/x86_64
export PATH=$PATH:$VULKAN_SDK/bin
export LIBRARY_PATH=$LIBRARY_PATH:$VULKAN_SDK/lib
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$VULKAN_SDK/lib
export VK_LAYER_PATH=$VULKAN_SDK/etc/vulkan/explicit_layer.d
-
Download the ncnn source code from https://github.com/Tencent/ncnn/releases.
unzip ncnn-master.zip
-
Download the gcc-arm toolchain and add it to the environment variables.
tar -xvf gcc-arm-8.2-2018.11-x86_64-aarch64-linux-gnu.tar.xz
gedit ~/.bashrc
export PATH=$PATH:/home/username/gcc-arm-8.2-2018.11-x86_64-aarch64-linux-gnu/bin
source ~/.bashrc
-
Compile ncnn-vulkan.
cd ncnn
mkdir -p build-aarch64-linux-vulkan
cd build-aarch64-linux-vulkan
cmake -DCMAKE_TOOLCHAIN_FILE=../toolchains/aarch64-linux-gnu.toolchain.cmake -DANDROID=ON -DNCNN_VULKAN=ON ..
make -j8
make install
-
In order to compile the project correctly on ARM devices, two additional static libraries (libvulkan-sdk.a and libvulkan-stub.a) are needed; they can be obtained from [ARM-software/vulkan-sdk](https://github.com/ARM-software/vulkan-sdk).
We train a model in PyTorch, convert it first to ONNX, and then to ncnn.
-
For how to train in PyTorch and export to ONNX, see https://github.com/ultralytics/yolov5.
-
Because ncnn has limited operator support, the network definition needs to be modified before training; please modify "common.py".
from
class Focus(nn.Module):
    def __init__(self, c1, c2, k=1, s=1, p=None, g=1, act=True):
        super(Focus, self).__init__()
        self.conv = Conv(c1 * 4, c2, k, s, p, g, act)

    def forward(self, x):
        return self.conv(torch.cat([x[..., ::2, ::2], x[..., 1::2, ::2],
                                    x[..., ::2, 1::2], x[..., 1::2, 1::2]], 1))
to
class Focus(nn.Module):
    def __init__(self, c1, c2, k=1, s=1, p=None, g=1, act=True):
        super(Focus, self).__init__()
        self.conv = Conv(c1 * 4, c2, k, s, p, g, act)

    def forward(self, x):
        return self.conv(torch.cat([torch.nn.functional.interpolate(x, scale_factor=0.5),
                                    torch.nn.functional.interpolate(x, scale_factor=0.5),
                                    torch.nn.functional.interpolate(x, scale_factor=0.5),
                                    torch.nn.functional.interpolate(x, scale_factor=0.5)], 1))
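Why this substitution is safe shape-wise: the slicing version is a space-to-depth rearrangement, while the interpolate version is a plain downsample, but both turn an (N, C, H, W) tensor into (N, 4*C, H/2, W/2), so the following Conv(c1 * 4, ...) is unchanged. A minimal sketch of the shape check, using NumPy instead of PyTorch purely for illustration:

```python
import numpy as np

# Dummy input: batch of 2, 3 channels, 8x8 spatial (N, C, H, W)
x = np.arange(2 * 3 * 8 * 8, dtype=np.float32).reshape(2, 3, 8, 8)

# Slicing version: four interleaved sub-grids stacked along channels
sliced = np.concatenate([x[..., ::2, ::2], x[..., 1::2, ::2],
                         x[..., ::2, 1::2], x[..., 1::2, 1::2]], axis=1)

# Interpolate version: with nearest-neighbour downsampling by 0.5,
# each copy keeps every other pixel, so all four copies are identical
down = x[..., ::2, ::2]
interp = np.concatenate([down, down, down, down], axis=1)

print(sliced.shape, interp.shape)  # both (2, 12, 4, 4)
```

Note that the two versions are not numerically equivalent (the interpolate version discards the three odd-indexed sub-grids), which is why the network has to be retrained after the change.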
-
When exporting to ONNX, the Detect layer should be removed from the graph; please modify "export.py".
model.model[-1].export = True
-
Simplify the ONNX model with onnx-simplifier.
pip3 install onnx-simplifier
python3 -m onnxsim yolov5s.onnx yolov5s.onnx
-
Convert ONNX to ncnn.
./onnx2ncnn yolov5s.onnx yolov5s.param yolov5s.bin