🛠 A lite C++ toolkit of awesome AI models, supporting ONNXRuntime, MNN, TNN, NCNN and TensorRT.
An OBS plugin for removing the background from portrait images and video, making it easy to replace the background when recording or streaming.
A simple library that speeds up CLIP inference by up to 3x (on a K80 GPU).
Python ONNX inference sample for "PyTorch Implementation of AnimeGANv2".
Python scripts for performing 6D pose estimation and shape reconstruction using the CenterSnap model in ONNX
Demo that overlays face images generated with "PyTorch Implementation of AnimeGANv2" onto the original image.
Supercharge Your PyTorch Image Models: Bag of Tricks to 8x Faster Inference with ONNX Runtime & Optimizations (see the export sketch after this list).
Tennis match analysis via computer vision techniques.
Tools for simple inference testing of ONNX models using the TensorRT, CUDA, OpenVINO (CPU/GPU) and CPU execution providers (see the provider-comparison sketch after this list).
Python scripts performing semantic segmentation using the TopFormer model in ONNX.
Drop-in replacement for onnxruntime-node with GPU support using CUDA or DirectML
Python scripts for performing Image Inpainting using the MST model in ONNX
Text Detection and Recognition using ONNX
Headless Jetson Nano setup (no monitor) for a JetBot build. Includes installation of JupyterLab, ROS2 Dashing, Torch, Torch2trt, ONNX, ONNXRuntime-GPU and TensorFlow. JupyterLab runs without a Docker container.
YOLOv8 inference using ONNX Runtime (see the YOLOv8 sketch after this list).
Contains ROS2 packages to drive a robot car with a gamepad while collecting annotated camera images, plus packages to run the car autonomously with a trained neural network.
Perform inference with the TwinLiteNet model using ONNX Runtime. TwinLiteNet is a lightweight and efficient deep learning model designed for drivable-area and lane segmentation.
A deepfake tool that swaps and restores faces in images and videos using the InsightFace and GFPGAN solutions.
EfficientViTSAM inference using ONNXRuntime
This project makes it easier to run detections with ONNX models, accelerated by CUDA.
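For the "Bag of Tricks" entry above, a minimal sketch of the export-and-run part of that workflow, assuming a torchvision ResNet-50 as a stand-in model; the file name, opset version, and input shape are illustrative, not taken from that repository.

```python
# Hedged sketch: export a PyTorch image model to ONNX, then run it with
# ONNX Runtime's CUDA execution provider and full graph optimizations.
# resnet50, "resnet50.onnx", opset 17 and the 224x224 input are assumptions.
import numpy as np
import torch
import torchvision
import onnxruntime as ort

model = torchvision.models.resnet50(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "resnet50.onnx",
                  input_names=["input"], output_names=["logits"],
                  opset_version=17)

opts = ort.SessionOptions()
opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL
session = ort.InferenceSession(
    "resnet50.onnx", sess_options=opts,
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"])

logits = session.run(None, {"input": dummy.numpy()})[0]
print(logits.shape)  # (1, 1000)
```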
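For the inference-testing entry, a hedged sketch of how such a comparison might look with the ONNX Runtime Python API: the same model is timed under each execution-provider stack that the installed build actually exposes. "model.onnx" and the 1x3x224x224 input are placeholders.

```python
# Hedged sketch: time one ONNX model under several execution providers.
# "model.onnx" and the input shape are placeholders, not from any repo above.
import time
import numpy as np
import onnxruntime as ort

candidate_stacks = [
    ["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
    ["CUDAExecutionProvider", "CPUExecutionProvider"],
    ["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    ["CPUExecutionProvider"],
]
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

for providers in candidate_stacks:
    if providers[0] not in ort.get_available_providers():
        continue  # skip providers not compiled into this onnxruntime build
    sess = ort.InferenceSession("model.onnx", providers=providers)
    input_name = sess.get_inputs()[0].name
    sess.run(None, {input_name: x})  # warm-up (engine build / kernel selection)
    start = time.perf_counter()
    for _ in range(50):
        sess.run(None, {input_name: x})
    print(f"{providers[0]}: {(time.perf_counter() - start) / 50 * 1000:.2f} ms/run")
```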
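For the YOLOv8 entry, a minimal sketch of GPU inference with onnxruntime, assuming a yolov8n.onnx exported with the Ultralytics CLI (`yolo export model=yolov8n.pt format=onnx`), a 640x640 input, and a local "image.jpg"; box decoding and NMS are omitted.

```python
# Hedged sketch: minimal YOLOv8 ONNX inference with onnxruntime-gpu.
# "yolov8n.onnx", "image.jpg" and the 640x640 input size are assumptions.
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "yolov8n.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"])

img = cv2.imread("image.jpg")
blob = cv2.resize(img, (640, 640))[:, :, ::-1]                    # BGR -> RGB
blob = blob.transpose(2, 0, 1)[None].astype(np.float32) / 255.0   # NCHW, 0..1

outputs = session.run(None, {session.get_inputs()[0].name: blob})
print(outputs[0].shape)  # e.g. (1, 84, 8400); box decoding + NMS come next
```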