zetton-inference-tensorrt

Introduction

zetton-inference-tensorrt is an open-source extension of the zetton-inference package that enables deep learning inference with the TensorRT framework. It is part of Project Zetton.
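
For orientation, the sketch below shows what a bare inference pass looks like with the standard TensorRT C++ API (nvinfer1), assuming TensorRT 8.x. It is not the zetton-inference-tensorrt API; the engine path, binding order, and buffer sizes are placeholder assumptions. See get_started.md for how this package actually wraps these steps.

```cpp
// Minimal sketch: deserialize a TensorRT engine and run one inference pass.
// File name "yolov5.engine" and tensor shapes are illustrative placeholders.
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <memory>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
  void log(Severity severity, const char* msg) noexcept override {
    if (severity <= Severity::kWARNING) std::printf("%s\n", msg);
  }
};

int main() {
  Logger logger;

  // Load a serialized engine from disk (path is a placeholder).
  std::ifstream file("yolov5.engine", std::ios::binary);
  std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                         std::istreambuf_iterator<char>());

  // Deserialize the engine and create an execution context.
  auto runtime = std::unique_ptr<nvinfer1::IRuntime>(
      nvinfer1::createInferRuntime(logger));
  auto engine = std::unique_ptr<nvinfer1::ICudaEngine>(
      runtime->deserializeCudaEngine(blob.data(), blob.size()));
  auto context = std::unique_ptr<nvinfer1::IExecutionContext>(
      engine->createExecutionContext());

  // Device buffers for one input and one output binding; real sizes should
  // be queried from the engine rather than hard-coded like this.
  void* buffers[2];
  cudaMalloc(&buffers[0], 1 * 3 * 640 * 640 * sizeof(float));
  cudaMalloc(&buffers[1], 1 * 25200 * 85 * sizeof(float));

  // Run inference asynchronously on a CUDA stream.
  cudaStream_t stream;
  cudaStreamCreate(&stream);
  context->enqueueV2(buffers, stream, nullptr);
  cudaStreamSynchronize(stream);

  // ... copy the output back with cudaMemcpyAsync and post-process ...

  cudaFree(buffers[0]);
  cudaFree(buffers[1]);
  cudaStreamDestroy(stream);
  return 0;
}
```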

What's New

Please refer to changelog.md for details and release history.

For compatibility changes between different versions of zetton-inference-tensorrt, please refer to compatibility.md.

Installation

Please refer to Installation for installation instructions.

Getting Started

Please see get_started.md for the basic usage of zetton-inference-tensorrt.

Overview of Benchmark and Model Zoo

NVIDIA Jetson Xavier NX

| Task      | Model     | FP32 | FP16 | INT8 |
| --------- | --------- | ---- | ---- | ---- |
| Detection | YOLOv5    |      |      |      |
| Detection | YOLOX     |      |      |      |
| Detection | YOLOv7    |      |      |      |
| Tracking  | DeepSORT  |      |      |      |
| Tracking  | ByteTrack |      |      |      |

x86 with NVIDIA A6000 GPU

| Task      | Model     | FP32 | FP16 | INT8 |
| --------- | --------- | ---- | ---- | ---- |
| Detection | YOLOv5    |      |      |      |
| Detection | YOLOX     |      |      |      |
| Detection | YOLOv7    |      |      |      |
| Tracking  | DeepSORT  |      |      |      |
| Tracking  | ByteTrack |      |      |      |
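
The FP32/FP16/INT8 columns above correspond to the precision mode the TensorRT engine is built with. As a rough illustration of how such engines are typically produced with the TensorRT builder API (not how zetton-inference-tensorrt builds its own engines), a hedged sketch follows; the ONNX and engine file names are placeholders, and INT8 additionally requires a calibrator or per-tensor dynamic ranges.

```cpp
// Sketch: build a serialized TensorRT engine from an ONNX model,
// enabling reduced precision via builder flags (TensorRT 8.x style).
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <cstdio>
#include <fstream>
#include <memory>

int main() {
  class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
      if (severity <= Severity::kWARNING) std::printf("%s\n", msg);
    }
  } logger;

  auto builder = std::unique_ptr<nvinfer1::IBuilder>(
      nvinfer1::createInferBuilder(logger));
  auto network = std::unique_ptr<nvinfer1::INetworkDefinition>(
      builder->createNetworkV2(
          1U << static_cast<uint32_t>(
              nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH)));

  // Parse the ONNX model (file name is a placeholder).
  auto parser = std::unique_ptr<nvonnxparser::IParser>(
      nvonnxparser::createParser(*network, logger));
  parser->parseFromFile(
      "yolov5s.onnx",
      static_cast<int>(nvinfer1::ILogger::Severity::kWARNING));

  auto config = std::unique_ptr<nvinfer1::IBuilderConfig>(
      builder->createBuilderConfig());
  // FP32 is the default; enable reduced precision explicitly.
  config->setFlag(nvinfer1::BuilderFlag::kFP16);
  // For INT8, also provide calibration data:
  // config->setFlag(nvinfer1::BuilderFlag::kINT8);
  // config->setInt8Calibrator(&calibrator);

  // Build and serialize the engine to disk.
  auto serialized = std::unique_ptr<nvinfer1::IHostMemory>(
      builder->buildSerializedNetwork(*network, *config));
  std::ofstream out("yolov5s_fp16.engine", std::ios::binary);
  out.write(static_cast<const char*>(serialized->data()), serialized->size());
  return 0;
}
```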

FAQ

Please refer to FAQ for frequently asked questions.

Contributing

We appreciate all contributions to improve zetton-inference-tensorrt. Please refer to CONTRIBUTING.md for the contributing guidelines.

Acknowledgement

We appreciate all the contributors who implement their methods or add new features, as well as users who give valuable feedback. We hope that the package and benchmark can serve the growing research and production community by providing a flexible toolkit to deploy models.

License

  • For academic use, this project is licensed under the 2-clause BSD License; please see the LICENSE file for details.

  • For commercial use, please contact Yusu Pan.

Related Projects
