Fast YOLOv3 CPU Inference Using ONNX Runtime #7317
matankley started this conversation in Show and tell
Hi,
I would like to thank you and share a few insights I observed while testing CPU inference of YOLOv3 object detection models with opencv-dnn and onnxruntime.
ONNX Runtime inference is significantly faster than opencv-dnn and Darknet, the frameworks commonly used to run YOLOv3 models.
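As a minimal sketch of what the ONNX Runtime path looks like (the file name `yolov3.onnx`, the 1x3x416x416 input shape, and a single image input are assumptions; some exports, such as the ONNX model zoo variant, take an extra `image_shape` input):

```python
import time

import numpy as np
import onnxruntime as ort

# "yolov3.onnx" is a placeholder path; a single NCHW float32 image input
# is assumed here, and the exact layout varies between YOLOv3 exports.
session = ort.InferenceSession("yolov3.onnx", providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)

# Warm up so one-time initialization does not skew the timing.
for _ in range(5):
    session.run(None, {input_name: dummy})

runs = 50
start = time.perf_counter()
for _ in range(runs):
    session.run(None, {input_name: dummy})
elapsed = time.perf_counter() - start
print(f"ONNX Runtime: {elapsed / runs * 1000:.1f} ms per frame")
```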
Here is a link to the full article: https://medium.com/towards-artificial-intelligence/yolov3-cpu-inference-performance-comparison-onnx-opencv-darknet-6764f2bde33e
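For comparison, a rough opencv-dnn baseline under the same conditions might look like the sketch below (`yolov3.cfg` and `yolov3.weights` are placeholders for the standard Darknet release files):

```python
import time

import cv2
import numpy as np

# "yolov3.cfg" and "yolov3.weights" are placeholders for the standard
# Darknet config and weights files for YOLOv3.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_OPENCV)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)

# Same 416x416 input size as the ONNX Runtime sketch above.
image = np.random.randint(0, 255, (416, 416, 3), dtype=np.uint8)
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB=True, crop=False)
out_names = net.getUnconnectedOutLayersNames()

# Warm up, then time the forward pass only.
for _ in range(5):
    net.setInput(blob)
    net.forward(out_names)

runs = 50
start = time.perf_counter()
for _ in range(runs):
    net.setInput(blob)
    net.forward(out_names)
elapsed = time.perf_counter() - start
print(f"opencv-dnn: {elapsed / runs * 1000:.1f} ms per frame")
```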
I hope this helps you achieve reasonable performance on low-powered devices.