Documentation on how to use ONNX models #108
Comments
It does not help at all. Where should I put the ONNX models? What bits and pieces of code should I change? Thanks
No, it does not support reading ONNX models directly; it reads a Paddle model, converts it into ONNX, and then runs inference.
I don't quite understand.
When you specify PaddleDevice.Onnx(), it converts the Paddle model into an ONNX model in memory, and then runs inference on that in-memory ONNX model.
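A minimal sketch of what using that backend might look like, assuming the Sdcb.PaddleOCR packages from the PaddleSharp ecosystem (the PaddleOcrAll, LocalFullModels, and OpenCvSharp names here are assumptions not confirmed by this thread; only PaddleDevice.Onnx() is):

```csharp
using OpenCvSharp;
using Sdcb.PaddleInference;
using Sdcb.PaddleOCR;
using Sdcb.PaddleOCR.Models;
using Sdcb.PaddleOCR.Models.Local;

// Pick a bundled (offline) detection/classification/recognition model set.
FullOcrModel model = LocalFullModels.EnglishV3;

// PaddleDevice.Onnx(): the Paddle model is converted to ONNX in memory
// and inference runs through the ONNX backend instead of mkldnn.
using PaddleOcrAll all = new PaddleOcrAll(model, PaddleDevice.Onnx());

using Mat src = Cv2.ImRead("sample.jpg");
PaddleOcrResult result = all.Run(src);
Console.WriteLine(result.Text);
```

Note that no separate .onnx files are placed on disk; the conversion happens transparently when the device is selected.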
I assume that would slow down the process quite a lot. So there is no added value in using ONNX, I imagine.
It should be quite fast; I have noticed that in most scenarios the speed is good, even compared to mkldnn.
Correct me if I am wrong: I would have to use the online models. Am I right?
Feature request type
sample request
Is your feature request related to a problem? Please describe
In the documentation there is always a reference to Mkldnn usage but, apparently, the device also supports ONNX. I don't seem to be able to find any sample code or explanation of how to use it.
I've fiddled with the code, replacing the device, but it does not work.
Describe the solution you'd like
Some sample code and some documentation for the ONNX integration.
Describe alternatives you've considered
RapidOCR (c# integration) which provides a solution for ONNX models.
Additional context
No response