I would like to load ONNX models and run them (on CPU), mostly for testing: I'd like to compare the outputs against those of my C++ implementations. Not urgent.
In MXNet, at least, loading ONNX models is supported only via the Python API. Not sure about other frameworks.
However, you can write a simple script that converts the ONNX model to the MXNet format (see the sketch below); the converted model can then be loaded through MXNet's other language APIs, such as Scala or C++, and there is a Java API coming soon. You can also use MXNet Model Server, which supports ONNX out of the box (no conversion needed).
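A minimal sketch of such a conversion script, assuming MXNet 1.2+ (which ships the `mxnet.contrib.onnx` module) and a placeholder input file `model.onnx`:

```python
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

# Import the ONNX graph as an MXNet symbol plus parameter dicts.
# 'model.onnx' is a placeholder for your own file.
sym, arg_params, aux_params = onnx_mxnet.import_model('model.onnx')

# Save the symbol as JSON.
sym.save('model-symbol.json')

# Save the parameters with the standard 'arg:'/'aux:' prefixes used by
# MXNet checkpoints, so other language bindings can load them.
params = {'arg:%s' % name: nd for name, nd in arg_params.items()}
params.update({'aux:%s' % name: nd for name, nd in aux_params.items()})
mx.nd.save('model-0000.params', params)
```

The resulting `model-symbol.json` / `model-0000.params` pair is the standard MXNet checkpoint layout, which the non-Python bindings (e.g. the C++ predict API or Scala) know how to load.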
In particular, I'd like to load an .onnx file in C++. Thanks.