Add support for loading onnx files with the tensorRT backend #4594
Comments
I think what you want is already implemented:
@joihn ORT is short for onnxruntime. What I'm asking for is to use TensorRT with ONNX files, but without onnxruntime.
+1
@tanmayv25 thoughts?
I would not like to complicate the TensorRT backend to consume ONNX files and own the conversion within the TensorRT backend. ORT already supports the TRT execution provider.
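For context, the TRT execution provider mentioned above can already be enabled through the onnxruntime backend's model configuration rather than the TensorRT backend. A minimal sketch, assuming a hypothetical model named `my_onnx_model` (the accelerator name and parameter keys follow Triton's onnxruntime backend documentation; the precision and cache settings are illustrative):

```
name: "my_onnx_model"
platform: "onnxruntime_onnx"
optimization {
  execution_accelerators {
    gpu_execution_accelerator : [
      {
        name : "tensorrt"
        parameters { key: "precision_mode" value: "FP16" }
        parameters { key: "trt_engine_cache_enable" value: "true" }
      }
    ]
  }
}
```

With engine caching enabled, onnxruntime builds the TensorRT engine on first load and reuses it afterwards, which approximates the build-on-warmup-and-cache behavior requested in this issue.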
Closing this issue due to lack of activity. If this issue needs follow-up, please let us know and we can reopen it for you. |
Describe the solution you'd like
Be able to use an ONNX file directly in the TensorRT backend, given that TensorRT has an ONNX parser. The backend would build the engine on warmup and cache it.
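Today the TensorRT backend only accepts serialized engines, so the usual workaround is to convert the ONNX model offline (for example with `trtexec --onnx=model.onnx --saveEngine=model.plan`) and point the Triton model configuration at the resulting plan. A minimal sketch, assuming a hypothetical model named `my_model` whose input/output names, types, and dims are placeholders:

```
name: "my_model"
platform: "tensorrt_plan"
max_batch_size: 8
input [
  {
    name: "input"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

The feature requested here would remove the offline `trtexec` step by letting the backend run the ONNX parser itself at warmup.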
Describe alternatives you've considered
Using the onnxruntime backend instead, but it has known issues: #4587 and microsoft/onnxruntime#11356.