Add support for loading onnx files with the tensorRT backend #4594

Closed
fran6co opened this issue Jul 6, 2022 · 6 comments
fran6co commented Jul 6, 2022

Describe the solution you'd like
Be able to use an ONNX file directly in the TensorRT backend, given that TensorRT has an ONNX parser. It would build the engine on warmup and cache it.
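As a stopgap for what this request describes, the engine can be pre-built offline with TensorRT's `trtexec` tool (which wraps the same ONNX parser) and the resulting plan file served by the TensorRT backend. This is a sketch; the file names and model directory are illustrative, and the plan file is specific to the GPU and TensorRT version it was built with:

```
# Parse model.onnx with TensorRT's ONNX parser and serialize an engine
trtexec --onnx=model.onnx --saveEngine=model.plan

# Place the plan in the Triton model repository for the tensorrt backend
# (directory layout per Triton's model repository conventions)
cp model.plan models/my_model/1/model.plan
```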

Describe alternatives you've considered
Using the onnxruntime backend instead, but it has problems: #4587 and microsoft/onnxruntime#11356

@joihn

joihn commented Jul 6, 2022

@fran6co

fran6co commented Jul 6, 2022

@joihn ORT is short for onnxruntime; what I'm asking for is to use TensorRT with ONNX files but without onnxruntime.

@yaysummeriscoming

+1

@jbkyang-nvi
Contributor

@tanmayv25 thoughts?

@tanmayv25
Contributor

I would not like to complicate the TensorRT backend by having it consume ONNX files and own the conversion. ORT already supports the TRT execution provider.
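A sketch of the alternative described here: Triton's ONNX Runtime backend can be routed through the TensorRT execution provider via the model's `config.pbtxt`, so TensorRT still runs the model without the TensorRT backend needing an ONNX parser. The model name and parameter values below are illustrative; field names follow Triton's ONNX Runtime backend documentation:

```
name: "my_model"
platform: "onnxruntime_onnx"
optimization {
  execution_accelerators {
    gpu_execution_accelerator : [
      {
        name : "tensorrt"
        parameters { key: "precision_mode" value: "FP16" }
        parameters { key: "max_workspace_size_bytes" value: "1073741824" }
      }
    ]
  }
}
```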

@jbkyang-nvi
Contributor

Closing this issue due to lack of activity. If this issue needs follow-up, please let us know and we can reopen it for you.
