Errors when converting ONNX to TRT: 'Assertion failed: ctx->tensors().count(inputName)' #469
Comments
torch==1.5 works for me
Same error, even though my model has only one layer: `class Resize_Test(nn.Module)`. According to this issue: https://github.com/NVIDIA/TensorRT/issues/422, I thought this project was a solution, but I got the same error when using the onnx2trt command. Any ideas?
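For reference, a minimal repro along these lines might look like the sketch below; the `Resize_Test` body, the fixed output size, and the input shape are assumptions, not taken from the original comment:

```python
# Minimal repro sketch, assuming a single bilinear-resize layer like the
# Resize_Test class mentioned above; shapes here are made up.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Resize_Test(nn.Module):
    def forward(self, x):
        # Under opset 11 this exports as an ONNX Resize node.
        return F.interpolate(x, size=(64, 64), mode='bilinear',
                             align_corners=False)

model = Resize_Test().eval()
dummy = torch.randn(1, 3, 32, 32)
torch.onnx.export(model, dummy, "resize_test.onnx", opset_version=11)
```

Running `onnx2trt resize_test.onnx -o resize_test.trt` on the exported file should then reproduce the assertion if the parser version is affected.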
What TRT version are you using? This should be fixed in later onnx-tensorrt releases.
Hi, I met the same error. The TensorRT version I use is 7.2.2.3, and I use the Python onnx-tensorrt bindings from the latest master branch; the installed onnx-tensorrt version is 7.2.2.3.0. I wonder when this issue will be fixed?
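For context, the Python onnx-tensorrt backend referred to above is typically driven roughly as in this sketch; the model path and input shape are placeholders, not from this thread:

```python
# Rough usage of the onnx-tensorrt Python backend; the path and input
# shape below are placeholders.
import numpy as np
import onnx
import onnx_tensorrt.backend as backend

model = onnx.load("/path/to/model.onnx")
engine = backend.prepare(model, device='CUDA:0')
input_data = np.random.random(size=(1, 3, 224, 224)).astype(np.float32)
output_data = engine.run(input_data)[0]
print(output_data.shape)
```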
Can you provide your ONNX model or the scripts to generate your ONNX model?
Closing due to inactivity. If you are still having issues with the latest master branch, feel free to open a new issue.
My env: torch 1.3, onnx 1.7, opset 11.
My ONNX graph: (image not included in this thread export).
In my PyTorch code that exports the ONNX model, my network uses:
y = F.interpolate(y, size=sources[2].size()[2:], mode='bilinear', align_corners=False)
When I convert the ONNX model to TRT, I get this error:
While parsing node number 63 [Resize]: ERROR: ModelImporter.cpp:124 In function parseGraph: [5] Assertion failed: ctx->tensors().count(inputName)
If I use opset 10 instead, the conversion to TRT succeeds, but the TRT inference result differs from the Python inference result (one way to check where the mismatch comes from is sketched below).
It is so sad.
Guys, do you meet this error? Please help me, thanks very much.
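One way to narrow down where the opset-10 mismatch comes from is to compare the PyTorch output with ONNX Runtime on the same input before involving TRT. The helper below is a sketch: it assumes onnxruntime is installed, that the network takes a single 4-D tensor and returns a single tensor, and `model` / `onnx_path` stand in for the actual network and exported file.

```python
# Sketch: check whether the opset-10 export itself changes the result,
# independent of the TensorRT conversion.
import numpy as np
import onnxruntime as ort
import torch

def compare_torch_vs_onnx(model, onnx_path, input_shape=(1, 3, 300, 300)):
    """Return the max absolute difference between PyTorch and ONNX Runtime
    outputs on one random input. Assumes a single input and single output."""
    dummy = torch.randn(*input_shape)
    with torch.no_grad():
        torch_out = model(dummy).numpy()

    sess = ort.InferenceSession(onnx_path)
    input_name = sess.get_inputs()[0].name
    ort_out = sess.run(None, {input_name: dummy.numpy()})[0]

    # A large difference here points at the export, not at TRT.
    return np.abs(torch_out - ort_out).max()
```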