I am running into some issues running inference with onnxruntime.
After training a Mask R-CNN model, I created an ONNX export using mmdeploy.
It is unclear to me whether the preprocessing is baked into the exported model or not.
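One way to check is to inspect what the exported graph itself declares. A minimal sketch (the filename `end2end.onnx` is a guess; it is what mmdeploy usually writes to its work dir):

```python
import onnxruntime as ort

# List the declared inputs/outputs of the exported graph. If the only input
# is a float tensor shaped like [batch, 3, H, W], the resize/normalization
# is not baked in and has to happen before session.run().
sess = ort.InferenceSession("end2end.onnx", providers=["CPUExecutionProvider"])
for inp in sess.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in sess.get_outputs():
    print("output:", out.name, out.shape, out.type)
```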
I have tried preprocessing steps along the lines of the sketch below, but I am getting different results compared to inference with mmdet.apis.inference_detector.
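For reference, the default Mask R-CNN configs in mmdetection use keep-ratio resizing, ImageNet normalization, and padding to a stride multiple. A sketch of that standard pipeline (the 1333x800 scale and the mean/std values come from the common mask_rcnn_r50_fpn config and are assumptions about this particular setup):

```python
import cv2
import numpy as np

def preprocess(img_path, scale=(1333, 800)):
    # Standard mmdetection pipeline: keep-ratio resize, BGR -> RGB,
    # ImageNet normalization, pad to a multiple of 32, NCHW float32.
    img = cv2.imread(img_path)  # BGR, HWC, uint8
    h, w = img.shape[:2]
    factor = min(max(scale) / max(h, w), min(scale) / min(h, w))
    img = cv2.resize(img, (int(w * factor), int(h * factor)))
    img = img[:, :, ::-1].astype(np.float32)  # BGR -> RGB
    mean = np.array([123.675, 116.28, 103.53], dtype=np.float32)
    std = np.array([58.395, 57.12, 57.375], dtype=np.float32)
    img = (img - mean) / std
    # Pad to a multiple of 32 (FPN stride requirement).
    ph = int(np.ceil(img.shape[0] / 32) * 32)
    pw = int(np.ceil(img.shape[1] / 32) * 32)
    padded = np.zeros((ph, pw, 3), dtype=np.float32)
    padded[: img.shape[0], : img.shape[1]] = img
    return padded.transpose(2, 0, 1)[None]  # shape (1, 3, H, W)
```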
What is the expected input to the ONNX model?
Thanks