Deploying with no or fewer dependencies on MMDeploy, MMCV, MMEngine, etc. #2745
Unanswered
RiverLight4 asked this question in Q&A
Replies: 0 comments
Hello,
I'd like to convert an MMDetection-trained model to run on other hardware acceleration platforms. However, I'm afraid that some inference engines cannot use `mmdeploy.dll`, `libmmdeploy_tensorrt_ops.so`, `mmdeploy_runtime`, or even the `mmcv` Python modules when inference is running on that hardware. Here is my question:
Additional Info:
`libmmdeploy_tensorrt_ops.so` or `libmmdeploy_onnx_ops.so` is always needed.
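
For reference, a minimal sketch of what inference can look like when only the custom-ops library is kept: the exported ONNX model is loaded with plain `onnxruntime`, and `libmmdeploy_onnx_ops.so` is registered for the custom operators, so neither `mmdeploy_runtime` nor `mmcv` needs to be installed on the target device. The file names (`end2end.onnx`, the ops library path) and the input shape are assumptions, and the pipeline's pre/post-processing would still have to be reimplemented by hand.

```python
# Sketch (assumptions noted below): run an MMDeploy-exported ONNX model
# with plain onnxruntime. The model is assumed to be end2end.onnx and to
# use MMDeploy custom ops, so libmmdeploy_onnx_ops.so must be registered;
# no mmdeploy_runtime or mmcv import is needed on the target device.
import numpy as np
import onnxruntime as ort

so = ort.SessionOptions()
# Register MMDeploy's custom ONNX ops library (path is an assumption).
so.register_custom_ops_library("libmmdeploy_onnx_ops.so")

session = ort.InferenceSession(
    "end2end.onnx", so, providers=["CPUExecutionProvider"]
)

# Dummy input; the real resize/normalize preprocessing from the mmcv
# pipeline has to be reimplemented on the target platform.
dummy = np.random.rand(1, 3, 800, 1344).astype(np.float32)
input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])
```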