
TRTIS should support TensorRT models that require custom plugins #16

Closed
deadeyegoodwin opened this issue Dec 14, 2018 · 11 comments
Labels
enhancement New feature or request

Comments

@deadeyegoodwin
Contributor

No description provided.

@deadeyegoodwin deadeyegoodwin added the enhancement New feature or request label Dec 15, 2018
@zoidburg

As mentioned in #12, I ran into a problem serving an SSD TRT plan on TRTIS. The SSD sample uses several custom plugins such as NMS, Normalize, and PriorBox, and when serving the engine plan serialized from the SSD sample, TRTIS does not appear to deserialize these parts correctly.

It would be great if the TRT plugin mechanism were fully supported in TRTIS. A complete tutorial using an SSD TRT plan would also be helpful, since detection tasks are quite common these days.

@deadeyegoodwin
Contributor Author

deadeyegoodwin commented Feb 25, 2019

As a workaround for TRTIS not yet having TRT plugin support, you should be able to use LD_PRELOAD in many cases. For example, assuming your TRT plugins are compiled into trtplugins.so:

$ LD_PRELOAD=trtplugins.so trtserver --model-store=/tmp/models ...

You can load multiple plugin libraries with LD_PRELOAD. The limitation of this approach is that the plugins must be managed separately from the model store itself. More seriously, if there are plugin name conflicts across multiple plugin libraries, there is no way to resolve them. But if you have just a single plugin library, this should get you unblocked.
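To make the multiple-library case concrete, here is a sketch; the library names and paths below are hypothetical, not taken from this thread:

```shell
# Hypothetical plugin library paths -- illustrative only.
# LD_PRELOAD accepts a list of libraries separated by colons (or spaces).
PLUGIN_LIBS="/opt/plugins/nms_plugin.so:/opt/plugins/priorbox_plugin.so"

# The server invocation would then look like:
#   LD_PRELOAD="$PLUGIN_LIBS" trtserver --model-store=/tmp/models
```

Note that the name-conflict caveat above applies as soon as more than one library is listed: the dynamic loader resolves symbols in preload order, so a duplicated plugin name silently shadows the later one.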

@FamousDirector

How would you suggest I compile plugins into trtplugins.so?

@deadeyegoodwin
Contributor Author

The TensorRT documentation describes, with examples, how to create TensorRT plugin libraries: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending

@IMG-PRCSNG

IMG-PRCSNG commented Jun 1, 2019

Has anyone tried using libnvinfer_plugin.so in LD_PRELOAD? I am unable to get it to work even though it has all the plugins.

Still stuck at

tensorrt_1  | E0601 15:02:09.585748 34 logging.cc:43] getPluginCreator could not find plugin Normalize_TRT version 1 namespace
tensorrt_1  | E0601 15:02:09.585764 34 logging.cc:43] Cannot deserialize plugin Normalize_TRT

and similar log lines for NMS_TRT and PriorBox_TRT

@DebashisGanguly

I am still facing the same issue:
LD_PRELOAD=libnvinfer_plugin.so /opt/tensorrtserver/bin/trtserver --model-store=/ScriptsNModels/TestModelZoo

E0612 20:28:44.454662 46 logging.cc:43] getPluginCreator could not find plugin Normalize_TRT version 1 namespace

E0612 20:28:44.454670 46 logging.cc:43] Cannot deserialize plugin Normalize_TRT

E0612 20:28:44.454697 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454715 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454736 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454742 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454763 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454777 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454795 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454813 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454831 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454838 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454857 46 logging.cc:43] getPluginCreator could not find plugin PriorBox_TRT version 1 namespace

E0612 20:28:44.454864 46 logging.cc:43] Cannot deserialize plugin PriorBox_TRT

E0612 20:28:44.454900 46 logging.cc:43] getPluginCreator could not find plugin NMS_TRT version 1 namespace

E0612 20:28:44.454907 46 logging.cc:43] Cannot deserialize plugin NMS_TRT

Note that I have a serialized plan file for SSD from running sampleSSD in TensorRT code.

Can anyone also provide a config.pbtxt for SSD?
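For reference, a minimal config.pbtxt sketch for a plan produced by sampleSSD might look like the following. The tensor names and dimensions are illustrative guesses based on the Caffe SSD network (a 3x300x300 "data" input and "detection_out"/"keep_count" outputs) and must be checked against your actual engine's bindings:

```
name: "ssd"
platform: "tensorrt_plan"
max_batch_size: 1
input [
  {
    name: "data"
    data_type: TYPE_FP32
    dims: [ 3, 300, 300 ]
  }
]
output [
  {
    name: "detection_out"
    data_type: TYPE_FP32
    dims: [ 1, 200, 7 ]
  },
  {
    name: "keep_count"
    data_type: TYPE_INT32
    dims: [ 1 ]
  }
]
```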

@clancylian

I implemented IPluginFactory with my plugin and set

ICaffeParser* parser = createCaffeParser();
parser->setPluginFactoryExt(&pluginFactory);

then generated the TensorRT model. When I deploy it to the server, it raises the error:
ERROR: Not a valid serialized plugin

I also implemented the plugin with IPluginV2 and IPluginCreator, generated the lib.so,
and used LD_PRELOAD=trtplugins.so, but the same error occurs!

@deadeyegoodwin
Contributor Author

We're working on validating TRT plugin support in TRTIS. It appears to work in some cases but not all. We'll update here once we learn more.

@CoderHam
Contributor

We will be adding support for the default plugins shipped with TensorRT for now. We may add support for custom plugins in the future if possible.
This means that the plugins mentioned here will be supported by default in future TRTIS releases.

@CoderHam
Contributor

Support for the default plugins, as mentioned here, has been added on master and will be in 19.07. Please retry your model(s) and let me know if it is not fixed for you.

@seovchinnikov

I've successfully used custom ops imported and registered via the ONNX parser, using the LD_PRELOAD workaround.
