feat: Makefile for trtorchrt.so example
Signed-off-by: Naren Dasan <naren@narendasan.com>
Signed-off-by: Naren Dasan <narens@nvidia.com>
1 parent 8581fd9 · commit c60c521
Showing 11 changed files with 99 additions and 61 deletions.
(Two files were deleted in this commit; their contents are not shown.)
New file (Bazel build file, +14 lines):

```
package(default_visibility = ["//visibility:public"])

cc_binary(
    name = "trtorchrt_example",
    srcs = [
        "main.cpp"
    ],
    deps = [
        "//core/runtime:runtime",
        "@libtorch//:libtorch",
        "@libtorch//:caffe2",
        "@tensorrt//:nvinfer",
    ],
)
```
New file (`Makefile`, +14 lines):

```make
CXX=g++
DEP_DIR=$(PWD)/deps
INCLUDE_DIRS=-I$(DEP_DIR)/libtorch/include -I$(DEP_DIR)/trtorch/include
LIB_DIRS=-L$(DEP_DIR)/trtorch/lib -L$(DEP_DIR)/libtorch/lib # -Wl,-rpath $(DEP_DIR)/tensorrt/lib -Wl,-rpath $(DEP_DIR)/cudnn/lib64
LIBS=-Wl,--no-as-needed -ltrtorchrt -Wl,--as-needed -ltorch -ltorch_cuda -ltorch_cpu -ltorch_global_deps -lbackend_with_compiler -lc10 -lc10_cuda
SRCS=main.cpp

TARGET=trtorchrt_example

$(TARGET):
	$(CXX) $(SRCS) $(INCLUDE_DIRS) $(LIB_DIRS) $(LIBS) -o $(TARGET)

clean:
	$(RM) $(TARGET)
```
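To see how the Makefile's variables compose, here is a small Python sketch (a hypothetical helper, not part of the commit) that expands them into the single g++ command line the `$(TARGET)` rule produces:

```python
# Hypothetical helper: mirrors how the Makefile's $(TARGET) rule composes
# INCLUDE_DIRS, LIB_DIRS, and LIBS into one g++ invocation. Not part of
# the commit; paths and names are taken from the Makefile above.
def make_link_command(dep_dir: str,
                      srcs: str = "main.cpp",
                      target: str = "trtorchrt_example") -> str:
    include_dirs = f"-I{dep_dir}/libtorch/include -I{dep_dir}/trtorch/include"
    lib_dirs = f"-L{dep_dir}/trtorch/lib -L{dep_dir}/libtorch/lib"
    # --no-as-needed forces libtrtorchrt to stay linked even though no
    # symbol in main.cpp references it directly: its TensorRT engine ops
    # register themselves with the Torch runtime via static initializers.
    libs = ("-Wl,--no-as-needed -ltrtorchrt -Wl,--as-needed "
            "-ltorch -ltorch_cuda -ltorch_cpu -ltorch_global_deps "
            "-lbackend_with_compiler -lc10 -lc10_cuda")
    return f"g++ {srcs} {include_dirs} {lib_dirs} {libs} -o {target}"

print(make_link_command("./deps"))
```

The `--no-as-needed`/`--as-needed` bracket around `-ltrtorchrt` is the one non-obvious detail: with plain `--as-needed` linking, the linker would drop the runtime library as "unused" and the deserialized TRT engines would have no executor at load time.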
New file (the example's README, +53 lines):

# trtorchrt_example

## Sample application using the TRTorch runtime library and plugin library

This sample demonstrates how to use the TRTorch runtime library `libtrtorchrt.so` along with the plugin library `libtrtorch_plugins.so`.

In this demo, we convert two models, `ConvGelu` and `Norm`, to TensorRT using the TRTorch Python API and perform inference using `samplertapp`. In these models, the `Gelu` and `Norm` layers are expressed as plugins in the network.

### Generating TorchScript modules with TRT engines

The following command generates the `conv_gelu.jit` and `norm.jit` TorchScript modules, which contain TensorRT engines:

```sh
python network.py
```

### `trtorchrt_example`

The main goal is to use the TRTorch runtime library `libtrtorchrt.so`, a lightweight library sufficient to deploy TorchScript programs containing TRT engines.

1) Download releases of LibTorch and TRTorch from https://pytorch.org and the TRTorch GitHub repo, and unpack both in the `deps` directory:

```sh
cd examples/trtorchrt_example/deps
# Download the latest TRTorch release tar file (libtrtorch.tar.gz) from https://github.com/NVIDIA/TRTorch/releases
tar -xvzf libtrtorch.tar.gz
unzip libtorch-cxx11-abi-shared-with-deps-1.9.0+cu111.zip
```

> If cuDNN and TensorRT are not installed on your system / in your `LD_LIBRARY_PATH`, then do the following as well:

```sh
cd deps
mkdir cudnn && tar -xvzf <cuDNN TARBALL> --directory cudnn --strip-components=1
mkdir tensorrt && tar -xvzf <TensorRT TARBALL> --directory tensorrt --strip-components=1
cd ..
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)/deps/trtorch/lib:$(pwd)/deps/libtorch/lib:$(pwd)/deps/tensorrt/lib:$(pwd)/deps/cudnn/lib64:/usr/local/cuda/lib
```

This gives maximum compatibility with system configurations for running this example, but in general you are better off adding `-Wl,-rpath $(DEP_DIR)/tensorrt/lib -Wl,-rpath $(DEP_DIR)/cudnn/lib64` to the link command of an actual application.

2) Build and run `trtorchrt_example`

`trtorchrt_example` is a binary which loads the TorchScript module `conv_gelu.jit` or `norm.jit` and runs the TRT engines on a random input using TRTorch runtime components. Check out `main.cpp` and the `Makefile` for the necessary code and compilation dependencies.

To build and run the app:

```sh
cd examples/trtorchrt_example
make
# If your paths differ from the ones below, change them as necessary
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)/deps/trtorch/lib:$(pwd)/deps/libtorch/lib:$(pwd)/deps/tensorrt/lib:$(pwd)/deps/cudnn/lib64:/usr/local/cuda/lib
./trtorchrt_example $PWD/examples/trtorchrt_example/norm.jit
```
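Step 1's tarball unpacking assumes a specific `deps/` layout that both the Makefile and the `LD_LIBRARY_PATH` exports rely on. A small stdlib-only Python checker (hypothetical, not part of the commit) can confirm the required directories exist before invoking `make`:

```python
# Hypothetical pre-build check: verifies the deps/ layout assumed by the
# Makefile's INCLUDE_DIRS and LIB_DIRS. Not part of the commit; the
# required subdirectories are taken from the paths the Makefile uses.
from pathlib import Path

REQUIRED = [
    "libtorch/include",
    "libtorch/lib",
    "trtorch/include",
    "trtorch/lib",
]

def missing_deps(dep_dir: str) -> list:
    """Return the required subdirectories absent under dep_dir."""
    root = Path(dep_dir)
    return [sub for sub in REQUIRED if not (root / sub).is_dir()]

if __name__ == "__main__":
    missing = missing_deps("deps")
    if missing:
        print("missing:", ", ".join(missing))
    else:
        print("deps layout looks complete")
```

Running this from `examples/trtorchrt_example` before `make` turns a cryptic linker failure into an explicit list of unpacked dependencies that are still missing.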
Remaining changes:
- One empty file.
- Two files renamed without changes.