This repository provides optimized inference code for eipl.
- Jetson Xavier NX, Orin Nano, AGX Orin
- JetPack 5.* (do not use a Developer Preview version)
- See requirements_jetson.txt
Install the Python packages using the following command.
pip install -r requirements_jetson.txt
Install the packages required for TensorRT inference. These packages must be installed with sudo.
sudo apt install python3-libnvinfer python3-libnvinfer-dev ffmpeg libopenblas-base
sudo pip install jetson-stats
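After installing, you can optionally confirm that the TensorRT Python bindings and a CUDA-enabled PyTorch are both visible to the interpreter. This is only a minimal sanity check, not part of the repository:

```python
# Sanity check: verify the TensorRT bindings and PyTorch CUDA support.
import tensorrt
import torch

print("TensorRT:", tensorrt.__version__)
print("PyTorch:", torch.__version__, "CUDA available:", torch.cuda.is_available())
```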
Download sample data and pretrained weights using the following command.
python scripts/downloader.py
Convert the pretrained weights to ONNX format using the following command.
python pytorch/export_onnx.py
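pytorch/export_onnx.py handles the export for the eipl models. As a rough illustration of what such an export involves, here is a minimal, self-contained sketch; the model, checkpoint-free setup, and input shape are hypothetical placeholders:

```python
import torch

# Hypothetical model standing in for a pretrained eipl network;
# the real script loads the downloaded weights instead.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3, padding=1),
    torch.nn.ReLU(),
)
model.eval()

# Dummy input with a placeholder shape (batch, channels, height, width).
dummy_input = torch.randn(1, 3, 64, 64)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=13,
)
```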
If required packages are still missing, install the full JetPack meta-package.
sudo apt install nvidia-jetpack
The following models are supported:
- SARNN
- CNNRNN
- CNNRNNLN
- CAEBN (experimental, no RNN)
  - Does not work with INT8 TensorRT inference; see the engine-build sketch below
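For reference, a TensorRT engine can be built from an exported ONNX file with the TensorRT Python API. The sketch below is a generic FP16 build, not the repository's inference script; the file names are placeholders, and an INT8 build would additionally require a calibrator (which the experimental CAEBN model does not support):

```python
import tensorrt as trt

# Build an FP16 TensorRT engine from an exported ONNX file.
# "model.onnx" and "model.trt" are placeholder paths.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse ONNX model")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # use FP16 on the Jetson GPU

serialized_engine = builder.build_serialized_network(network, config)
with open("model.trt", "wb") as f:
    f.write(serialized_engine)
```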