
❓ [Question] Is SAM2 supported when compiling with the Dynamo backend on JetPack 6.1 or 6.2? #3478


Open
AyanamiReiFan opened this issue Apr 17, 2025 · 3 comments
Labels
question Further information is requested

Comments


AyanamiReiFan commented Apr 17, 2025

❓ Question

Will SAM2 be compatible with the Dynamo backend on JetPack 6.1/6.2?

Are there any workarounds for the TensorRT version mismatch?

What you have already tried

Here are my attempts and the issues I encountered. My device is a Jetson AGX Orin, and I am only compiling the ImageEncoder of SAM2 (Hiera & FPN, with position_encoding removed); the SAM2 code is from https://github.com/chohk88/sam2/tree/torch-trt.
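For reference, the compile path I am using looks roughly like the sketch below (the encoder builder is a hypothetical placeholder for how I construct the model, and the input shape and precision settings are illustrative, not exact code). The attempts and errors follow after it.

```python
import torch
import torch_tensorrt

# Hypothetical helper: builds the SAM2 image encoder (Hiera + FPN, with the
# position encoding removed) from the chohk88/sam2 torch-trt branch.
image_encoder = build_sam2_image_encoder().eval().cuda()

# SAM2 expects 1024x1024 RGB input by default; batch size 1 here.
example_input = torch.randn(1, 3, 1024, 1024).cuda()

# Export the encoder and compile the exported program with the Dynamo backend.
exported_program = torch.export.export(image_encoder, (example_input,))
trt_encoder = torch_tensorrt.dynamo.compile(
    exported_program,
    inputs=[example_input],
    enabled_precisions={torch.float16},
)

with torch.no_grad():
    features = trt_encoder(example_input)
```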

JetPack 6.1 + PyTorch 2.5 (from https://developer.download.nvidia.cn) + Torch-TensorRT 2.5

Tried compiling SAM2 but encountered errors.

Observed that the PyTorch 2.5 documentation does not mention SAM2 support, which likely indicates that SAM2 has not yet been adapted for this version.

JetPack 6.1 + PyTorch 2.6 (from https://pypi.jetson-ai-lab.dev/jp6/cu126) + Torch-TensorRT 2.6

Installed PyTorch 2.6 from jp6/cu126 and Torch-TensorRT 2.6.

Importing torch_tensorrt failed with ModuleNotFoundError: No module named 'tensorrt.plugin'.

Root cause: Torch-TensorRT 2.6 requires TensorRT 10.7, but JetPack 6.1 provides only TensorRT 10.3.

Found no straightforward way to upgrade TensorRT within JetPack 6.1 due to dependency conflicts.
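A quick way to confirm the version mismatch from inside the Python environment, without importing torch_tensorrt (which fails as above), is just a sanity check, not a fix:

```python
# Check the TensorRT version that the JetPack install provides.
import tensorrt

# Prints 10.3.x on JetPack 6.1, while Torch-TensorRT 2.6 expects TensorRT 10.7.
print(tensorrt.__version__)
```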

Cross-Platform Attempt: Compile on x86 + Run on JetPack 6.1

Compiled SAM2 on x86 with Torch-TensorRT 2.6 and exported the model.

Tried running it on JetPack 6.1 with Torch-TensorRT 2.5.

This failed, unsurprisingly, due to the serialization incompatibility between Torch-TensorRT 2.6 and 2.5.
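For completeness, the save/load flow I attempted across the two machines looks roughly like this (a sketch with placeholder paths; trt_encoder and example_input are from the compile sketch above, produced on the x86 machine):

```python
import torch
import torch_tensorrt

# On the x86 machine (Torch-TensorRT 2.6): serialize the compiled encoder as
# an exported program. The file path is a placeholder.
torch_tensorrt.save(trt_encoder, "sam2_image_encoder_trt.ep", inputs=[example_input])

# On the Jetson (Torch-TensorRT 2.5): reload and run. This is where it fails,
# because the artifact serialized by 2.6 is not readable by the 2.5 runtime.
reloaded = torch.export.load("sam2_image_encoder_trt.ep").module()
with torch.no_grad():
    features = reloaded(example_input)
```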

AyanamiReiFan added the question label on Apr 17, 2025
@narendasan
Collaborator

cc @peri044 @chohk88

@peri044
Collaborator

peri044 commented Apr 22, 2025

@AyanamiReiFan I don't know of any workarounds for upgrading TRT 10.3 on JetPack. That being said, you could give the 25.03-py3-igpu container a try. This container has TRT 10.9 and the corresponding Torch-TRT version. This might work, although I haven't tested it yet. In the future, JetPack 7 will have TRT 10.6+, which could also fix this issue.

@narendasan
Collaborator

The iGPU container should also have a much more recent version of Torch-TRT
