PyTorch execute failure: isTensor(); Expected Tensor but got GenericList #3348
Comments
While libtorch does support passing lists of tensors, Triton Server does not. You can build a simple wrapper model that converts a single tensor (or multiple tensors) into a list of tensors and passes it on to your model. Once you have this wrapper model, trace it and you should be able to use it inside Triton. Refer to #2593 and #2373 (comment)
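A minimal sketch of the suggested workaround, assuming the underlying model's `forward` expects a list of tensors. The class names here (`ListInputModel`, `TensorToListWrapper`) are hypothetical stand-ins, not part of YOLOv5 or Triton:

```python
import torch


class ListInputModel(torch.nn.Module):
    """Toy stand-in for a model whose forward takes a list of tensors
    (the situation that makes Triton fail with 'Expected Tensor but
    got GenericList')."""

    def forward(self, xs):
        # trivial computation on the first tensor in the list
        return xs[0] * 2


class TensorToListWrapper(torch.nn.Module):
    """Wrapper that accepts a plain tensor (which Triton can pass)
    and converts it into the list input the inner model expects."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x: torch.Tensor):
        # wrap the single tensor in a list before delegating
        return self.model([x])


# Trace the wrapper, not the original model; the traced module now has
# a plain-tensor signature that Triton can serve.
wrapped = TensorToListWrapper(ListInputModel())
traced = torch.jit.trace(wrapped, torch.randn(1, 3, 640, 640))
```

The traced module can then be saved with `traced.save(...)` and placed in the Triton model repository in place of the original trace.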
Description
I've built a custom YOLOv5 model and traced it with an export script from Ultralytics. It works when tested with their inference script and with my own native Python code. Now I am attempting to deploy it with Triton.
Triton did not throw any error during deployment; the error only appears when I make a client call.
A similar error can be found in #2594; not sure why that issue was closed? Perhaps @CoderHam can have a look at this one? I'm passing a batched NumPy array, not a tensor.
Triton Information
Triton Docker image version 21.03
Config file
Python Client