[BUG] tritonserver not found in Merlin-inference 22.03 image #373
Comments
Hi @IamGianluca, looking into this right now.

Found the issue. Working on a fix.

Thank you for looking into that @albert17 🙏

Created a PR: NVIDIA-Merlin/Merlin#135

@IamGianluca you can pull

Good point @rnyak. I should probably use

Hi @IamGianluca. Our inference container uses Triton Inference Server, which can serve a PyTorch model without requiring the library. We're going to release unified training and inference containers in the next release, which should make this unnecessary.
Hi,

It seems that `tritonserver` is not installed in the latest stable merlin-inference image (22.03).
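For anyone wanting to reproduce or verify the report, a quick check is to look for the `tritonserver` binary inside the container, overriding the entrypoint so the container only runs the check. This is a sketch: the `nvcr.io/nvidia/merlin/merlin-inference:22.03` image path assumes the standard NGC registry location for this release, and the check assumes the binary would be on `PATH` if installed.

```shell
# Pull the inference image referenced in this issue (NGC path assumed)
docker pull nvcr.io/nvidia/merlin/merlin-inference:22.03

# Check whether the tritonserver binary is on PATH inside the container,
# bypassing the image's default entrypoint
docker run --rm --entrypoint bash nvcr.io/nvidia/merlin/merlin-inference:22.03 \
  -c 'command -v tritonserver || echo "tritonserver not found"'
```

If the binary exists, `command -v` prints its path; otherwise the fallback message confirms the bug described above.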