[BUG and QUESTION] Merlin inference container 22.03 from NGC not working #142
Comments
@leiterenato Thanks for reporting the issue; we are currently looking into it. @albert17 FYI.
Hi @leiterenato, this issue has been identified and is being worked on in #135. We will provide an updated nightly container today or tomorrow with the latest changes included. Sorry for the problems.
Thank you!
@albert17, will the linked PR release the inference containers with tritonclient?
Nightly container updated. Please try it, @leiterenato.
Thanks @albert17. |
@albert17 Is there a way to export a variable in my Dockerfile to solve this? |
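While waiting for the fixed image, one workaround sketch (not confirmed anywhere in this thread): if the `tritonserver` binary exists in the image but is simply not on `PATH`, a derived Dockerfile could prepend Triton's conventional install directory. The directory below is an assumption, not verified against the 22.03 image.

```dockerfile
# Hypothetical workaround: extend the Merlin inference image and
# prepend Triton's conventional install directory to PATH.
# /opt/tritonserver/bin is an assumption about where the binary lives.
FROM nvcr.io/nvidia/merlin/merlin-inference:22.03
ENV PATH=/opt/tritonserver/bin:${PATH}
```

If the binary is genuinely absent from the image (as the bug report suggests), no environment variable will help and the updated container is the real fix.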
Hi @albert17,
Could you please verify? Thank you |
@leiterenato I have updated the Dockerfile and the containers. Please try again with the updated container; this should not happen anymore.
BUG

There is no `tritonserver` installed in this container (merlin-inference:22.03): https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-inference. To reproduce this error, start a new merlin-inference container (22.03) and try to invoke `tritonserver`. How should I invoke `tritonserver` with this new container?

QUESTION
I am trying to load an ensemble model trained with HugeCTR to triton.
I am using the following script to start the server with the ensemble:
Am I starting the container correctly?
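For reference, a minimal sketch of starting Triton in the Merlin container with a HugeCTR ensemble. The mount path, model repository layout, `ps.json` parameter-server config, and ensemble model name are all assumptions for illustration; the standard Triton ports and flags are used:

```shell
# Hedged sketch: launch tritonserver inside the Merlin inference container,
# pointing it at a mounted model repository that contains the HugeCTR
# ensemble. Paths and the model name are assumptions.
docker run --gpus=all --rm -it \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/merlin/merlin-inference:22.03 \
  tritonserver --model-repository=/models \
    --backend-config=hugectr,ps=/models/ps.json \
    --model-control-mode=explicit --load-model=ensemble_model
```

This is a deployment command, not a verified recipe; the `--backend-config=hugectr,ps=...` argument in particular follows the pattern used in Merlin HugeCTR inference examples and should be checked against the documentation for your container version.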
I really appreciate any help.