Add GPU Support for k8s-model-server on Kubeflow #194
Cool! It's very helpful for speeding up inference and prediction work.

It's @flx42 :)

As part of this issue we should create an Argo workflow to build and test this Docker image, as in #214.
jlewi pushed a commit that referenced this issue on Feb 16, 2018:
…210) Related to #194

* Automated building and E2E testing will be added in a follow-on PR.

How to test: train an MNIST model with https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_saved_model.py, then mount the saved model into the TensorFlow model server container and launch the gRPC server with the MNIST model. Finally, launch a gRPC client with https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_client.py.
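The manual test flow described in that commit message can be sketched roughly as follows. This is a hedged sketch, not the commit's actual test script: the image tag `tensorflow/serving:latest` and the local path `/tmp/mnist_model` are assumptions, and the two Python scripts are the TF Serving examples linked above.

```shell
# Sketch of the manual test flow; image tag and paths are assumptions.
set -euo pipefail

# 1. Train and export an MNIST SavedModel using the TF Serving example script.
python mnist_saved_model.py /tmp/mnist_model

# 2. Mount the SavedModel into a model-server container and serve it over gRPC
#    (port 8500 is TF Serving's default gRPC port).
docker run -d --rm -p 8500:8500 \
  -v /tmp/mnist_model:/models/mnist \
  tensorflow/serving:latest \
  --model_name=mnist --model_base_path=/models/mnist

# 3. Send test predictions with the example gRPC client.
python mnist_client.py --num_tests=100 --server=localhost:8500
```

The same three steps are what a follow-on Argo workflow would automate end to end.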
yanniszark pushed a commit to arrikto/kubeflow that referenced this issue on Feb 15, 2021:
* add -o xtrace
* use client-cert instead of password
* delete get-credentials
* delete unnecessary line

Signed-off-by: YujiOshima <yuji.oshima0x3fd@gmail.com>
As a feature request: adding GPU support for TF Serving in Kubeflow would help speed up batch predictions that can take a long time on a CPU-only machine. Unfortunately, existing Docker images, such as the one from the TF Serving repository, do not build correctly (or are not well maintained): https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel-gpu

It would be good to have Kubeflow maintain a working GPU Docker image that can be deployed with the k8s-model-server.
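To make the request concrete, here is a minimal sketch of how such a GPU image could be deployed on Kubernetes. Everything here is an assumption for illustration: the image tag `model-server:gpu` does not exist as a Kubeflow artifact, the deployment name is hypothetical, and the cluster is assumed to run the NVIDIA device plugin so pods can request `nvidia.com/gpu` resources.

```shell
# Hypothetical deployment sketch; image tag and names are assumptions.
# Requires a cluster with GPU nodes and the NVIDIA device plugin installed.
cat <<'EOF' | kubectl apply -f -
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mnist-gpu
spec:
  replicas: 1
  selector:
    matchLabels: {app: mnist-gpu}
  template:
    metadata:
      labels: {app: mnist-gpu}
    spec:
      containers:
      - name: model-server
        image: model-server:gpu            # hypothetical GPU-enabled image
        args: ["--model_name=mnist", "--model_base_path=/models/mnist"]
        resources:
          limits:
            nvidia.com/gpu: 1              # schedules the pod onto a GPU node
EOF
```

The `nvidia.com/gpu` resource limit is the standard Kubernetes mechanism for GPU scheduling; a Kubeflow-maintained image would slot into a manifest like this.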