
GRPC API for javascript models with Nodejs s2i wrapper #224

Merged: 44 commits, Oct 1, 2018
Commits
6382455
GRPC implementation
Sep 17, 2018
9e8b655
updating the environment file for grpc
Sep 17, 2018
db42271
initial proxy commits
ukclivecox Sep 18, 2018
89bd230
update examples with GRPC API
Sep 19, 2018
a672cb3
initial tensorrt integration and wrapper for python
ukclivecox Sep 20, 2018
8d2977f
nvidia server example initial commits
ukclivecox Sep 22, 2018
eeeb829
Merge branch 'master' into proxies
ukclivecox Sep 22, 2018
4a69252
first running version nvidia mnist example
ukclivecox Sep 22, 2018
3540f5f
ensure single context for nvidia inference connection
ukclivecox Sep 22, 2018
b6cea76
update graph visualizer
ukclivecox Sep 22, 2018
14b713c
initial tfserving commit
ukclivecox Sep 22, 2018
4cac7a5
initial tfserving mnist example with helm notebook
ukclivecox Sep 23, 2018
f9039de
updates to tfserving to fix return shape
ukclivecox Sep 23, 2018
0eaae1a
Update kubectl_demo_minikube_rbac.ipynb
benoitbayol Sep 24, 2018
24b23ee
Merge pull request #230 from benoitbayol/patch-1
ukclivecox Sep 24, 2018
2d4c786
Update epsilon-greedy example to Python 3
jklaise Sep 24, 2018
eddeabf
Update kubectl_demo_minikube_rbac.ipynb
benoitbayol Sep 24, 2018
0d589a9
Merge pull request #232 from benoitbayol/patch-2
ukclivecox Sep 24, 2018
6d90eed
update nvidia example to use helm chart
ukclivecox Sep 24, 2018
76a603f
update notebook examples for tfserving and nvidia
ukclivecox Sep 24, 2018
e6eb4a5
fix for missing image-pull-policy in ksonnet
gsunner Sep 25, 2018
6d06eb1
fix for missing image-pull-policy in core.libsonnet
gsunner Sep 25, 2018
03e3581
Merge pull request #235 from gsunner/image-pull-policy-ksonnet-fix
gsunner Sep 25, 2018
2f7d724
Update docs on proxies
ukclivecox Sep 25, 2018
0ab7c1c
re-creating model server for new format
Sep 25, 2018
4123b9d
Adding transformer as a template
Sep 25, 2018
0558394
Adding SendFeedback API for models
Sep 25, 2018
b99f2bf
add missing files
ukclivecox Sep 26, 2018
df02555
Merge pull request #231 from jklaise/python3-routers
ukclivecox Sep 26, 2018
3ebb9f9
fix typos in READMEs
ukclivecox Sep 26, 2018
43ee776
Merge pull request #234 from cliveseldon/proxies
ukclivecox Sep 26, 2018
7e72abb
Don't change CR when defaulting just update status. Also some small f…
ukclivecox Sep 26, 2018
6eb9309
Merge pull request #238 from cliveseldon/defaulting
ukclivecox Sep 26, 2018
b70d984
GRPC implementation
Sep 17, 2018
3819585
updating the environment file for grpc
Sep 17, 2018
3a5b3e0
update examples with GRPC API
Sep 19, 2018
25f0ebb
re-creating model server for new format
Sep 25, 2018
651294a
Adding transformer as a template
Sep 25, 2018
dff29d5
Adding SendFeedback API for models
Sep 25, 2018
5421480
updated notebooks to use new wrapper
Sep 26, 2018
a392f8d
Merge branch 'issue-216' of https://github.com/SachinVarghese/seldon-…
Sep 26, 2018
e393b8d
update docs for grpc implementation
Sep 27, 2018
dfe6d66
Adding SIGTERM responses for the gRPC server
Sep 29, 2018
0ad9eaf
Adding SIGTERM responses for the REST server
Sep 29, 2018
@@ -269,6 +269,7 @@ public void createOrReplaceSeldonDeployment(SeldonDeployment mlDep) {
if (existing == null || !existing.getSpec().equals(mlDep.getSpec()))
{
logger.debug("Running updates for "+mlDep.getMetadata().getName());
SeldonDeployment mlDepStatusUpdated = operator.updateStatus(mlDep);
SeldonDeployment mlDep2 = operator.defaulting(mlDep);
operator.validate(mlDep2);
mlCache.put(mlDep2);
@@ -281,13 +282,13 @@ public void createOrReplaceSeldonDeployment(SeldonDeployment mlDep) {
//removeServices(client,namespace, mlDep2, resources.services); //Proto Client not presently working for deletion
ApiClient client2 = clientProvider.getClient();
removeServices(client2,namespace, mlDep2, resources.services);
if (!mlDep.getSpec().equals(mlDep2.getSpec()))
if (!mlDep.getSpec().equals(mlDepStatusUpdated.getSpec()))
{
logger.debug("Pushing updated SeldonDeployment "+mlDep2.getMetadata().getName()+" back to kubectl");
crdHandler.updateSeldonDeployment(mlDep2);
logger.debug("Pushing updated SeldonDeployment "+mlDepStatusUpdated.getMetadata().getName()+" back to kubectl");
crdHandler.updateSeldonDeployment(mlDepStatusUpdated);
}
else
logger.debug("Not pushing an update as no change to spec for SeldonDeployment "+mlDep2.getMetadata().getName());
logger.debug("Not pushing an update as no change to status for SeldonDeployment "+mlDep2.getMetadata().getName());
}
else
{
@@ -21,6 +21,7 @@

public interface SeldonDeploymentOperator {

public SeldonDeployment updateStatus(SeldonDeployment mlDep);
public SeldonDeployment defaulting(SeldonDeployment mlDep);
public void validate(SeldonDeployment mlDep) throws SeldonDeploymentException;
public DeploymentResources createResources(SeldonDeployment mlDep) throws SeldonDeploymentException;
@@ -356,7 +356,19 @@ public String getSeldonServiceName(SeldonDeployment dep,PredictorSpec pred,String
else
return svcName;
}


@Override
public SeldonDeployment updateStatus(SeldonDeployment mlDep) {
SeldonDeployment.Builder mlBuilder = SeldonDeployment.newBuilder(mlDep);

if (!mlBuilder.hasStatus())
{
mlBuilder.getStatusBuilder().setState(Constants.STATE_CREATING);
}

return mlBuilder.build();
}


@Override
public SeldonDeployment defaulting(SeldonDeployment mlDep) {
@@ -378,32 +390,34 @@ public SeldonDeployment defaulting(SeldonDeployment mlDep) {
for(int cIdx = 0;cIdx < spec.getSpec().getContainersCount();cIdx++)
{
V1.Container c = spec.getSpec().getContainers(cIdx);
String containerServiceKey = getPredictorServiceNameKey(c.getName());
String containerServiceValue = getSeldonServiceName(mlDep, p, c.getName());
metaBuilder.putLabels(containerServiceKey, containerServiceValue);

int portNum;
if (servicePortMap.containsKey(c.getName()))
portNum = servicePortMap.get(c.getName());
else
// Only update graph and container if container is referenced in the inference graph
V1.Container c2;
if(isContainerInGraph(p.getGraph(), c))
{
portNum = currentServicePortNum;
servicePortMap.put(c.getName(), portNum);
currentServicePortNum++;
String containerServiceKey = getPredictorServiceNameKey(c.getName());
String containerServiceValue = getSeldonServiceName(mlDep, p, c.getName());
metaBuilder.putLabels(containerServiceKey, containerServiceValue);

int portNum;
if (servicePortMap.containsKey(c.getName()))
portNum = servicePortMap.get(c.getName());
else
{
portNum = currentServicePortNum;
servicePortMap.put(c.getName(), portNum);
currentServicePortNum++;
}
c2 = this.updateContainer(c, findPredictiveUnitForContainer(mlDep.getSpec().getPredictors(pbIdx).getGraph(),c.getName()),portNum,deploymentName,predictorName);
updatePredictiveUnitBuilderByName(mlBuilder.getSpecBuilder().getPredictorsBuilder(pbIdx).getGraphBuilder(),c2,containerServiceValue);
}
V1.Container c2 = this.updateContainer(c, findPredictiveUnitForContainer(mlDep.getSpec().getPredictors(pbIdx).getGraph(),c.getName()),portNum,deploymentName,predictorName);
else
c2 = c;
mlBuilder.getSpecBuilder().getPredictorsBuilder(pbIdx).getComponentSpecsBuilder(ptsIdx).getSpecBuilder().addContainers(cIdx, c2);
updatePredictiveUnitBuilderByName(mlBuilder.getSpecBuilder().getPredictorsBuilder(pbIdx).getGraphBuilder(),c2,containerServiceValue);
}
mlBuilder.getSpecBuilder().getPredictorsBuilder(pbIdx).getComponentSpecsBuilder(ptsIdx).setMetadata(metaBuilder);
}
}

if (!mlBuilder.hasStatus())
{
mlBuilder.getStatusBuilder().setState(Constants.STATE_CREATING);
}

return mlBuilder.build();
}

@@ -504,6 +518,26 @@ private String getAmbassadorAnnotation(SeldonDeployment mlDep,String serviceName
return restMapping + grpcMapping;
}

/**
*
* @param pu - A predictiveUnit
* @param container - a container
* @return True if container name can be found in graph of pu
*/
private boolean isContainerInGraph(PredictiveUnit pu,V1.Container container)
{
if (pu.getName().equals(container.getName()))
{
return true;
}
else
{
for(int i=0;i<pu.getChildrenCount();i++)
if (isContainerInGraph(pu.getChildren(i),container))
return true;
}
return false;
}

private void addServicePort(PredictiveUnit pu,String serviceName,ServiceSpec.Builder svcSpecBuilder)
{
@@ -651,7 +685,8 @@ public DeploymentResources createResources(SeldonDeployment mlDep) throws SeldonDeploymentException
final String containerServiceKey = getPredictorServiceNameKey(c.getName());
final String containerServiceValue = getSeldonServiceName(mlDep, p, c.getName());

if (!createdServices.contains(containerServiceValue))
// Only add a Service if container is a Seldon component in graph and we haven't already created a service for this container name
if (isContainerInGraph(p.getGraph(), c) && !createdServices.contains(containerServiceValue))
{
//Add service
Service.Builder s = Service.newBuilder()
@@ -744,5 +779,6 @@ public DeploymentResources(List<Deployment> deployments, List<Service> services)
}




}
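The new `isContainerInGraph` helper above walks the inference graph recursively, matching a container's name against each predictive unit and its children, so that only containers actually referenced in the graph get a Service and port assigned. A rough Python sketch of the same traversal (hypothetical dict-based nodes for illustration, not the actual protobuf `PredictiveUnit` type):

```python
def is_container_in_graph(pu, container_name):
    """Return True if container_name names this predictive unit or any descendant."""
    if pu["name"] == container_name:
        return True
    return any(is_container_in_graph(child, container_name)
               for child in pu.get("children", []))

# Example graph: a classifier with a feature transformer as its child
graph = {"name": "classifier",
         "children": [{"name": "transformer", "children": []}]}

print(is_container_in_graph(graph, "transformer"))    # True
print(is_container_in_graph(graph, "sidecar-logger")) # False
```

A container such as a logging sidecar that is listed in the pod spec but absent from the graph would return `False`, which is exactly the case the PR uses to skip Service creation.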
3 changes: 2 additions & 1 deletion docs/wrappers/nodejs.md
@@ -136,13 +136,14 @@
The name of the JS file containing the model.

### API_TYPE

API type to create. Can be REST only at present.
API type to create. Can be REST or GRPC.

### SERVICE_TYPE

The service type being created. Available options are:

- MODEL
- TRANSFORMER

### PERSISTENCE

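For reference, a minimal `.s2i/environment` configured for the new gRPC API might look like the following (values mirror the example files in this PR; `MyModel.js` is a placeholder model filename):

```
MODEL_NAME=MyModel.js
API_TYPE=GRPC
SERVICE_TYPE=MODEL
PERSISTENCE=0
```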
4 changes: 4 additions & 0 deletions examples/models/nodejs_mnist/.s2i/environment_grpc
@@ -0,0 +1,4 @@
MODEL_NAME=MnistClassifier.js
API_TYPE=GRPC
SERVICE_TYPE=MODEL
PERSISTENCE=0
63 changes: 61 additions & 2 deletions examples/models/nodejs_mnist/nodejs_mnist.ipynb
@@ -51,7 +51,7 @@
"metadata": {},
"outputs": [],
"source": [
"!s2i build . seldonio/seldon-core-s2i-nodejs:0.1 node-s2i-mnist-model:0.1"
"!s2i build . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-mnist-model:0.1"
]
},
{
@@ -97,6 +97,65 @@
"!docker rm nodejs_mnist_predictor --force"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prediction using GRPC API on the docker container"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!s2i build -E ./.s2i/environment_grpc . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-mnist-model:0.2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!docker run --name \"nodejs_mnist_predictor\" -d --rm -p 5000:5000 node-s2i-mnist-model:0.2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!cd ../../../wrappers/testing && make build_protos"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Send some random features that conform to the contract"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!python ../../../wrappers/testing/tester.py contract.json 0.0.0.0 5000 -p -t --grpc"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!docker rm nodejs_mnist_predictor --force"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -147,7 +206,7 @@
"metadata": {},
"outputs": [],
"source": [
"!eval $(minikube docker-env) && s2i build . seldonio/seldon-core-s2i-nodejs:0.1 node-s2i-mnist-model:0.1"
"!eval $(minikube docker-env) && s2i build . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-mnist-model:0.1"
]
},
{
4 changes: 4 additions & 0 deletions examples/models/nodejs_tensorflow/.s2i/environment_grpc
@@ -0,0 +1,4 @@
MODEL_NAME=MyModel.js
API_TYPE=GRPC
SERVICE_TYPE=MODEL
PERSISTENCE=0
70 changes: 68 additions & 2 deletions examples/models/nodejs_tensorflow/nodejs_tensorflow.ipynb
@@ -50,7 +50,7 @@
"metadata": {},
"outputs": [],
"source": [
"!s2i build . seldonio/seldon-core-s2i-nodejs:0.1 node-s2i-model-image:0.1"
"!s2i build . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-model-image:0.1"
]
},
{
@@ -96,6 +96,72 @@
"!docker rm nodejs_tensorflow_predictor --force"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prediction using GRPC API on the docker container"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!s2i build -E ./.s2i/environment_grpc . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-model-image:0.2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!docker run --name \"nodejs_tensorflow_predictor\" -d --rm -p 5000:5000 node-s2i-model-image:0.2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!cd ../../../wrappers/testing && make build_protos"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Send some random features that conform to the contract"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!python ../../../wrappers/testing/tester.py contract.json 0.0.0.0 5000 -p -t --grpc"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!docker rm nodejs_tensorflow_predictor --force"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prediction using Minikube"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -139,7 +205,7 @@
"metadata": {},
"outputs": [],
"source": [
"!eval $(minikube docker-env) && s2i build . seldonio/seldon-core-s2i-nodejs:0.1 node-s2i-model-image:0.1"
"!eval $(minikube docker-env) && s2i build . seldonio/seldon-core-s2i-nodejs:0.2-SNAPSHOT node-s2i-model-image:0.1"
]
},
{
Expand Down
4 changes: 4 additions & 0 deletions examples/models/nvidia-mnist/.s2i/environment
@@ -0,0 +1,4 @@
MODEL_NAME=MnistTransformer
API_TYPE=REST
SERVICE_TYPE=TRANSFORMER
PERSISTENCE=0
14 changes: 14 additions & 0 deletions examples/models/nvidia-mnist/Makefile
@@ -0,0 +1,14 @@
TRANSFORMER_IMAGE=seldonio/mnist-caffe2-transformer:0.1

clean:
	rm -f tensorrt_mnist/1/model.plan
rm -rf MNIST_data
rm -f mnist.json
rm -f tmp.json

build_transformer:
s2i build . seldonio/seldon-core-s2i-python3:0.2 ${TRANSFORMER_IMAGE}

push_transformer:
docker push ${TRANSFORMER_IMAGE}
