diff --git a/docs/wrappers/python.md b/docs/wrappers/python.md
index 51e79d3355..8ebc021f49 100644
--- a/docs/wrappers/python.md
+++ b/docs/wrappers/python.md
@@ -14,6 +14,13 @@ If you are not familar with s2i you can read [general instructions on using s2i]
  * Docker
  * Git (if building from a remote git repo)
 
+To check everything is working you can run
+
+```bash
+s2i usage seldonio/seldon-core-s2i-python3
+```
+
+
 # Step 2 - Create your source code
 
 To use our s2i builder image to package your python model you will need:
diff --git a/docs/wrappers/r.md b/docs/wrappers/r.md
new file mode 100644
index 0000000000..c4db0f4c4a
--- /dev/null
+++ b/docs/wrappers/r.md
@@ -0,0 +1,163 @@
+# Packaging an R model for Seldon Core using s2i
+
+
+In this guide, we illustrate the steps needed to wrap your own R model in a Docker image ready for deployment with Seldon Core using the [source-to-image app s2i](https://github.com/openshift/source-to-image).
+
+If you are not familiar with s2i you can read [general instructions on using s2i](./s2i.md) and then follow the steps below.
+
+
+# Step 1 - Install s2i
+
+ [Download and install s2i](https://github.com/openshift/source-to-image#installation)
+
+ * Prerequisites for using s2i are:
+   * Docker
+   * Git (if building from a remote git repo)
+
+To check everything is working you can run
+
+```bash
+s2i usage seldonio/seldon-core-s2i-r
+```
+
+# Step 2 - Create your source code
+
+To use our s2i builder image to package your R model you will need:
+
+ * An R file which provides an S3 class for your model via an ```initialise_seldon``` function and that has appropriate generics for your component, e.g. predict for a model.
+ * An optional install.R that will be run to install any libraries needed
+ * .s2i/environment - model definitions used by the s2i builder to correctly wrap your model
+
+We will go into detail for each of these steps:
+
+## R Runtime Model file
+Your source code should contain an R file which defines an S3 class for your model. For example, looking at our skeleton R model file at ```wrappers/s2i/R/test/model-template-app/MyModel.R```:
+
+```R
+library(methods)
+
+predict.mymodel <- function(mymodel,newdata=list()) {
+  write("MyModel predict called", stdout())
+  newdata
+}
+
+
+new_mymodel <- function() {
+  structure(list(), class = "mymodel")
+}
+
+
+initialise_seldon <- function(params) {
+  new_mymodel()
+}
+```
+
+ * An ```initialise_seldon``` function creates an S3 class for your model via a constructor ```new_mymodel```. This will be called on startup and you can use this to load any parameters your model needs.
+ * A generic ```predict``` function is created for your model class. This will be called with a ```newdata``` field containing the ```data.frame``` to be predicted.
+
+There are similar templates for ROUTERS and TRANSFORMERS.
+
+
+## install.R
+Populate an ```install.R``` with any software dependencies your code requires. For example:
+
+```R
+install.packages('rpart')
+```
+
+## .s2i/environment
+
+Define the core parameters needed by our R builder image to wrap your model. An example is:
+
+```bash
+MODEL_NAME=MyModel.R
+API_TYPE=REST
+SERVICE_TYPE=MODEL
+PERSISTENCE=0
+```
+
+These values can also be provided or overridden on the command line when building the image.
+
+# Step 3 - Build your image
+Use ```s2i build``` to create your Docker image from source code. You will need Docker installed on the machine and optionally git if your source code is in a public git repo.
+
+Using s2i you can build directly from a git repo or from a local source folder.
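+For example, to build from the current directory into an image called ```my-r-model:0.1``` (both the image tag and the MODEL_NAME value below are only illustrative), you can run the following; any of the ```.s2i/environment``` values can be overridden at build time with s2i's ```-e``` flag:
+
+```bash
+s2i build . seldonio/seldon-core-s2i-r my-r-model:0.1 -e MODEL_NAME=MyModel.R
+```
+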
See the [s2i docs](https://github.com/openshift/source-to-image/blob/master/docs/cli.md#s2i-build) for further details. The general format is: + +```bash +s2i build seldonio/seldon-core-s2i-r +s2i build seldonio/seldon-core-s2i-r +``` + +An example invocation using the test template model inside seldon-core: + +```bash +s2i build https://github.com/seldonio/seldon-core.git --context-dir=wrappers/s2i/R/test/model-template-app seldonio/seldon-core-s2i-r seldon-core-template-model +``` + +The above s2i build invocation: + + * uses the GitHub repo: https://github.com/seldonio/seldon-core.git and the directory ```wrappers/s2i/R/test/model-template-app``` inside that repo. + * uses the builder image ```seldonio/seldon-core-s2i-r``` + * creates a docker image ```seldon-core-template-model``` + + +For building from a local source folder, an example where we clone the seldon-core repo: + +```bash +git clone https://github.com/seldonio/seldon-core.git +cd seldon-core +s2i build wrappers/s2i/R/test/model-template-app seldonio/seldon-core-s2i-r seldon-core-template-model +``` + +For more help see: + +``` +s2i usage seldonio/seldon-core-s2i-r +s2i build --help +``` + +# Reference + +## Environment Variables +The required environment variables understood by the builder image are explained below. You can provide them in the ```.s2i/enviroment``` file or on the ```s2i build``` command line. + + +### MODEL_NAME +The name of the R file containing the model. + +### API_TYPE + +API type to create. Can be REST only at present. + +### SERVICE_TYPE + +The service type being created. Available options are: + + * MODEL + * ROUTER + * TRANSFORMER + +### PERSISTENCE + +Can only by 0 at present. In future, will allow the state of the component to be saved periodically. + + +## Creating different service types + +### MODEL + + * [A minimal skeleton for model source code](https://github.com/cliveseldon/seldon-core/tree/s2i/wrappers/s2i/R/test/model-template-app) + * [Example models](https://github.com/SeldonIO/seldon-core/tree/master/examples/models) + +### ROUTER + + * [A minimal skeleton for router source code](https://github.com/cliveseldon/seldon-core/tree/s2i/wrappers/s2i/R/test/router-template-app) + +### TRANSFORMER + + * [A minimal skeleton for transformer source code](https://github.com/cliveseldon/seldon-core/tree/s2i/wrappers/s2i/R/test/transformer-template-app) + + + + + diff --git a/docs/wrappers/readme.md b/docs/wrappers/readme.md index 089790242f..73c4104947 100644 --- a/docs/wrappers/readme.md +++ b/docs/wrappers/readme.md @@ -16,14 +16,12 @@ You can use either: * [Source-to-image (s2i) tool](./python.md) * [Seldon Docker wrapper application](./python-docker.md) -## H2O - * [H2O models](./h2o.md) +## R -## Future + * [R models can be wrapped using source-to-image](r.md) -Future languages: +## H2O + + * [H2O models](./h2o.md) - * R based models - * Java based models - * Go based models diff --git a/docs/wrappers/s2i.md b/docs/wrappers/s2i.md index 1035b2eefd..4fc91876b9 100644 --- a/docs/wrappers/s2i.md +++ b/docs/wrappers/s2i.md @@ -17,8 +17,4 @@ The general work flow is: At present we have s2i builder images for * [python (python2 or python3)](./python.md) : use this for Tensorflow, Keras, pyTorch or sklearn models. 
-
-We plan on also supporting other base languages such as:
-
- * R
- * Java
+ * [R](r.md)
diff --git a/examples/models/r_iris/.s2i/environment b/examples/models/r_iris/.s2i/environment
new file mode 100644
index 0000000000..1f25424d40
--- /dev/null
+++ b/examples/models/r_iris/.s2i/environment
@@ -0,0 +1,4 @@
+MODEL_NAME=iris.R
+API_TYPE=REST
+SERVICE_TYPE=MODEL
+PERSISTENCE=0
diff --git a/examples/models/r_iris/README.md b/examples/models/r_iris/README.md
new file mode 100644
index 0000000000..4ac358bfaa
--- /dev/null
+++ b/examples/models/r_iris/README.md
@@ -0,0 +1,83 @@
+# R Iris Model
+An R Iris model.
+
+## Dependencies
+
+R
+
+## Train locally
+
+```bash
+Rscript train.R
+```
+
+## Wrap using [s2i](https://github.com/openshift/source-to-image#installation).
+
+```bash
+s2i build . seldonio/seldon-core-s2i-r r-iris:0.1
+```
+
+## Local Docker Smoke Test
+
+Run under Docker.
+
+```bash
+docker run --rm -p 5000:5000 r-iris:0.1
+```
+
+Ensure the test gRPC modules are compiled.
+
+```bash
+pushd ../../../wrappers/testing ; make build_protos ; popd
+```
+
+Send a data request using the wrapper tester.
+
+```bash
+python ../../../wrappers/testing/tester.py contract.json 0.0.0.0 5000 -p
+```
+
+## Minikube test
+
+```bash
+minikube start --memory 4096
+```
+
+[Install seldon core](/readme.md#install)
+
+Connect to the Minikube Docker daemon.
+
+```bash
+eval $(minikube docker-env)
+```
+
+Build the image using the Minikube Docker daemon.
+
+```bash
+s2i build . seldonio/seldon-core-s2i-r r-iris:0.1
+```
+
+Launch the deployment.
+
+```bash
+kubectl create -f r_iris_deployment.json
+```
+
+Port forward the API server.
+
+```bash
+kubectl port-forward $(kubectl get pods -n seldon -l app=seldon-apiserver-container-app -o jsonpath='{.items[0].metadata.name}') -n seldon 8080:8080
+```
+
+Ensure the tester gRPC modules are compiled.
+ +```bash +pushd ../../../util/api_tester ; make build_protos ; popd +``` + +Send test request +```bash +python ../../../util/api_tester/api-tester.py contract.json 0.0.0.0 8080 --oauth-key oauth-key --oauth-secret oauth-secret -p +``` + + diff --git a/examples/models/r_iris/contract.json b/examples/models/r_iris/contract.json new file mode 100644 index 0000000000..ab7102d7e5 --- /dev/null +++ b/examples/models/r_iris/contract.json @@ -0,0 +1,39 @@ +{ + "features":[ + { + "name":"sepal_length", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[4,8] + }, + { + "name":"sepal_width", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[2,5] + }, + { + "name":"petal_length", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[1,10] + }, + { + "name":"petal_width", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[0,3] + } + ], + "targets":[ + { + "name":"class", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[0,1], + "repeat":3 + } + ] +} + + diff --git a/examples/models/r_iris/install.R b/examples/models/r_iris/install.R new file mode 100644 index 0000000000..f697261127 --- /dev/null +++ b/examples/models/r_iris/install.R @@ -0,0 +1 @@ +install.packages('rpart') diff --git a/examples/models/r_iris/iris.R b/examples/models/r_iris/iris.R new file mode 100644 index 0000000000..90230af088 --- /dev/null +++ b/examples/models/r_iris/iris.R @@ -0,0 +1,14 @@ +library(methods) + +predict.iris <- function(iris,newdata=list()) { + predict(iris$model, newdata = newdata) +} + +new_iris <- function(filename) { + model <- readRDS(filename) + structure(list(model=model), class = "iris") +} + +initialise_seldon <- function(params) { + new_iris("model.Rds") +} \ No newline at end of file diff --git a/examples/models/r_iris/r_iris_deployment.json b/examples/models/r_iris/r_iris_deployment.json new file mode 100644 index 0000000000..e4183bce90 --- /dev/null +++ b/examples/models/r_iris/r_iris_deployment.json @@ -0,0 +1,53 @@ +{ + "apiVersion": "machinelearning.seldon.io/v1alpha1", + "kind": "SeldonDeployment", + "metadata": { + "labels": { + "app": "seldon" + }, + "name": "seldon-deployment-example" + }, + "spec": { + "annotations": { + "project_name": "Iris classification", + "deployment_version": "0.1" + }, + "name": "r-iris-deployment", + "oauth_key": "oauth-key", + "oauth_secret": "oauth-secret", + "predictors": [ + { + "componentSpec": { + "spec": { + "containers": [ + { + "image": "r-iris:0.1", + "imagePullPolicy": "IfNotPresent", + "name": "r-iris-classifier", + "resources": { + "requests": { + "memory": "1Mi" + } + } + } + ], + "terminationGracePeriodSeconds": 20 + } + }, + "graph": { + "children": [], + "name": "r-iris-classifier", + "endpoint": { + "type" : "REST" + }, + "type": "MODEL" + }, + "name": "r-iris-predictor", + "replicas": 1, + "annotations": { + "predictor_version" : "0.1" + } + } + ] + } +} diff --git a/examples/models/r_iris/train.R b/examples/models/r_iris/train.R new file mode 100644 index 0000000000..b0740cda0f --- /dev/null +++ b/examples/models/r_iris/train.R @@ -0,0 +1,6 @@ +library(rpart) + +data(iris) +names(iris) <- tolower(sub('.', '_', names(iris), fixed = TRUE)) +fit <- rpart(species ~ ., iris) +saveRDS(fit, file = "model.Rds", compress = TRUE) diff --git a/examples/models/sklearn_iris/contract.json b/examples/models/sklearn_iris/contract.json index 5d80dcbf73..ab7102d7e5 100644 --- a/examples/models/sklearn_iris/contract.json +++ b/examples/models/sklearn_iris/contract.json @@ -1,11 +1,28 @@ { "features":[ { - "name":"x", + "name":"sepal_length", 
"dtype":"FLOAT", "ftype":"continuous", - "range":[0,1], - "repeat":4 + "range":[4,8] + }, + { + "name":"sepal_width", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[2,5] + }, + { + "name":"petal_length", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[1,10] + }, + { + "name":"petal_width", + "dtype":"FLOAT", + "ftype":"continuous", + "range":[0,3] } ], "targets":[ diff --git a/util/api_tester/Makefile b/util/api_tester/Makefile index 711d2ab15b..bbc113a21e 100644 --- a/util/api_tester/Makefile +++ b/util/api_tester/Makefile @@ -1,5 +1,7 @@ .PHONY: update_protos update_protos: + mkdir -p ./proto + touch ./proto/__init__.py cp ../../proto/prediction.proto ./proto .PHONY: build_protos diff --git a/wrappers/s2i/R/Dockerfile b/wrappers/s2i/R/Dockerfile new file mode 100644 index 0000000000..a579738ef4 --- /dev/null +++ b/wrappers/s2i/R/Dockerfile @@ -0,0 +1,23 @@ +FROM rocker/r-base + +LABEL io.openshift.s2i.scripts-url="image:///s2i/bin" + +RUN apt-get update -qq && apt-get install -y \ + git-core \ + libssl-dev \ + libcurl4-gnutls-dev + +RUN Rscript -e "install.packages('devtools')" +RUN Rscript -e "install.packages('plumber')" +RUN Rscript -e "install.packages('optparse')" +RUN Rscript -e "install.packages('jsonlite')" +RUN Rscript -e "install.packages('urltools')" + +RUN mkdir microservice +WORKDIR /microservice + +COPY microservice.R /microservice + +COPY ./s2i/bin/ /s2i/bin + +EXPOSE 5000 diff --git a/wrappers/s2i/R/Makefile b/wrappers/s2i/R/Makefile new file mode 100644 index 0000000000..56111e4aff --- /dev/null +++ b/wrappers/s2i/R/Makefile @@ -0,0 +1,18 @@ +IMAGE_NAME = docker.io/seldonio/seldon-core-s2i-r + +SELDON_CORE_DIR=../../.. + + +.PHONY: build +build: + docker build -t $(IMAGE_NAME) . + +push_to_dockerhub: + docker push $(IMAGE_NAME):latest + + +.PHONY: test +test: + docker build -t $(IMAGE_NAME)-candidate . + IMAGE_NAME=$(IMAGE_NAME)-candidate test/run + diff --git a/wrappers/s2i/R/microservice.R b/wrappers/s2i/R/microservice.R new file mode 100644 index 0000000000..3a946047c3 --- /dev/null +++ b/wrappers/s2i/R/microservice.R @@ -0,0 +1,331 @@ +library(plumber) +library(jsonlite) +library(optparse) +library(methods) +library(urltools) +library(stringi) + +parseQS <- function(qs){ + if (is.null(qs) || length(qs) == 0 || qs == "") { + return(list()) + } + if (stri_startswith_fixed(qs, "?")) { + qs <- substr(qs, 2, nchar(qs)) + } + + parts <- strsplit(qs, "&", fixed = TRUE)[[1]] + kv <- strsplit(parts, "=", fixed = TRUE) + kv <- kv[sapply(kv, length) == 2] # Ignore incompletes + + keys <- sapply(kv, "[[", 1) + keys <- unname(sapply(keys, url_decode)) + + vals <- sapply(kv, "[[", 2) + vals[is.na(vals)] <- "" + vals <- unname(sapply(vals, url_decode)) + + ret <- as.list(vals) + names(ret) <- keys + + # If duplicates, combine + combine_elements <- function(name){ + unname(unlist(ret[names(ret)==name])) + } + + unique_names <- unique(names(ret)) + + ret <- lapply(unique_names, combine_elements) + names(ret) <- unique_names + + ret +} + + +v <- function(...) 
cat(sprintf(...), sep='', file=stdout()) + +validate_json <- function(jdf) { + if (!"data" %in% names(jdf)) { + return("data field is missing") + } + else if (!("ndarray" %in% names(jdf$data) || "tensor" %in% names(jdf$data)) ) { + return("data field must contain ndarray or tensor field") + } + else{ + return("OK") + } +} + +validate_feedback <- function(jdf) { + if (!"request" %in% names(jdf)) + { + return("request field is missing") + } + else if (!"reward" %in% names(jdf)) + { + return("reward field is missing") + } + else if (!"data" %in% names(jdf$request)) { + return("data request field is missing") + } + else if (!("ndarray" %in% names(jdf$request$data) || "tensor" %in% names(jdf$request$data)) ) { + return("data field must contain ndarray or tensor field") + } + else{ + return("OK") + } +} + +extract_data <- function(jdf) { + if ("ndarray" %in% names(jdf$data)){ + jdf$data$ndarray + } else { + data <- jdf$data$tensor$values + dim(data) <- jdf$data$tensor$shape + data + } +} + +extract_names <- function(jdf) { + if ("names" %in% names(jdf$data)) { + jdf$data$names + } else { + list() + } +} + +create_response <- function(req_df,res_df){ + if ("ndarray" %in% names(req_df$data)){ + templ <- '{"data":{"names":%s,"ndarray":%s}}' + names <- toJSON(colnames(res_df)) + values <- toJSON(as.matrix(res_df)) + sprintf(templ,names,values) + } else { + templ <- '{"data":{"names":%s,"tensor":{"shape":%s,"values":%s}}}' + names <- toJSON(colnames(res_df)) + values <- toJSON(c(res_df)) + dims <- toJSON(dim(res_df)) + sprintf(templ,names,dims,values) + } +} + +create_dataframe <- function(jdf) { + data = extract_data(jdf) + names = extract_names(jdf) + df <- data.frame(data) + colnames(df) <- names + df +} + +# See https://github.com/trestletech/plumber/issues/105 +parse_data <- function(req){ + parsed <- parseQS(req$postBody) + if (is.null(parsed$json)) + { + parsed <- parseQS(req$QUERY_STRING) + } + parsed$json +} + +predict_endpoint <- function(req,res,json=NULL,isDefault=NULL) { + #for ( obj in ls(req) ) { + # print(c(obj,get(obj,envir = req))) + #} + json <- parse_data(req) # Hack as Plumber using URLDecode which doesn't decode + + jdf <- fromJSON(json) + valid_input <- validate_json(jdf) + if (valid_input[1] == "OK") { + df <- create_dataframe(jdf) + scores <- predict(user_model,newdata=df) + res_json = create_response(jdf,scores) + res$body <- res_json + res + } else { + res$status <- 400 # Bad request + list(error=jsonlite::unbox(valid_input)) + } +} + +send_feedback_endpoint <- function(req,res,json=NULL,isDefault=NULL) { + json <- parse_data(req) + jdf <- fromJSON(json) + valid_input <- validate_feedback(jdf) + if (valid_input[1] == "OK") { + request <- create_dataframe(jdf$request) + if ("truth" %in% names(jdf)){ + truth <- create_dataframe(jdf$truth) + } else { + truth <- NULL + } + #reward <- jdf$reward + send_feedback(user_model,request=request,reward=1,truth=truth) + res$body <- "{}" + res + } else { + res$status <- 400 # Bad request + list(error=jsonlite::unbox(valid_input)) + } +} + + +transform_input_endpoint <- function(req,res,json=NULL,isDefault=NULL) { + json <- parse_data(req) + jdf <- fromJSON(json) + valid_input <- validate_json(jdf) + if (valid_input[1] == "OK") { + df <- create_dataframe(jdf) + trans <- transform_input(user_model,newdata=df) + res_json = create_response(jdf,trans) + res$body <- res_json + res + } else { + res$status <- 400 # Bad request + list(error=jsonlite::unbox(valid_input)) + } +} + +transform_output_endpoint <- 
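+# Handler for /transform-output requests: parse and validate the payload, apply the
+# user's transform_output method and return the result as a SeldonMessage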
function(req,res,json=NULL,isDefault=NULL) { + json <- parse_data(req) + jdf <- fromJSON(json) + valid_input <- validate_json(jdf) + if (valid_input[1] == "OK") { + df <- create_dataframe(jdf) + trans <- transform_output(user_model,newdata=df) + res_json = create_response(jdf,trans) + res$body <- res_json + res + } else { + res$status <- 400 # Bad request + list(error=jsonlite::unbox(valid_input)) + } +} + +route_endpoint <- function(req,res,json=NULL,isDefault=NULL) { + json <- parse_data(req) + jdf <- fromJSON(json) + valid_input <- validate_json(jdf) + if (valid_input[1] == "OK") { + df <- create_dataframe(jdf) + routing <- route(user_model,data=df) + res_json = create_response(jdf,data.frame(list(routing))) + res$body <- res_json + res + } else { + res$status <- 400 # Bad request + list(error=jsonlite::unbox(valid_input)) + } +} + +parse_commandline <- function() { + parser <- OptionParser() + parser <- add_option(parser, c("-p", "--parameters"), type="character", + help="Parameters for component", metavar = "parameters") + parser <- add_option(parser, c("-m", "--model"), type="character", + help="Model file", metavar = "model") + parser <- add_option(parser, c("-s", "--service"), type="character", + help="Service type", metavar = "service", default = "MODEL") + parser <- add_option(parser, c("-a", "--api"), type="character", + help="API type - REST", metavar = "api", default = "REST") + parser <- add_option(parser, c("-e", "--persistence"), type="integer", + help="Persistence", metavar = "persistence", default = 0) + args <- parse_args(parser, args = commandArgs(trailingOnly = TRUE), + convert_hyphens_to_underscores = TRUE) + + if (is.null(args$parameters)){ + args$parameters <- Sys.getenv("PREDICTIVE_UNIT_PARAMETERS") + } + + if (args$parameters == ''){ + args$parameters = "[]" + } + + args +} + + +extract_parmeters <- function(params) { + j = fromJSON(params) + values <- list() + names <- list() + for (i in seq_along(j)) + { + name <- j[i,"name"] + value <- j[i,"value"] + type <- j[i,"type"] + if (type == "INT") + value <- as.integer(value) + else if (type == "FLOAT") + value <- as.double(value) + else if (type == "BOOL") + value <- as.logical(type.convert(value)) + values <- c(values,value) + names <- c(names,name) + } + names(values) <- names + values +} + +validate_commandline <- function(args) { + if (!is.element(args$service,c("MODEL","ROUTER","COMBINER","TRANSFORMER"))) { + v("Invalid service type [%s]\n",args$service) + 1 + }else if (!is.element(args$api,c("REST"))) { + v("Invalid API type [%s]\n",args$api) + 1 + } + else{ + 0 + } +} + +# Parse command line and validate +args <- parse_commandline() +if (validate_commandline(args) > 0){ + quit(status=1) +} +params <- extract_parmeters(args$parameters) + +# Check user model exists +if(!file.exists(args$model)){ + v("Model file does not exist [%s]\n",args$model) + quit(status=1) +} + +#Load user model +source(args$model) +user_model <- initialise_seldon(params) + +# Setup generics +# Predict already exists in base R +send_feedback <- function(x,...) UseMethod("send_feedback", x) +route <- function(x,...) UseMethod("route",x) +transform_input <- function(x,...) UseMethod("transform_input",x) +transform_output <- function(x,...) 
UseMethod("transform_output",x) + +serve_model <- plumber$new() +if (args$service == "MODEL") { + serve_model$handle("POST", "/predict",predict_endpoint) + serve_model$handle("GET", "/predict",predict_endpoint) +} else if (args$service == "ROUTER") { + serve_model$handle("POST", "/route",route_endpoint) + serve_model$handle("GET", "/route",route_endpoint) + serve_model$handle("POST", "/send-feedback",send_feedback_endpoint) + serve_model$handle("GET", "/send-feedback",send_feedback_endpoint) +} else if (args$service == "TRANSFORMER") { + serve_model$handle("POST", "/transform-output",transform_output_endpoint) + serve_model$handle("GET", "/transform-output",transform_output_endpoint) + serve_model$handle("POST", "/transform-input",transform_input_endpoint) + serve_model$handle("GET", "/transform-input",transform_input_endpoint) + +} else +{ + v("Unknown service type [%s]\n",args$service) + quit(status=1) +} + +port <- Sys.getenv("PREDICTIVE_UNIT_SERVICE_PORT") +if (port == ''){ + port <- 5000 +} else { + port <- as.integer(port) +} +serve_model$run(host="0.0.0.0", port = port) diff --git a/wrappers/s2i/R/s2i/bin/assemble b/wrappers/s2i/R/s2i/bin/assemble new file mode 100755 index 0000000000..89289ad0d8 --- /dev/null +++ b/wrappers/s2i/R/s2i/bin/assemble @@ -0,0 +1,57 @@ +#!/bin/bash -e +# +# S2I assemble script for the 'seldon-core-s2i-python' image. +# The 'assemble' script builds your application source so that it is ready to run. +# +# For more information refer to the documentation: +# https://github.com/openshift/source-to-image/blob/master/docs/builder_image.md +# + +# If the 'seldon-core-s2i-r' assemble script is executed with the '-h' flag, print the usage. +if [[ "$1" == "-h" ]]; then + exec /usr/libexec/s2i/usage +fi + + +if [[ -z "$MODEL_NAME" ]]; then + + echo "Failed to find required env var MODEL_NAME" + exit 1 +fi + +if [[ -z "$API_TYPE" ]]; then + + echo "Failed to find required env var API_TYPE, should be either REST or GRPC." + exit 1 +fi + +if [[ -z "$SERVICE_TYPE" ]]; then + + echo "Failed to find required env var SERVICE_TYPE, should be one of MODEL, ROUTER, TRANSFORMER, COMBINER." + exit 1 +fi + +if [[ -z "$PERSISTENCE" ]]; then + + echo "Failed to find required env var PERSISTENCE, should be 0 or 1." + exit 1 +fi + + +cd /microservice + +# Restore artifacts from the previous build (if they exist). +# +if [ "$(ls /tmp/artifacts/ 2>/dev/null)" ]; then + echo "---> Restoring build artifacts..." + mv /tmp/artifacts/. ./ +fi + +echo "---> Installing application source..." +cp -Rf /tmp/src/. ./ + +if [[ -f install.R ]]; then + echo "---> Installing dependencies ..." + Rscript install.R +fi + diff --git a/wrappers/s2i/R/s2i/bin/run b/wrappers/s2i/R/s2i/bin/run new file mode 100755 index 0000000000..8c3d01b8bb --- /dev/null +++ b/wrappers/s2i/R/s2i/bin/run @@ -0,0 +1,22 @@ +#!/bin/bash -e +# +# S2I run script for the 'seldon-core-s2i-python' image. +# The run script executes the server that runs your application. 
+# +# For more information see the documentation: +# https://github.com/openshift/source-to-image/blob/master/docs/builder_image.md +# + +#check environment vars +if [[ -z "$MODEL_NAME" || -z "$API_TYPE" || -z "$SERVICE_TYPE" || -z "$PERSISTENCE" ]]; then + + echo "Failed to find required env vars MODEL_NAME, API_TYPE, SERVICE_TYPE, PERSISTENCE" + exit 1 + +else + cd /microservice + echo "starting microservice" + exec Rscript microservice.R --model $MODEL_NAME --api $API_TYPE --service $SERVICE_TYPE --persistence $PERSISTENCE + +fi + diff --git a/wrappers/s2i/R/s2i/bin/save-artifacts b/wrappers/s2i/R/s2i/bin/save-artifacts new file mode 100755 index 0000000000..cd931a2678 --- /dev/null +++ b/wrappers/s2i/R/s2i/bin/save-artifacts @@ -0,0 +1,10 @@ +#!/bin/sh -e +# +# S2I save-artifacts script for the 'seldon-core-s2i-python' image. +# The save-artifacts script streams a tar archive to standard output. +# The archive contains the files and folders you want to re-use in the next build. +# +# For more information see the documentation: +# https://github.com/openshift/source-to-image/blob/master/docs/builder_image.md +# +# tar cf - diff --git a/wrappers/s2i/R/s2i/bin/usage b/wrappers/s2i/R/s2i/bin/usage new file mode 100755 index 0000000000..8d48454196 --- /dev/null +++ b/wrappers/s2i/R/s2i/bin/usage @@ -0,0 +1,46 @@ +#!/bin/bash -e +cat <"/dev/null" || READLINK_EXEC="greadlink" + ! type -a "gmktemp" &>"/dev/null" || MKTEMP_EXEC="gmktemp" +fi + +test_dir="$($READLINK_EXEC -zf $(dirname "${BASH_SOURCE[0]}"))" +image_dir=$($READLINK_EXEC -zf ${test_dir}/..) + + +# Since we built the candidate image locally, we don't want S2I to attempt to pull +# it from Docker hub +s2i_args="--pull-policy=never --loglevel=2" + +# Port the image exposes service to be tested +test_port=5000 + +image_exists() { + docker inspect $1 &>/dev/null +} + +container_exists() { + image_exists $(cat $cid_file) +} + +container_ip() { + if [ ! -z "$DOCKER_HOST" ] && [[ "$OSTYPE" =~ 'darwin' ]]; then + docker-machine ip + else + docker inspect --format="{{ .NetworkSettings.IPAddress }}" $(cat $cid_file) + fi +} + +container_port() { + if [ ! -z "$DOCKER_HOST" ] && [[ "$OSTYPE" =~ 'darwin' ]]; then + docker inspect --format="{{(index .NetworkSettings.Ports \"$test_port/tcp\" 0).HostPort}}" "$(cat "${cid_file}")" + else + echo $test_port + fi +} + +run_s2i_build() { + prefix=$1 + s2i build --incremental=true ${s2i_args} file://${test_dir}/${prefix}-template-app ${IMAGE_NAME} ${IMAGE_NAME}-testapp +} + +prepare() { + prefix=$1 + if ! image_exists ${IMAGE_NAME}; then + echo "ERROR: The image ${IMAGE_NAME} must exist before this script is executed." 
+ exit 1 + fi + # s2i build requires the application is a valid 'Git' repository + pushd ${test_dir}/${prefix}-template-app >/dev/null + git init + git config user.email "build@localhost" && git config user.name "builder" + git add -A && git commit -m "Sample commit" + popd >/dev/null + run_s2i_build ${prefix} +} + +run_test_application() { + docker run --rm --cidfile=${cid_file} -p ${test_port} ${IMAGE_NAME}-testapp +} + +cleanup() { + if [ -f $cid_file ]; then + if container_exists; then + docker stop $(cat $cid_file) + fi + fi + if image_exists ${IMAGE_NAME}-testapp; then + docker rmi ${IMAGE_NAME}-testapp + fi +} + +check_result() { + local result="$1" + if [[ "$result" != "0" ]]; then + echo "S2I image '${IMAGE_NAME}' test FAILED (exit code: ${result})" + cleanup + exit $result + fi +} + +wait_for_cid() { + local max_attempts=10 + local sleep_time=1 + local attempt=1 + local result=1 + while [ $attempt -le $max_attempts ]; do + [ -f $cid_file ] && break + echo "Waiting for container to start..." + attempt=$(( $attempt + 1 )) + sleep $sleep_time + done +} + +test_usage() { + echo "Testing 's2i usage'..." + s2i usage ${s2i_args} ${IMAGE_NAME} &>/dev/null +} + +test_seldonMessage() { + local endpoint=$1 + echo "Testing $type HTTP connection (http://$(container_ip):$(container_port)${endpoint})" + local max_attempts=10 + local sleep_time=1 + local attempt=1 + local result=1 + while [ $attempt -le $max_attempts ]; do + data='{"data":{"names":["a","b"],"ndarray":[[1.0,2.0]]}}' + echo "Sending GET request to http://$(container_ip):$(container_port)${endpoint}" + response_code=$(curl -s -w %{http_code} -o /dev/null -d "json=${data}" http://$(container_ip):$(container_port)${endpoint}) + status=$? + if [ $status -eq 0 ]; then + if [ $response_code -eq 200 ]; then + result=0 + fi + break + fi + attempt=$(( $attempt + 1 )) + sleep $sleep_time + done + return $result +} + +test_feedback() { + local endpoint=$1 + echo "Testing $type HTTP connection (http://$(container_ip):$(container_port)${endpoint})" + local max_attempts=10 + local sleep_time=1 + local attempt=1 + local result=1 + while [ $attempt -le $max_attempts ]; do + data='{"request":{"data":{"names":["a","b"],"ndarray":[[1.0,2.0]]}},"response":{"meta":{"routing":{"router":0}},"data":{"names":["a","b"],"ndarray":[[1.0,2.0]]}},"reward":1}' + echo "Sending GET request to http://$(container_ip):$(container_port)${endpoint}" + response_code=$(curl -s -w %{http_code} -o /dev/null -d "json=${data}" http://$(container_ip):$(container_port)${endpoint}) + status=$? + if [ $status -eq 0 ]; then + if [ $response_code -eq 200 ]; then + result=0 + fi + break + fi + attempt=$(( $attempt + 1 )) + sleep $sleep_time + done + return $result +} + +# Build the application image twice to ensure the 'save-artifacts' and +# 'restore-artifacts' scripts are working properly +array=( 'transformer' 'model' 'router' ) +for i in "${array[@]}" +do + cid_file=$($MKTEMP_EXEC -u --suffix=.cid) + echo $i + + prepare ${i} + run_s2i_build ${i} + check_result $? + + # Verify the 'usage' script is working properly + test_usage + check_result $? + + # Verify that the HTTP connection can be established to test application container + run_test_application & + + # Wait for the container to write its CID file + wait_for_cid + + if [ "$i" = "model" ]; then + test_seldonMessage "/predict" + check_result $? + elif [ "$i" = "router" ]; then + test_seldonMessage "/route" + check_result $? + test_feedback "/send-feedback" + check_result $? 
+ elif [ "$i" = "transformer" ]; then + test_seldonMessage "/transform-input" + check_result $? + test_seldonMessage "/transform-output" + check_result $? + fi + + + + cleanup +done diff --git a/wrappers/s2i/R/test/transformer-template-app/.s2i/environment b/wrappers/s2i/R/test/transformer-template-app/.s2i/environment new file mode 100644 index 0000000000..67f83d2f96 --- /dev/null +++ b/wrappers/s2i/R/test/transformer-template-app/.s2i/environment @@ -0,0 +1,4 @@ +MODEL_NAME=MyTransformer.R +API_TYPE=REST +SERVICE_TYPE=TRANSFORMER +PERSISTENCE=0 diff --git a/wrappers/s2i/R/test/transformer-template-app/MyTransformer.R b/wrappers/s2i/R/test/transformer-template-app/MyTransformer.R new file mode 100644 index 0000000000..b1acfbd0a4 --- /dev/null +++ b/wrappers/s2i/R/test/transformer-template-app/MyTransformer.R @@ -0,0 +1,21 @@ +library(methods) + +transform_input.f <- function(f,newdata=list()) { + write("Transform input", stdout()) + newdata +} + +transform_output.f <- function(f,newdata=list()) { + write("Transform output", stdout()) + newdata +} + +new_f <- function() { + structure(list(), class = "f") +} + + +initialise_seldon <- function(params) { + new_f() +} + diff --git a/wrappers/testing/tester.py b/wrappers/testing/tester.py index 80bf0671b9..e163f974b1 100644 --- a/wrappers/testing/tester.py +++ b/wrappers/testing/tester.py @@ -140,7 +140,7 @@ def run(args): REST_request = gen_REST_request(batch,features=feature_names,tensor=args.tensor) if args.prnt: print(REST_request) - + response = requests.post( REST_url, data={"json":json.dumps(REST_request),"isDefault":True})