Commit d7f71b6: add new seldon base image
dtrawins committed Feb 5, 2019, 1 parent a950a0f
Showing 2 changed files with 124 additions and 0 deletions.

wrappers/s2i/python_openvino/Dockerfile_openvino_base (79 additions, 0 deletions)
@@ -0,0 +1,79 @@
FROM intelpython/intelpython3_core as DEV
RUN apt-get update && apt-get install -y \
curl \
ca-certificates \
libgfortran3 \
vim \
build-essential \
cmake \
wget \
libssl-dev \
git \
libboost-regex-dev \
gcc-multilib \
g++-multilib \
libgtk2.0-dev \
pkg-config \
unzip \
automake \
libtool \
autoconf \
libpng-dev \
libcairo2-dev \
libpango1.0-dev \
libglib2.0-dev \
libswscale-dev \
libavcodec-dev \
libavformat-dev \
libgstreamer1.0-0 \
gstreamer1.0-plugins-base \
libusb-1.0-0-dev \
libopenblas-dev

ARG DLDT_DIR=/dldt-2018_R5
RUN git clone --depth=1 -b 2018_R5 https://github.com/opencv/dldt.git ${DLDT_DIR} && \
cd ${DLDT_DIR} && git submodule init && git submodule update --recursive && \
rm -Rf .git && rm -Rf model-optimizer

WORKDIR ${DLDT_DIR}
RUN curl -L -o ${DLDT_DIR}/mklml_lnx_2019.0.1.20180928.tgz https://github.com/intel/mkl-dnn/releases/download/v0.17.2/mklml_lnx_2019.0.1.20180928.tgz && \
tar -xzf ${DLDT_DIR}/mklml_lnx_2019.0.1.20180928.tgz && rm ${DLDT_DIR}/mklml_lnx_2019.0.1.20180928.tgz
WORKDIR ${DLDT_DIR}/inference-engine
RUN mkdir build && cd build && cmake -DGEMM=MKL -DMKLROOT=${DLDT_DIR}/mklml_lnx_2019.0.1.20180928 -DENABLE_MKL_DNN=ON -DCMAKE_BUILD_TYPE=Release ..
RUN cd build && make -j4
RUN pip install cython numpy && mkdir ie_bridges/python/build && cd ie_bridges/python/build && \
cmake -DInferenceEngine_DIR=${DLDT_DIR}/inference-engine/build -DPYTHON_EXECUTABLE=`which python` -DPYTHON_LIBRARY=/opt/conda/lib/libpython3.6m.so -DPYTHON_INCLUDE_DIR=/opt/conda/include/python3.6m .. && \
make -j4

FROM intelpython/intelpython3_core as PROD

LABEL io.openshift.s2i.scripts-url="image:///s2i/bin"

RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
ca-certificates \
build-essential \
python3-setuptools \
vim

COPY --from=DEV /dldt-2018_R5/inference-engine/bin/intel64/Release/lib/*.so /usr/local/lib/
COPY --from=DEV /dldt-2018_R5/inference-engine/ie_bridges/python/bin/intel64/Release/python_api/python3.6/openvino/ /usr/local/lib/openvino/
COPY --from=DEV /dldt-2018_R5/mklml_lnx_2019.0.1.20180928/lib/lib*.so /usr/local/lib/
ENV LD_LIBRARY_PATH=/usr/local/lib
ENV PYTHONPATH=/usr/local/lib

RUN conda create --name myenv -y
ENV PATH /opt/conda/envs/myenv/bin:$PATH
RUN conda install -y tensorflow opencv && conda clean -a -y
WORKDIR /microservice

RUN pip install wheel
RUN pip install seldon-core
RUN pip install --upgrade setuptools

COPY ./s2i/bin/ /s2i/bin

EXPOSE 5000
wrappers/s2i/python_openvino/README.md (45 additions, 0 deletions)
@@ -0,0 +1,45 @@
# Seldon base image with python and OpenVINO inference engine

## Building
```bash
cd ../../../wrappers/s2i/python_openvino
cp ../python/s2i .
docker build -f Dockerfile_openvino_base --build-arg http_proxy=$http_proxy --build-arg https_proxy=$https_proxy \
-t seldon_openvino_base:latest .
```
## Usage

This base image can be used to build Seldon components in exactly the same way as the standard Seldon base images.
Use the s2i tool as documented [here](https://github.com/SeldonIO/seldon-core/blob/master/docs/wrappers/python.md).
An example is shown below:

```bash
s2i build . seldon_openvino_base:latest {component_image_name}
```

## References

[OpenVINO toolkit](https://software.intel.com/en-us/openvino-toolkit)

[OpenVINO API docs](https://software.intel.com/en-us/articles/OpenVINO-InferEngine#inpage-nav-9)

[Seldon pipeline example](../../../examples/models/openvino_imagenet_ensemble)


## Notes

Besides the OpenVINO inference engine Python API, this Seldon base image contains several other useful components:
- Intel-optimized Python distribution
- Intel-optimized OpenCV package
- Intel-optimized TensorFlow with the MKL engine
- A preconfigured conda package manager

If you use this component to run inference with TensorFlow and MKL, make sure you also configure
the following environment variables:

`KMP_AFFINITY`=granularity=fine,verbose,compact,1,0

`KMP_BLOCKTIME`=1

`OMP_NUM_THREADS`={number of CPU cores}
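As a minimal sketch of applying these settings, the variables can also be set from Python, as long as this happens before TensorFlow (and hence the MKL/OpenMP runtime) is first imported; `multiprocessing.cpu_count()` is used here to derive the core count:

```python
import multiprocessing
import os

# These must be set before TensorFlow is imported, because the OpenMP
# runtime reads them once at initialization.
os.environ["KMP_AFFINITY"] = "granularity=fine,verbose,compact,1,0"
os.environ["KMP_BLOCKTIME"] = "1"
os.environ["OMP_NUM_THREADS"] = str(multiprocessing.cpu_count())
```

Importing TensorFlow after this point picks up the thread-pinning configuration recommended above.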
