Support fate v1.11.1 #30

Merged · 20 commits · Apr 27, 2023
9 changes: 9 additions & 0 deletions docker-build/README.md
@@ -94,6 +94,9 @@ FATE_DIR=/root/FATE bash build.sh all
| `Build_NN` | Build images containing the NN algorithm | 1 |
| `Build_Spark` | Build images of the Spark computing engine | 1 |
| `Build_IPCL` | Build images that support IPCL | 0 |
| `IPCL_PKG_DIR` | IPCL code path | None |
| `IPCL_VERSION` | IPCL version | v1.1.3 |
| `Build_GPU` | Build images that support GPU | 0 |

The command builds the base images first and then the component images. Once it finishes, all FATE images should have been created. Use `docker images` to check the newly generated images:

@@ -114,6 +117,12 @@ federatedai/python <TAG>
federatedai/base-image <TAG>
```
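
If many unrelated images are present on the host, the list can be narrowed to the ones produced by a particular build, for example (a sketch assuming the default `federatedai` prefix; the tag value below is only an example):

```sh
# List only the FATE images built with a specific tag.
docker images --filter "reference=federatedai/*:1.11.1-release"
```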

To build all types of images, use the following command:

```sh
FATE_DIR=/root/FATE TAG=1.11.1-release Build_Basic=1 Build_NN=1 Build_FUM=1 Build_Spark=1 Build_OP=1 Build_IPCL=1 Build_GPU=1 IPCL_PKG_DIR=/root/pailliercryptolib_python/ IPCL_VERSION=v1.1.3 bash docker-build/build.sh all
```

### Pushing images to a registry (optional)

To share the docker images with multiple nodes, images can be pushed to a registry (such as Docker Hub or Harbor registry).
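
For example, a single image can be retagged and pushed by hand; the registry address below is a placeholder and should be replaced with your own Harbor or Docker Hub namespace:

```sh
# Hypothetical example: push the python image to a private registry.
docker login registry.example.com
docker tag federatedai/python:1.11.1-release registry.example.com/federatedai/python:1.11.1-release
docker push registry.example.com/federatedai/python:1.11.1-release
```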
9 changes: 9 additions & 0 deletions docker-build/README_zh.md
@@ -95,6 +95,9 @@ FATE_DIR=/root/FATE bash build.sh all
| `Build_NN` | Build images containing the NN algorithm | 1 |
| `Build_Spark` | Build images of the Spark computing engine | 1 |
| `Build_IPCL` | Build images that support IPCL | 0 |
| `IPCL_PKG_DIR` | IPCL code path | None |
| `IPCL_VERSION` | IPCL version | v1.1.3 |
| `Build_GPU` | Build images that support GPU | 0 |

All the `Dockerfile` files used to build the images are stored under the `docker/` subdirectory. After the script finishes, the built images can be checked with the following command:

@@ -115,6 +118,12 @@ federatedai/python <TAG>
federatedai/base-image <TAG>
```

To build all types of images, use the following command:

```sh
FATE_DIR=/root/FATE TAG=1.11.1-release Build_Basic=1 Build_NN=1 Build_FUM=1 Build_Spark=1 Build_OP=1 Build_IPCL=1 Build_GPU=1 IPCL_PKG_DIR=/root/pailliercryptolib_python/ IPCL_VERSION=v1.1.3 bash docker-build/build.sh all
```

### Pushing images to a registry (optional)

To push the built images to a registry such as Docker Hub, first log in to the corresponding account with the following command:
4 changes: 2 additions & 2 deletions docker-build/base/basic/Dockerfile
@@ -11,7 +11,7 @@ RUN set -eux && \
rpm --rebuilddb && \
rpm --import /etc/pki/rpm-gpg/RPM* && \
yum -y install gcc gcc-c++ make openssl-devel supervisor gmp-devel mpfr-devel libmpc-devel \
libaio numactl autoconf automake libtool libffi-devel snappy snappy-devel zlib zlib-devel bzip2 bzip2-devel lz4-devel libasan lsof xz-devel && \
libaio numactl autoconf automake libtool libffi-devel snappy snappy-devel zlib zlib-devel bzip2 bzip2-devel lz4-devel libasan lsof xz-devel sqlite-devel && \
yum clean all

RUN curl -o Python-3.8.13.tar.xz https://www.python.org/ftp/python/3.8.13/Python-3.8.13.tar.xz && \
@@ -38,4 +38,4 @@ RUN pip install --upgrade pip && \
# sed -i '/torchvision.*/d' /data/projects/python/requirements.txt && \
sed -i '/pytorch-lightning.*/d' /data/projects/python/requirements.txt && \
sed -i '/pyspark.*/d' /data/projects/python/requirements.txt && \
pip install -r requirements.txt
pip install --no-cache-dir -r requirements.txt
2 changes: 1 addition & 1 deletion docker-build/base/ipcl/Dockerfile
@@ -9,7 +9,7 @@ SHELL ["/usr/bin/scl", "enable", "devtoolset-8"]

ENV PATH=${PATH}:/opt/python3/bin/

RUN pip install cmake==3.22 wheel
RUN pip install --no-cache-dir cmake==3.22 wheel

# install nasm
RUN wget --no-check-certificate https://www.nasm.us/pub/nasm/releasebuilds/2.15.05/nasm-2.15.05.tar.gz \
55 changes: 54 additions & 1 deletion docker-build/build.sh
@@ -27,6 +27,7 @@ set -euxo pipefail
: "${Build_Spark:=1}"
: "${Build_IPCL:=0}"
: "${IPCL_VERSION:=v1.1.3}"
: "${Build_GPU:=0}"

BASE_DIR=$(dirname "$0")
cd $BASE_DIR
@@ -166,6 +167,37 @@ buildSparkNNCPU(){

}


buildEggrollNNGPU(){
echo "### START BUILDING fateflow-nn-gpu ###"
docker build --build-arg PREFIX=${PREFIX} --build-arg BASE_IMAGE=fateflow --build-arg BASE_TAG=${BASE_TAG} ${Docker_Options} -t ${PREFIX}/fateflow-nn-gpu:${TAG} \
-f ${WORKING_DIR}/modules/gpu/Dockerfile ${PACKAGE_DIR_CACHE}
echo "### FINISH BUILDING fateflow-nn-gpu ###"
echo ""

echo "### START BUILDING eggroll-nn-gpu ###"
docker build --build-arg PREFIX=${PREFIX} --build-arg BASE_IMAGE=eggroll --build-arg BASE_TAG=${BASE_TAG} ${Docker_Options} -t ${PREFIX}/eggroll-nn-gpu:${TAG} \
-f ${WORKING_DIR}/modules/gpu/Dockerfile ${PACKAGE_DIR_CACHE}
echo "### FINISH BUILDING eggroll-nn-gpu ###"
echo ""

}

buildSparkNNGPU(){
echo "### START BUILDING fateflow-spark-nn-gpu ###"
docker build --build-arg PREFIX=${PREFIX} --build-arg BASE_IMAGE=fateflow-spark --build-arg BASE_TAG=${BASE_TAG} ${Docker_Options} -t ${PREFIX}/fateflow-spark-nn-gpu:${TAG} \
-f ${WORKING_DIR}/modules/gpu/Dockerfile ${PACKAGE_DIR_CACHE}
echo "### FINISH BUILDING fateflow-spark-nn-gpu ###"
echo ""

echo "### START BUILDING spark-worker-nn-gpu ###"
docker build --build-arg PREFIX=${PREFIX} --build-arg BASE_IMAGE=spark-worker --build-arg BASE_TAG=${BASE_TAG} ${Docker_Options} -t ${PREFIX}/spark-worker-nn-gpu:${TAG} \
-f ${WORKING_DIR}/modules/gpu/Dockerfile ${PACKAGE_DIR_CACHE}
echo "### FINISH BUILDING spark-worker-nn-gpu ###"
echo ""

}

buildEggrollBasicIPCL(){
echo "### START BUILDING base-ipcl ###"
docker build --build-arg PREFIX=${PREFIX} --build-arg BASE_TAG=${BASE_TAG} ${Docker_Options} -t ${PREFIX}/base-image-ipcl:${TAG} -f ${WORKING_DIR}/base/ipcl/Dockerfile ${PACKAGE_DIR_CACHE}
@@ -253,7 +285,8 @@ buildModule(){
[ "$Build_IPCL" -gt 0 ] && buildEggrollBasicIPCL
[ "$Build_Spark" -gt 0 ] && [ "$Build_IPCL" -gt 0 ] && buildSparkBasicIPCL
[ "$Build_OP" -gt 0 ] && [ "$Build_IPCL" -gt 0 ] && buildOptionalIPCLModule

[ "$Build_GPU" -gt 0 ] && buildEggrollNNGPU
[ "$Build_GPU" -gt 0 ] && [ "$Build_Spark" -gt 0 ] && buildSparkNNGPU
}

pushImage() {
@@ -352,6 +385,26 @@ pushImage() {
echo ""
done
fi

if [ "$Build_GPU" -gt 0 ]
then
for module in "eggroll-nn-gpu" "fateflow-nn-gpu" ; do
echo "### START PUSH ${module} ###"
docker push ${PREFIX}/${module}:${TAG}
echo "### FINISH PUSH ${module} ###"
echo ""
done
fi

if [ "$Build_GPU" -gt 0 ] && [ "$Build_Spark" -gt 0 ]
then
for module in "spark-worker-nn-gpu" "fateflow-spark-nn-gpu" ; do
echo "### START PUSH ${module} ###"
docker push ${PREFIX}/${module}:${TAG}
echo "### FINISH PUSH ${module} ###"
echo ""
done
fi
}

# start
10 changes: 7 additions & 3 deletions docker-build/build_docker.sh
@@ -22,7 +22,7 @@ source_dir=$(
cd ../
pwd
)
support_modules=(bin conf examples build deploy proxy fate fateflow fateboard eggroll)
support_modules=(bin conf examples build deploy proxy fate fateflow fateboard eggroll doc)
[ "$Build_IPCL" -gt 0 ] && support_modules[${#support_modules[@]}]=ipcl_pkg
environment_modules=(python36 jdk pypi)
packaging_modules=()
@@ -70,6 +70,10 @@ function packaging_deploy() {
packaging_general_dir "deploy"
}

function packaging_doc() {
packaging_general_dir "doc"
}

function packaging_general_dir() {
dir_name=$1
echo "[INFO] package ${dir_name} start"
@@ -104,7 +108,7 @@ packaging_fateboard() {
fateboard_version=$(grep -E -m 1 -o "<version>(.*)</version>" ./pom.xml | tr -d '[\\-a-z<>//]' | awk -F "version" '{print $2}')
echo "[INFO] fateboard version "${fateboard_version}

docker run --rm -u $(id -u):$(id -g) -v ${source_dir}/fateboard:/data/projects/fate/fateboard --entrypoint="" maven:3.6-jdk-8 /bin/bash -c "cd /data/projects/fate/fateboard && mvn clean package -DskipTests"
docker run --rm -u $(id -u):$(id -g) -v ${source_dir}/fateboard:/data/projects/fate/fateboard --entrypoint="" maven:3.8-jdk-8 /bin/bash -c "cd /data/projects/fate/fateboard && mvn clean package -DskipTests"
mkdir -p ${package_dir}/fateboard/conf
mkdir -p ${package_dir}/fateboard/ssh
cp ./target/fateboard-${fateboard_version}.jar ${package_dir}/fateboard/
@@ -121,7 +125,7 @@ packaging_eggroll() {
pull_eggroll
cd ./eggroll
cd ./deploy
docker run --rm -u $(id -u):$(id -g) -v ${source_dir}/eggroll:/data/projects/fate/eggroll --entrypoint="" maven:3.6-jdk-8 /bin/bash -c "cd /data/projects/fate/eggroll/deploy && bash auto-packaging.sh"
docker run --rm -u $(id -u):$(id -g) -v ${source_dir}/eggroll:/data/projects/fate/eggroll --entrypoint="" maven:3.8-jdk-8 /bin/bash -c "cd /data/projects/fate/eggroll/deploy && bash auto-packaging.sh"
mkdir -p ${package_dir}/eggroll
mv ${source_dir}/eggroll/eggroll.tar.gz ${package_dir}/eggroll/
cd ${package_dir}/eggroll/
21 changes: 5 additions & 16 deletions docker-build/modules/client/Dockerfile
@@ -1,27 +1,16 @@
ARG PREFIX=prefix
ARG BASE_TAG=tag
ARG BASE_IMAGE=image
FROM ${PREFIX}/${BASE_IMAGE}:${BASE_TAG} as data
FROM ${PREFIX}/${BASE_IMAGE}:${BASE_TAG}

WORKDIR /data/projects/fate/

RUN cd /data/projects/fate/fate/python/fate_client; \
python setup.py bdist_wheel;

FROM python:3.8

WORKDIR /data/projects/fate/

RUN apt-get update && apt-get install -y vim && apt-get clean

COPY pipeline /data/projects/fate/pipeline
RUN pip install notebook torch pandas sklearn markupsafe==2.0.1
RUN mkdir /data/projects/fate/logs
COPY --from=data /data/projects/fate/examples /data/projects/fate/examples
COPY --from=data /data/projects/fate/fateflow/examples /data/projects/fate/fateflow/examples
COPY --from=data /data/projects/fate/fate/python/fate_client/dist/fate_client-*-py3-none-any.whl /data/projects/fate/
RUN cd /data/projects/fate/; pip install fate_client-*-py3-none-any.whl; rm fate_client-*-py3-none-any.whl
RUN pip install --no-cache-dir notebook torch pandas sklearn markupsafe==2.0.1

RUN cd /data/projects/fate/fate/; \
pip install -e python/fate_client; \
pip install -e python/fate_test;
ENV FATE_FLOW_IP=fateflow
ENV FATE_FLOW_PORT=9380

2 changes: 1 addition & 1 deletion docker-build/modules/fateflow-spark/Dockerfile
@@ -19,4 +19,4 @@ ENV HADOOP_HOME=/data/projects/hadoop-3.2.1
ENV LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/data/projects/hadoop-3.2.1/lib/native
ENV PATH=$PATH:/data/projects/spark-3.1.3-bin-hadoop3.2/bin:/data/projects/hadoop-3.2.1/bin

RUN pip install pyspark==3.1.3
RUN pip install --no-cache-dir pyspark==3.1.3
5 changes: 4 additions & 1 deletion docker-build/modules/fateflow/Dockerfile
@@ -11,13 +11,15 @@ COPY fateflow.tar.gz .
COPY eggroll.tar.gz .
COPY examples.tar.gz .
COPY conf.tar.gz .
COPY doc.tar.gz .
COPY fate.env .

RUN tar -xzf fate.tar.gz; \
tar -xzf fateflow.tar.gz; \
tar -xzf eggroll.tar.gz; \
tar -xzf examples.tar.gz; \
tar -xzf conf.tar.gz;
tar -xzf conf.tar.gz; \
tar -xzf doc.tar.gz;

FROM ${PREFIX}/${BASE_IMAGE}:${BASE_TAG}

@@ -29,6 +31,7 @@ COPY --from=builder /data/projects/fate/eggroll /data/projects/fate/eggroll
COPY --from=builder /data/projects/fate/examples /data/projects/fate/examples
COPY --from=builder /data/projects/fate/conf /data/projects/fate/conf
COPY --from=builder /data/projects/fate/fate.env /data/projects/fate/
COPY --from=builder /data/projects/fate/doc /data/projects/fate/doc

RUN mkdir -p ./fml_agent/data;

11 changes: 11 additions & 0 deletions docker-build/modules/gpu/Dockerfile
@@ -0,0 +1,11 @@
# runtime environment
ARG PREFIX=prefix
ARG BASE_TAG=tag
ARG BASE_IMAGE=image
FROM ${PREFIX}/${BASE_IMAGE}:${BASE_TAG}

COPY requirements.txt /data/projects/python/
# Switch torch/torchvision from the CPU-only wheels to the default (CUDA-enabled) builds:
# drop the "+cpu" suffix, remove pyspark, and drop the CPU extra index URL.
RUN sed -i '/torch==1.13.1+cpu\|torchvision==0.14.1+cpu/s/+cpu//g' /data/projects/python/requirements.txt && \
sed -i '/pyspark.*/d' /data/projects/python/requirements.txt && \
sed -i '/--extra-index-url.*/d' /data/projects/python/requirements.txt
# Reinstall the requirements so torch/torchvision come with GPU support.
RUN pip uninstall -y torch torchvision && pip install --no-cache-dir -r /data/projects/python/requirements.txt
3 changes: 2 additions & 1 deletion docker-build/modules/nn/Dockerfile
@@ -5,4 +5,5 @@ ARG BASE_IMAGE=image
FROM ${PREFIX}/${BASE_IMAGE}:${BASE_TAG}

COPY requirements.txt /data/projects/python/
RUN pip install -r /data/projects/python/requirements.txt
RUN sed -i '/pyspark.*/d' /data/projects/python/requirements.txt && \
pip install --no-cache-dir -r /data/projects/python/requirements.txt