Downgrade R in Docker image #685

Merged — 6 commits, Jun 2, 2020
14 changes: 7 additions & 7 deletions .circleci/config.yml
@@ -2,7 +2,7 @@ version: 2

defaults: &defaults
docker:
- image: palantirtechnologies/circle-spark-base:0.2.2
- image: palantirtechnologies/circle-spark-base:0.2.3
resource_class: xlarge
environment: &defaults-environment
TERM: dumb
@@ -129,7 +129,7 @@ jobs:
<<: *defaults
# Some part of the maven setup fails if there's no R, so we need to use the R image here
docker:
- image: palantirtechnologies/circle-spark-r:0.2.2
- image: palantirtechnologies/circle-spark-r:0.2.3
steps:
# Saves us from recompiling every time...
- restore_cache:
@@ -296,7 +296,7 @@ jobs:
# depends on build-sbt, but we only need the assembly jars
<<: *defaults
docker:
- image: palantirtechnologies/circle-spark-python:0.2.2
- image: palantirtechnologies/circle-spark-python:0.2.3
parallelism: 2
steps:
- *checkout-code
@@ -321,7 +321,7 @@ jobs:
# depends on build-sbt, but we only need the assembly jars
<<: *defaults
docker:
- image: palantirtechnologies/circle-spark-r:0.2.2
- image: palantirtechnologies/circle-spark-r:0.2.3
steps:
- *checkout-code
- attach_workspace:
@@ -434,7 +434,7 @@ jobs:
<<: *defaults
# Some part of the maven setup fails if there's no R, so we need to use the R image here
docker:
- image: palantirtechnologies/circle-spark-r:0.2.2
- image: palantirtechnologies/circle-spark-r:0.2.3
steps:
- *checkout-code
- restore_cache:
@@ -454,7 +454,7 @@ jobs:
deploy-gradle:
<<: *defaults
docker:
- image: palantirtechnologies/circle-spark-r:0.2.2
- image: palantirtechnologies/circle-spark-r:0.2.3
steps:
- *checkout-code
- *restore-gradle-wrapper-cache
@@ -466,7 +466,7 @@ jobs:
<<: *defaults
# Some part of the maven setup fails if there's no R, so we need to use the R image here
docker:
- image: palantirtechnologies/circle-spark-r:0.2.2
- image: palantirtechnologies/circle-spark-r:0.2.3
steps:
# This cache contains the whole project after version was set and mvn package was called
# Restoring first (and instead of checkout) as mvn versions:set mutates real source code...
2 changes: 1 addition & 1 deletion dev/docker-images/Makefile
@@ -17,7 +17,7 @@

.PHONY: all publish base python r

VERSION=0.2.2
VERSION=0.2.3
BASE_IMAGE_NAME = "palantirtechnologies/circle-spark-base:${VERSION}"
PYTHON_IMAGE_NAME = "palantirtechnologies/circle-spark-python:${VERSION}"
R_IMAGE_NAME = "palantirtechnologies/circle-spark-r:${VERSION}"
2 changes: 1 addition & 1 deletion dev/docker-images/base/Dockerfile
@@ -111,7 +111,7 @@ WORKDIR $CIRCLE_HOME
# Install miniconda, we are using it to test conda support and a bunch of tests expect CONDA_BIN to be set
ENV CONDA_ROOT=$CIRCLE_HOME/miniconda
ENV CONDA_BIN=$CIRCLE_HOME/miniconda/bin/conda
ENV MINICONDA2_VERSION=4.5.11
ENV MINICONDA2_VERSION=4.7.12.1
Author comment:
This is to get a later Python 2 version. There is a bad interaction between libffi and older Python versions (conda/conda#9884).
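The pinned version flows straight into the installer URL built by the `RUN curl` step below; a quick illustrative sketch of that interpolation (the URL pattern is taken from this Dockerfile, the `echo` is only for inspection):

```shell
# Rebuild the Miniconda installer URL from the pinned version, exactly as
# the Dockerfile's curl invocation does.
MINICONDA2_VERSION=4.7.12.1
url="https://repo.anaconda.com/miniconda/Miniconda2-${MINICONDA2_VERSION}-Linux-x86_64.sh"
echo "$url"
```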


RUN curl -sO https://repo.anaconda.com/miniconda/Miniconda2-${MINICONDA2_VERSION}-Linux-x86_64.sh \
&& bash Miniconda2-${MINICONDA2_VERSION}-Linux-x86_64.sh -b -p ${CONDA_ROOT} \
20 changes: 11 additions & 9 deletions dev/docker-images/python/Dockerfile
@@ -15,27 +15,29 @@
# limitations under the License.
#

FROM palantirtechnologies/circle-spark-base
FROM palantirtechnologies/circle-spark-base:0.2.3
Author comment:
Otherwise this defaults to latest.
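Pinning matters because a bare `FROM` silently tracks the `latest` tag. A small, hypothetical CI guard (the file names and the `check` helper are illustrative, not part of this PR) could catch unpinned base images:

```shell
# Flag Dockerfiles whose FROM line carries no explicit ":tag" suffix.
tmp=$(mktemp -d)
printf 'FROM palantirtechnologies/circle-spark-base:0.2.3\n' > "$tmp/pinned"
printf 'FROM palantirtechnologies/circle-spark-base\n' > "$tmp/unpinned"
# Succeeds only if no FROM line is missing a colon (i.e. all are tagged).
check() { ! grep -Eq '^FROM [^:]+$' "$1"; }
check "$tmp/pinned" && echo "ok: pinned"
check "$tmp/unpinned" || echo "fail: unpinned"
```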


# Install pyenv
ENV PATH="$CIRCLE_HOME/.pyenv/bin:$PATH"
RUN curl -L https://github.com/pyenv/pyenv-installer/raw/master/bin/pyenv-installer | bash \
&& cat >>.bashrc <<<'eval "$($HOME/.pyenv/bin/pyenv init -)"' \
&& cat >>.bashrc <<<'eval "$($HOME/.pyenv/bin/pyenv virtualenv-init -)"'
RUN curl -L https://github.com/pyenv/pyenv-installer/raw/master/bin/pyenv-installer | bash
ENV PATH="$CIRCLE_HOME/.pyenv/shims:$CIRCLE_HOME/.pyenv/bin:$PATH"
Author comment:
Had to replace the .bashrc edits so they don't override my path hackery below.
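The reordering works because command lookup takes the first `PATH` entry containing the name, so the shims directory must precede everything it is meant to shadow. A self-contained sketch (throwaway directories standing in for the real pyenv layout):

```shell
# First matching PATH entry wins: a 'python' in shims shadows one in bin.
demo=$(mktemp -d)
mkdir -p "$demo/.pyenv/shims" "$demo/.pyenv/bin"
printf '#!/bin/sh\necho shim\n' > "$demo/.pyenv/shims/python"
printf '#!/bin/sh\necho real\n' > "$demo/.pyenv/bin/python"
chmod +x "$demo/.pyenv/shims/python" "$demo/.pyenv/bin/python"
PATH="$demo/.pyenv/shims:$demo/.pyenv/bin:$PATH"
command -v python   # resolves to the shims copy, not .pyenv/bin
```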


# Must install numpy 1.11 or else a bunch of tests break due to different output formatting on e.g. nparray
# A version I've tested earlier that I know it breaks with is 1.14.1
RUN mkdir -p $(pyenv root)/versions \
&& ln -s $CONDA_ROOT $(pyenv root)/versions/our-miniconda \
&& $CONDA_BIN create -y -n python2 -c anaconda -c conda-forge python==2.7.15 numpy=1.14.0 pyarrow==0.12.1 pandas nomkl \
&& $CONDA_BIN create -y -n python2 -c anaconda -c conda-forge python==2.7.18 numpy=1.14.0 pyarrow==0.12.1 pandas nomkl \
&& $CONDA_BIN create -y -n python3 -c anaconda -c conda-forge python=3.6 numpy=1.14.0 pyarrow==0.12.1 pandas nomkl \
&& $CONDA_BIN clean --all

RUN pyenv global our-miniconda/envs/python2 our-miniconda/envs/python3 \
&& pyenv rehash

# Expose pyenv globally
ENV PATH=$CIRCLE_HOME/.pyenv/shims:$PATH

RUN PYENV_VERSION=our-miniconda/envs/python2 $CIRCLE_HOME/.pyenv/shims/pip install unishark "unittest-xml-reporting<3"
RUN PYENV_VERSION=our-miniconda/envs/python3 $CIRCLE_HOME/.pyenv/shims/pip install unishark unittest-xml-reporting

# Conda 'activate' should take precedence over pyenv's (pyenv/pyenv-virtualenv#270).
# In run-pip-tests, we do 'source activate /path'. Conda's activate is fine with that, but not pyenv's.
RUN mkdir $CIRCLE_HOME/.bin
RUN ln -s $CONDA_ROOT/bin/activate $CIRCLE_HOME/.bin/activate
RUN ln -s $CONDA_ROOT/bin/deactivate $CIRCLE_HOME/.bin/deactivate
ENV PATH="$CIRCLE_HOME/.bin:$PATH"
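The precedence trick can be sketched in isolation: bash's `source` searches `PATH` for a bare filename, so a symlinked `activate` placed in an early `PATH` entry wins. The directory names below are stand-ins for the image's layout, not taken from it:

```shell
# conda's activate accepts a path argument; a pyenv-style one ignores it.
demo=$(mktemp -d)
mkdir -p "$demo/conda/bin" "$demo/pyenv/bin" "$demo/.bin"
printf 'echo "conda activate: $1"\n' > "$demo/conda/bin/activate"
printf 'echo "pyenv activate"\n' > "$demo/pyenv/bin/activate"
ln -s "$demo/conda/bin/activate" "$demo/.bin/activate"
# With .bin first on PATH, 'source activate <env>' hits the conda symlink.
out=$(bash -c "PATH=\"$demo/.bin:$demo/pyenv/bin:\$PATH\"; source activate /my/env")
echo "$out"
```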
21 changes: 11 additions & 10 deletions dev/docker-images/r/Dockerfile
@@ -15,16 +15,17 @@
# limitations under the License.
#

FROM palantirtechnologies/circle-spark-base
FROM palantirtechnologies/circle-spark-base:0.2.3

USER root

### Install R
RUN apt-get update \
&& apt-get install r-base r-base-dev qpdf \
&& rm -rf /var/lib/apt/lists/* \
&& chmod 777 /usr/local/lib/R/site-library \
&& /usr/lib/R/bin/R -e "install.packages(c('devtools'), repos='http://cran.us.r-project.org', lib='/usr/local/lib/R/site-library'); devtools::install_github('r-lib/testthat@v2.0.0', lib='/usr/local/lib/R/site-library'); install.packages(c('knitr', 'rmarkdown', 'e1071', 'survival', 'roxygen2', 'lintr'), repos='http://cran.us.r-project.org', lib='/usr/local/lib/R/site-library')"
ENV R_HOME=/usr/lib/R
# Install R and SparkR dependencies
RUN $CONDA_BIN install --yes --channel r \
r-base=3.4.3 r-knitr r-rmarkdown r-e1071 r-survival r-testthat r-lintr r-roxygen2 r-devtools

# Install qpdf (used for SparkR documentation, required in R tests)
USER root
RUN apt-get update && apt-get install qpdf
USER circleci

# Add conda-installed R to path
ENV PATH="$CONDA_ROOT/bin:$PATH"
ENV R_HOME=$CONDA_ROOT/lib/R
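A hypothetical follow-up layer (not part of this PR) could smoke-test the result: the conda-provided `R` should be first on `PATH` and should agree with the exported `R_HOME` (`R RHOME` prints the home directory R was configured with):

```dockerfile
# Hypothetical verification layer: fail the build early if conda's R is
# shadowed or R_HOME is inconsistent with what R itself reports.
RUN test "$(command -v R)" = "$CONDA_ROOT/bin/R" \
 && test "$(R RHOME)" = "$R_HOME"
```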