# Development Docker images

Deep learning development tools using Docker


## Instructions

Add these lines to your `.bashrc` or `.zshrc`; they let you run Docker containers as your own host user:

```bash
# Export the host user's UID, GID, user name, and group name so they can be passed into containers
export GID=$(id -g)
export USER=$(id -u -n)
export HOST_GROUP=$(id -g -n)
export HOST_UID=$(id -u)
```
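These variables can then be referenced wherever a container is started so that files created inside it stay owned by your host user. A minimal sketch of the idea (the image and command here are only illustrative, not part of this project):

```bash
# Run a throwaway container as the host user; `id` should print your own UID/GID
docker run --rm -it --user "${HOST_UID}:${GID}" ubuntu:22.04 id
```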

Set NVIDIA as the default container runtime for Docker in `/etc/docker/daemon.json`; this is required when starting the containers through `docker-compose`:

```json
{
    "default-runtime": "nvidia",
    "runtimes": {
        "nvidia": {
            "path": "nvidia-container-runtime",
            "runtimeArgs": []
        }
    }
}
```
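After editing `daemon.json`, restart the Docker daemon so the new default runtime takes effect, then verify it:

```bash
sudo systemctl restart docker             # reload the daemon configuration
docker info | grep -i 'default runtime'   # should report: nvidia
```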

## GUI and IDE usage in the container

To keep development portable and deployable to the cloud, it is recommended to run the IDE inside the container. The legacy approach installed the IDE into the image itself, which inflated image sizes with packages that are useless at deployment time. The current best practice is to mount your locally installed IDE into the container (specify the mount in the docker-compose file or in `docker run`) and launch it from inside the container; a sketch follows below. GUI access is granted on Intel, AMD, and NVIDIA GPUs by default. This keeps the code portable and makes collaboration across teams easier.
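A minimal sketch of the mount-and-launch idea; the IDE path, image tag, and X11 details below are illustrative assumptions, not values taken from this repository's compose files:

```bash
# Allow local X11 connections from containers (host side; may not be needed in every setup)
xhost +local:

# Mount a locally installed IDE read-only and launch it from inside the container,
# so it runs against the container's toolchain and libraries
docker run --rm -it \
    -e DISPLAY="$DISPLAY" \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v "$HOME/ide":/opt/ide:ro \
    xmindai/cuda-cpp:<tag> /opt/ide/bin/ide.sh
```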

## Available Docker images

All images are provided for CUDA 11.8.0 and 12.1.1. As a best practice, no `latest` tag is published, so always pull an explicit version tag.
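Because there is no `latest` tag, pull an explicit tag; the available tags are listed on Docker Hub (the placeholder below is not a real tag name):

```bash
docker pull xmindai/cuda-python:<tag>   # choose a tag built for CUDA 11.8.0 or 12.1.1
```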

### xmindai/cuda-cudnn-opengl

Available on Docker Hub.

Content:

- CUDA (dev)
- cuDNN (dev)
- CUDA OpenGL (dev)

### xmindai/cuda-cpp

Available on Docker Hub.

Built on `xmindai/cuda-cudnn-opengl`. Adds:

- User layer: support for logging in as an arbitrary host user
- Development layer from the `general-development` folder, supporting the complete graphical development life-cycle in a single container

### xmindai/cuda-python

Available on Docker Hub.

Built on `xmindai/cuda-cpp`.

Adds CUDA-enabled Python libraries managed with pyenv (a quick GPU check is sketched after the list):

- PyTorch (GPU)
- TensorFlow (GPU)
- CUDA RAPIDS
- Many more, defined in `cuda-python-development/requirements.txt`
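A quick GPU sanity check from inside a running `cuda-python` container, assuming `torch` and `tensorflow` are installed as listed in `cuda-python-development/requirements.txt`:

```bash
# Both commands should report at least one CUDA device
python -c "import torch; print(torch.cuda.is_available(), torch.cuda.device_count())"
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```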

## Container startup examples

Start an interactive session:

```bash
docker-compose -f xmind-development/docker-compose.yml run dev
```

Start the SSH service in the background:

```bash
docker-compose -f xmind-development/docker-compose.yml up -d ssh
```
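Once the service is running, connect with a regular SSH client; the published port depends on the compose file, so the value below is only a placeholder:

```bash
ssh -p <published-port> "$USER"@localhost
```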

Stop the SSH service:

```bash
docker-compose -f xmind-development/docker-compose.yml stop ssh
```