docs: initial tooling for container matrices
* Use Alberto's workstation for bandwidth.
* Extract the broad swath of data from
  the NGC containers.
* List container size in GB. Yay!
* Create RST tables from the data.
* Create one page for each container.
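The "size in GB" and "RST tables from the data" steps in the commit description could be sketched as follows. This is a hypothetical illustration only, not the actual smx tooling added in this commit; the function names and the sample byte count are invented:

```python
# Hypothetical sketch of the "list container size in GB" and "create RST
# tables" steps described above; the commit's real scripts are not shown here.

def bytes_to_gb(size_bytes: int) -> str:
    """Format a container image size, e.g. a ~10.4e9-byte image as '9.72 GB'."""
    # Dividing by 1024**3 (GiB) matches the 9.72 GB figure shown in the
    # generated support-matrix table below.
    return f"{size_bytes / 1024**3:.2f} GB"

def rst_list_table_row(name: str, url: str, components: str) -> str:
    """Render one row of an RST list-table, like those in docs/source/containers.rst."""
    return f"   * - {name}\n     - {url}\n     - {components}"

if __name__ == "__main__":
    print(bytes_to_gb(10436679680))
    print(rst_list_table_row(
        "merlin-training",
        "https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-training",
        "NVTabular and HugeCTR",
    ))
```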
mikemckiernan committed Apr 3, 2022
1 parent 186b214 commit 94f230c
Showing 17 changed files with 653 additions and 14 deletions.
2 changes: 1 addition & 1 deletion docs/README.md
@@ -2,7 +2,7 @@

This folder contains the scripts necessary to build the repository
documentation. You can view the documentation at
-<https://nvidia-merlin.github.io/Merlin/main/Introduction.html>.
+<https://nvidia-merlin.github.io/Merlin/main/README.html>.

## Contributing to Docs

3 changes: 3 additions & 0 deletions docs/requirements-doc.txt
@@ -10,6 +10,9 @@ sphinx-multiversion
recommonmark>=0.6
nbsphinx>=0.6

+# smx
+mergedeep==1.3.4

# packages necessary to run tests and push PRs
# assumes requirements for nvtabular logic are already installed

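The `mergedeep` pin above pulls in a recursive dictionary merge for the docs tooling. A minimal stdlib-only sketch of the behavior that package provides (an illustration, not mergedeep's implementation, and the sample data is invented):

```python
# Minimal sketch of the deep-merge behavior supplied by the mergedeep
# package pinned above; illustration only, not the library itself.

def deep_merge(destination: dict, source: dict) -> dict:
    """Recursively merge ``source`` into ``destination`` and return it."""
    for key, value in source.items():
        if isinstance(value, dict) and isinstance(destination.get(key), dict):
            deep_merge(destination[key], value)  # merge nested mappings
        else:
            destination[key] = value  # non-dict values are replaced
    return destination

base = {"container": {"name": "merlin-training", "tags": ["22.02"]}}
override = {"container": {"tags": ["22.03"], "size_gb": 9.72}}
merged = deep_merge(base, override)
# merged keeps "name" from base and takes "tags" and "size_gb" from override
```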
46 changes: 33 additions & 13 deletions docs/source/containers.rst
@@ -4,33 +4,53 @@ Merlin Containers
 Merlin and the Merlin component libraries are available in Docker containers from the NVIDIA GPU Cloud (NGC) catalog.
 Access the catalog of containers at http://ngc.nvidia.com/catalog/containers.
 
+Training Containers
+--------------------
+
 The following table identifies the container names, catalog URL, and key Merlin components.
 
 .. list-table::
    :widths: 25 50 25
    :header-rows: 1
 
-   * - Container Name
+   * - Training Container
      - NGC Catalog URL
      - Key Merlin Components
-   * - merlin-tensorflow-inference
-     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow-inference
-     - NVTabular, TensorFlow, and Triton Inference Server
-   * - merlin-pytorch-inference
-     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch-inference
-     - NVTabular, PyTorch, and Triton Inference Server
-   * - merlin-inference
-     - https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-inference
-     - NVTabular, HugeCTR, and Triton Inference Server
    * - merlin-training
-     - https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-training
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-training
      - NVTabular and HugeCTR
    * - merlin-tensorflow-training
-     - https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-tensorflow-training
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow-training
      - NVTabular, TensorFlow, and HugeCTR TensorFlow Embedding plugin
    * - merlin-pytorch-training
-     - https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-pytorch-training
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch-training
      - NVTabular and PyTorch
 
 To use these containers, you must install the `NVIDIA Container Toolkit <https://github.com/NVIDIA/nvidia-docker>`_ to provide GPU support for Docker.
 You can use the NGC links referenced in the preceding table for more information about how to launch and run these containers.
+
+
+Inference Containers
+--------------------
+
+The following table identifies the container names, catalog URL, and key Merlin components.
+
+.. list-table::
+   :widths: 25 50 25
+   :header-rows: 1
+
+   * - Inference Container
+     - NGC Catalog URL
+     - Key Merlin Components
+   * - merlin-inference
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-inference
+     - NVTabular, HugeCTR, and Triton Inference Server
+   * - merlin-tensorflow-inference
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow-inference
+     - NVTabular, TensorFlow, and Triton Inference Server
+   * - merlin-pytorch-inference
+     - https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch-inference
+     - NVTabular, PyTorch, and Triton Inference Server
+
+To use these containers, you must install the `NVIDIA Container Toolkit <https://github.com/NVIDIA/nvidia-docker>`_ to provide GPU support for Docker.
+You can use the NGC links referenced in the preceding table for more information about how to launch and run these containers.
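As an illustration of launching one of the containers named in these tables, a ``docker run`` command can be composed from the container name and a release tag. The helper below is hypothetical: the ``nvcr.io/nvidia/merlin/`` image path and the ``22.03`` tag are assumptions, so check the NGC catalog page for each container for the exact instructions:

```python
# Hypothetical helper that composes a "docker run" command for a Merlin
# container; the nvcr.io path and default tag are assumptions, not from NGC.

def docker_run_command(container: str, tag: str = "22.03") -> str:
    image = f"nvcr.io/nvidia/merlin/{container}:{tag}"
    # --gpus all requires the NVIDIA Container Toolkit mentioned above.
    return f"docker run --gpus all --rm -it {image}"

print(docker_run_command("merlin-pytorch-training"))
# docker run --gpus all --rm -it nvcr.io/nvidia/merlin/merlin-pytorch-training:22.03
```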
92 changes: 92 additions & 0 deletions docs/source/generated/nvcr.io-nvidia-merlin-merlin-inference.rst
@@ -0,0 +1,94 @@
.. table::
   :align: left

   ============================== =================================================================================
   Container Release              Release 22.03
   ============================== =================================================================================
   **DGX**
   -----------------------------------------------------------------------------------------------------------------
   **DGX System**                 * DGX-1
                                  * DGX-2
                                  * DGX A100
                                  * DGX Station
   **Operating System**           Ubuntu 20.04.3 LTS
   **NVIDIA Certified Systems**
   -----------------------------------------------------------------------------------------------------------------
   **NVIDIA Driver**              NVIDIA Driver version 465.19.01 or later is required. However, if you're
                                  running on Data Center GPUs (formerly Tesla) such as T4, you can use any of
                                  the following NVIDIA Driver versions:

                                  * 418.40 (or later R418)
                                  * 440.33 (or later R440)
                                  * 450.51 (or later R450)
                                  * 460.27 (or later R460)

                                  **Note**: The CUDA Driver Compatibility Package does not support all drivers.
   **GPU Model**                  * `NVIDIA Ampere GPU Architecture <https://www.nvidia.com/en-us/geforce/turing>`_
                                  * `Turing <https://www.nvidia.com/en-us/geforce/turing/>`_
                                  * `Volta <https://www.nvidia.com/en-us/data-center/volta-gpu-architecture/>`_
                                  * `Pascal <https://www.nvidia.com/en-us/data-center/pascal-gpu-architecture/>`_
   **Base Container Image**
   -----------------------------------------------------------------------------------------------------------------
   **Container Operating System** Ubuntu 20.04.3 LTS
   **Base Container**             Triton version 22.02
   **CUDA**                       11.6.0.021
   **RMM**                        21.12.00
   **cuDF**                       21.12.02
   **cuDNN**                      8.3.2.44+cuda11.5
   **Merlin Core**                v0.1.1+3.gee1d59d
   **Merlin Models**              Not applicable
   **Merlin Systems**             Not applicable
   **NVTabular**                  0.11.0
   **Transformers4Rec**           0.1.6
   **HugeCTR**                    Not applicable
   **SM**                         Not applicable
   **PyTorch**                    Not applicable
   **Triton Inference Server**    2.19.0
   **Size**                       9.72 GB
   ============================== =================================================================================
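A generated support-matrix page like the one above could be emitted by a small fixed-width formatter. The sketch below is hypothetical (invented function name, simplified to single-line cells) and is not the generator this commit adds:

```python
# Hypothetical sketch of emitting an RST simple table with the same
# fixed-width borders as the generated page above; not the actual generator.

def rst_simple_table(rows: list[tuple[str, str]], w1: int = 30, w2: int = 81) -> str:
    """Render label/value rows as an RST simple table; the first row is the header."""
    border = "=" * w1 + " " + "=" * w2
    lines = [border]
    for label, value in rows:
        lines.append(f"{label:<{w1}} {value}")  # pad the label column to w1
    lines.insert(2, border)  # close the header row after the first data row
    lines.append(border)
    return "\n".join(lines)

print(rst_simple_table([
    ("Container Release", "Release 22.03"),
    ("**Size**", "9.72 GB"),
]))
```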
