Switch DLIO Profiler to DFTracer. #208

Merged · 22 commits · Jul 19, 2024

Commits (22)
6e0af63  Switch DLIO Profiler to DFTracer. (hariharan-devarajan, Jun 18, 2024)
e7fce83  Github workflow fixed. (izzet, Jun 24, 2024)
cee5005  DFTracer pip package address fixed. (izzet, Jun 24, 2024)
28a388a  Merge pull request #209 from izzet/dependency/dftracer (hariharan-devarajan, Jun 24, 2024)
90398aa  switch mpi to openmpi (hariharan-devarajan, Jun 25, 2024)
84ef9f6  added build debug so that symbols are shown. (hariharan-devarajan, Jun 25, 2024)
6332925  added DEBUG logging for DFTRACER (hariharan-devarajan, Jun 25, 2024)
65947f0  removed explicit install of DFTracer as it is done during pip. (hariharan-devarajan, Jun 25, 2024)
02c2e3c  added clanup code. (hariharan-devarajan, Jun 25, 2024)
274d565  ci cleanup (hariharan-devarajan, Jun 25, 2024)
27b4a30  Updated for DFTracer changes (hariharan-devarajan, Jun 26, 2024)
9c27fb4  switched to 1.0.1 (hariharan-devarajan, Jun 26, 2024)
451db23  Switch to release 1.0.2 (hariharan-devarajan, Jun 26, 2024)
fab201e  Package info updated. (#210) (izzet, Jul 9, 2024)
6267d59  Update dali_tfrecord_reader.py to import PerfTrace and Profile from u… (zhenghh04, Jul 9, 2024)
f5a4425  Update dali_npy_reader.py (zhenghh04, Jul 9, 2024)
1b1bac6  Update dali_image_reader.py (zhenghh04, Jul 9, 2024)
37e3e77  Update custom_npz_reader.py (zhenghh04, Jul 9, 2024)
8d4f473  Update custom_torch_data_loader.py [ci-skip] (zhenghh04, Jul 9, 2024)
10ffaa7  Update pytorch_checkpointing.py [ci-skip] (zhenghh04, Jul 9, 2024)
7915a08  Update native_dali_data_loader.py [ci-skip] (zhenghh04, Jul 9, 2024)
f459b17  Publish on PyPI (#211) (izzet, Jul 17, 2024)
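
Commits 6267d59 through 7915a08 update the DALI and custom readers, the native DALI data loader, and the PyTorch checkpointing module so that PerfTrace and Profile are imported from DLIO's shared utility module rather than from the old dlio_profiler package. The sketch below illustrates that pattern; the module path, label, and decorator usage are assumptions inferred from the truncated commit messages, not an exact copy of the changed files.

# Sketch only: the import path is inferred from the truncated commit message
# ("import PerfTrace and Profile from u...") and is not verified against the diff.
from dlio_benchmark.utils.utility import Profile, PerfTrace

# Readers typically create one Profile handle per module and decorate their
# hot methods so DFTracer records a trace event per call; the label and the
# decorator name below are illustrative assumptions.
dlp = Profile("data_reader")

class ExampleNpzReader:
    @dlp.log
    def read_sample(self, filename):
        # reader-specific I/O would go here
        ...
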
53 changes: 53 additions & 0 deletions .github/workflows/cd.yml
@@ -0,0 +1,53 @@
name: Release

on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  release-build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"

      - name: Build release distributions
        run: |
          # NOTE: put your own distribution build steps here.
          python -m pip install build
          python -m build

      - name: Upload distributions
        uses: actions/upload-artifact@v4
        with:
          name: release-dists
          path: dist/

  pypi-publish:
    runs-on: ubuntu-latest

    needs:
      - release-build

    permissions:
      id-token: write

    steps:
      - name: Retrieve release distributions
        uses: actions/download-artifact@v4
        with:
          name: release-dists
          path: dist/

      - name: Publish release distributions to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          user: __token__
          password: ${{ secrets.PYPI_DLIO_TOKEN }}
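
The release-build job above uploads whatever python -m build places in dist/, and the pypi-publish job pushes it to PyPI whenever a GitHub release is published. A quick post-release smoke test, assuming the project is published under the name dlio_benchmark (as the PYPI_DLIO_TOKEN secret suggests), could be:

# Hypothetical smoke test after `pip install dlio_benchmark`: confirm the
# freshly published version resolves from the installed package metadata.
import importlib.metadata

print("dlio_benchmark version:", importlib.metadata.version("dlio_benchmark"))
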
241 changes: 241 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,241 @@
name: Build and Test

on:
  pull_request:
    branches: [main, dev]
  push:

jobs:
  build-and-test:
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-22.04]
        gcc: [10]
        python: ["3.9", "3.10", "3.11"]
        venv: ["via-setup", "via-reqs"]
    name: ${{ matrix.os }}-${{ matrix.gcc }}-${{ matrix.python }}-${{ matrix.venv }}
    runs-on: ${{ matrix.os }}
    env:
      CC: gcc-${{ matrix.gcc }}
      CXX: g++-${{ matrix.gcc }}
      DFTRACER_BUILD_TYPE: "Debug"
      DFTRACER_ENABLE: 1
      DFTRACER_LOG_LEVEL: "DEBUG"
      DLIO_EXEC: ${{ matrix.venv == 'via-setup' && 'dlio_benchmark' || 'python dlio_benchmark/main.py' }}
      GOTCHA_DEBUG: 3
      OMPI_ALLOW_RUN_AS_ROOT: 1
      OMPI_ALLOW_RUN_AS_ROOT_CONFIRM: 1
      PYTHON_VER: ${{ matrix.python }}
      RDMAV_FORK_SAFE: "1"
      VENV_PATH: "/home/runner/work/.venv/${{ matrix.venv }}"
    steps:
      - name: Clear disc
        run: |
          sudo rm -rf /usr/share/dotnet
          sudo rm -rf /opt/ghc
          sudo rm -rf "/usr/local/share/boost"
          sudo rm -rf "$AGENT_TOOLSDIRECTORY"
      - name: Push checkout
        if: github.event_name == 'push'
        uses: actions/checkout@v3
      - name: PR checkout
        if: github.event_name == 'pull_request'
        uses: actions/checkout@v3
        with:
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Set up Python ${{ matrix.python }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python }}
      - name: Add current directory to PYTHONPATH
        if: matrix.venv == 'via-reqs'
        run: echo "PYTHONPATH=$(pwd):$PYTHONPATH" >> $GITHUB_ENV
      - name: Cache install modules
        id: cache-modules
        uses: actions/cache@v3
        with:
          path: ${{ env.VENV_PATH }}
          key: ${{ matrix.venv }}-gcc${{ matrix.gcc }}-python${{ matrix.python }}-${{ hashFiles('requirements.txt', 'setup.py') }}
      - name: Install system dependencies
        run: |
          sudo apt update
          sudo apt-get install -y $CC $CXX libc6 git
          sudo apt-get install -y openmpi-bin openmpi-common libopenmpi-dev python3-dev
      - name: Install DLIO via setup.py
        if: matrix.venv == 'via-setup' && steps.cache-modules.outputs.cache-hit != 'true'
        run: |
          echo "venv: ${VENV_PATH} - gcc: $CC"
          python -m venv ${VENV_PATH}
          source ${VENV_PATH}/bin/activate
          pip install --upgrade pip
          pip install .[test]
      - name: Install DLIO via requirements.txt
        if: matrix.venv == 'via-reqs' && steps.cache-modules.outputs.cache-hit != 'true'
        run: |
          echo "venv: ${VENV_PATH} - gcc: $CC"
          python -m venv ${VENV_PATH}
          source ${VENV_PATH}/bin/activate
          pip install --upgrade pip
          pip install -r requirements.txt
      - name: test_gen_data
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_gen_data[png-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[npz-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[jpeg-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[tfrecord-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[hdf5-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[indexed_binary-tensorflow] -v
          mpirun -np 2 pytest -k test_gen_data[mmap_indexed_binary-tensorflow] -v
          rm -rf data
      - name: test_custom_storage_root_gen_data
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_storage_root_gen_data[png-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[npz-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[jpeg-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[tfrecord-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[hdf5-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[indexed_binary-tensorflow] -v
          mpirun -np 2 pytest -k test_storage_root_gen_data[mmap_indexed_binary-tensorflow] -v
          rm -rf data
      - name: test_train
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_train[png-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[npz-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[jpeg-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[tfrecord-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[hdf5-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[csv-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[png-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[npz-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[jpeg-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[hdf5-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[csv-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[png-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[npz-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[jpeg-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[hdf5-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[csv-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[png-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[npz-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[jpeg-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[hdf5-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[csv-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[indexed_binary-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[indexed_binary-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[indexed_binary-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[indexed_binary-pytorch-dali] -v
          mpirun -np 2 pytest -k test_train[mmap_indexed_binary-tensorflow-tensorflow] -v
          mpirun -np 2 pytest -k test_train[mmap_indexed_binary-pytorch-pytorch] -v
          mpirun -np 2 pytest -k test_train[mmap_indexed_binary-tensorflow-dali] -v
          mpirun -np 2 pytest -k test_train[mmap_indexed_binary-pytorch-dali] -v
          rm -rf data
      - name: test_custom_storage_root_train
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_custom_storage_root_train[png-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[npz-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[jpeg-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[tfrecord-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[hdf5-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[csv-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[png-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[npz-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[jpeg-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[hdf5-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[csv-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[indexed_binary-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[indexed_binary-pytorch] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[mmap_indexed_binary-tensorflow] -v
          mpirun -np 2 pytest -k test_custom_storage_root_train[mmap_indexed_binary-pytorch] -v
          rm -rf data
      - name: test_checkpoint_epoch
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_checkpoint_epoch[tensorflow-1024-optimizers0-2-layer_params0-all_ranks] -v
          mpirun -np 2 pytest -k test_checkpoint_epoch[pytorch-1024-optimizers1-2-layer_params1-all_ranks] -v
          mpirun -np 2 pytest -k test_checkpoint_epoch[tensorflow-1024-optimizers2-2-layer_params2-rank_zero] -v
          mpirun -np 2 pytest -k test_checkpoint_epoch[pytorch-1024-optimizers3-2-layer_params3-rank_zero] -v
          mpirun -np 2 pytest -k test_checkpoint_epoch[tensorflow-1024-optimizers4-1-layer_params4-all_ranks] -v
          mpirun -np 2 pytest -k test_checkpoint_epoch[pytorch-1024-optimizers5-1-layer_params5-all_ranks] -v
          rm -rf data
      - name: test_checkpoint_step
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_checkpoint_step -v
      - name: test_eval
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_eval -v
      - name: test_multi_threads
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_multi_threads[tensorflow-0] -v
          mpirun -np 2 pytest -k test_multi_threads[tensorflow-1] -v
          mpirun -np 2 pytest -k test_multi_threads[tensorflow-2] -v
          mpirun -np 2 pytest -k test_multi_threads[pytorch-0] -v
          mpirun -np 2 pytest -k test_multi_threads[pytorch-1] -v
          mpirun -np 2 pytest -k test_multi_threads[pytorch-2] -v
          rm -rf data
      - name: test-pytorch-multiprocessing-context
        run: |
          source ${VENV_PATH}/bin/activate
          mpirun -np 2 pytest -k test_pytorch_multiprocessing_context[0-None] -v
          mpirun -np 2 pytest -k test_pytorch_multiprocessing_context[1-fork] -v
          mpirun -np 2 pytest -k test_pytorch_multiprocessing_context[2-forkserver] -v
          mpirun -np 2 pytest -k test_pytorch_multiprocessing_context[2-spawn] -v
          rm -rf data
      - name: test_subset
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 pytest -k test_subset -v
          rm -rf data
      - name: test-tf-loader-tfrecord
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=resnet50_tf ++workload.dataset.num_files_train=64 ++workload.workflow.train=False ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=4 ++workload.dataset.num_samples_per_file=16
          mpirun -np 2 ${DLIO_EXEC} workload=resnet50_tf ++workload.dataset.num_files_train=64 ++workload.workflow.train=True ++workload.workflow.generate_data=False ++workload.dataset.num_files_train=4 ++workload.dataset.num_samples_per_file=16 ++workload.train.computation_time=0.01 ++workload.train.epochs=1
          rm -rf data
      - name: test-torch-loader-npz
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_a100 ++workload.train.computation_time=0.05 ++workload.evaluation.eval_time=0.01 ++workload.workflow.train=False ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=8 ++workload.dataset.num_files_eval=8 ++workload.reader.read_threads=2 ++workload.dataset.record_length=4096 ++workload.dataset.record_length_stdev=0
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_a100 ++workload.train.computation_time=0.05 ++workload.evaluation.eval_time=0.01 ++workload.train.epochs=1 ++workload.workflow.train=True ++workload.workflow.generate_data=False ++workload.dataset.num_files_train=8 ++workload.dataset.num_files_eval=8 ++workload.reader.read_threads=0 ++workload.dataset.record_length=4096 ++workload.dataset.record_length_stdev=0
          rm -rf data
      - name: test-tf-loader-npz
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_a100 ++workload.framework=tensorflow ++workload.data_reader.data_loader=tensorflow ++workload.train.computation_time=0.05 ++workload.evaluation.eval_time=0.01 ++workload.train.epochs=2 ++workload.workflow.train=False ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=16 ++workload.dataset.num_files_eval=16 ++workload.reader.read_threads=2 ++workload.dataset.record_length=4096 ++workload.dataset.record_length_stdev=0
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_a100 ++workload.framework=tensorflow ++workload.data_reader.data_loader=tensorflow ++workload.train.computation_time=0.05 ++workload.evaluation.eval_time=0.01 ++workload.train.epochs=2 ++workload.workflow.train=True ++workload.workflow.generate_data=False ++workload.dataset.num_files_train=16 ++workload.dataset.num_files_eval=16 ++workload.reader.read_threads=2 ++workload.dataset.record_length=4096 ++workload.dataset.record_length_stdev=0
          rm -rf data
      - name: test_unet3d
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_a100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=42
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=42
          mpirun -np 2 ${DLIO_EXEC} workload=unet3d_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=42 ++workload.dataset.format=synthetic
          rm -rf data
      - name: test_resnet50
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=resnet50_a100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=4
          mpirun -np 2 ${DLIO_EXEC} workload=resnet50_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=4
          mpirun -np 2 ${DLIO_EXEC} workload=resnet50_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=4 ++workload.dataset.format=synthetic
          rm -rf data
      - name: test_cosmoflow
        run: |
          source ${VENV_PATH}/bin/activate
          rm -rf output data checkpoints
          mpirun -np 2 ${DLIO_EXEC} workload=cosmoflow_a100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=16
          mpirun -np 2 ${DLIO_EXEC} workload=cosmoflow_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=16
          mpirun -np 2 ${DLIO_EXEC} workload=cosmoflow_h100 ++workload.workflow.generate_data=True ++workload.dataset.num_files_train=16 ++workload.dataset.format=synthetic
          rm -rf data
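
The env block at the top of this workflow is what wires DFTracer into every test step (DFTRACER_ENABLE, DFTRACER_LOG_LEVEL, GOTCHA_DEBUG, and so on). To approximate a CI run locally, one option is to export the same variables before invoking a test; the snippet below simply mirrors those settings and assumes DFTracer honors them the same way outside GitHub Actions.

# Mirror the CI environment before launching one of the test selections; the
# variable names are copied from the workflow above, and their semantics
# belong to DFTracer and GOTCHA rather than being redefined here.
import os
import subprocess

env = dict(os.environ,
           DFTRACER_ENABLE="1",
           DFTRACER_LOG_LEVEL="DEBUG",
           GOTCHA_DEBUG="3",
           RDMAV_FORK_SAFE="1")

# Assumes an activated virtual environment with dlio_benchmark, mpi4py, and
# pytest installed, matching the CI setup steps.
subprocess.run(["mpirun", "-np", "2", "pytest", "-k",
                "test_gen_data[npz-tensorflow]", "-v"],
               env=env, check=True)
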
4 changes: 2 additions & 2 deletions .github/workflows/jekyll-gh-pages.yml
@@ -1,5 +1,5 @@
# Sample workflow for building and deploying a Jekyll site to GitHub Pages
name: Deploy Jekyll with GitHub Pages dependencies preinstalled
name: Deploy Documentation

on:
# Runs on pushes targeting the default branch
@@ -51,5 +51,5 @@ jobs:
- name: Deploy to GitHub Pages
id: deployment
uses: actions/deploy-pages@v1
with:
with:
folder: _build/html/