arch-id-model

ArchitectureID.ai’s data model development.

For the ArchitectureID.ai web application repo, see arch-id-web.

Quick Start

The recommended way to run the program is in a container built from the included ./Dockerfile, using Docker (rootless Docker is optional) with the NVIDIA Container Toolkit for GPU support. The included run script ./run.sh calls the necessary docker commands to build the image and run the program in the container.
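If you prefer to invoke Docker yourself, the script's behavior is roughly equivalent to the sketch below. This is a sketch only: the image tag, the container mount points, and the assumption that the entrypoint forwards CLI arguments are illustrative; the authoritative commands are in ./run.sh.

docker build -t arch-id-model .
docker run --rm --gpus '"device=0"' \
  -v ~/dataset:/dataset \
  -v ~/output:/output \
  arch-id-model train --dataset-dir=/dataset --output-dir=/output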

The run script ./run.sh accepts the following environment variables:

  • DATASET_DIR (required):
    Dataset root directory with the structure root_dir/category_dir/image_files (see the example layout after this list). Replaces the --dataset-dir CLI arg.
  • OUTPUT_DIR (required):
    Directory for all program output. It can be reused across multiple sessions; a new sub-directory is created for each session. Replaces the --output-dir CLI arg.
  • CUDA_VISIBLE_DEVICES (optional):
    Comma-separated list of integers giving the Bus IDs of the GPUs to expose to the container via the NVIDIA Container Toolkit. Defaults to 0.
  • TF_CPP_MIN_LOG_LEVEL (optional):
    Sets the TensorFlow C++ log level. Accepts one of 0, 1, 2, 3. Defaults to 2.
  • TF_ENABLE_AUTO_MIXED_PRECISION (optional):
    Boolean that enables mixed precision. Defaults to 1.

*Note that the run script accepts the environment variables DATASET_DIR and OUTPUT_DIR in place of the CLI args --dataset-dir and --output-dir. All other CLI options can be passed to the script just as they would be when calling the program directly from your shell.
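For illustration, a dataset directory matching the expected root_dir/category_dir/image_files structure might look like this (the category and file names here are hypothetical):

~/dataset/
├── art_deco/
│   ├── 0001.jpg
│   └── 0002.jpg
└── gothic_revival/
    ├── 0001.jpg
    └── 0002.jpg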

For example, to train on the GPU with Bus ID #2, with a minimum accuracy of 0.7, a maximum of 200 epochs, dataset directory ~/dataset, and output directory ~/output:

CUDA_VISIBLE_DEVICES=2 DATASET_DIR=~/dataset OUTPUT_DIR=~/output ./run.sh train --min-accuracy=0.7 --max-epochs=200
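To expose more than one GPU, pass a comma-separated list, e.g. devices 0 and 1. This only controls which GPUs the NVIDIA Container Toolkit makes visible inside the container:

CUDA_VISIBLE_DEVICES=0,1 DATASET_DIR=~/dataset OUTPUT_DIR=~/output ./run.sh train --min-accuracy=0.7 --max-epochs=200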

For CLI help:

DATASET_DIR=~/dataset OUTPUT_DIR=~/output ./run.sh train -h
