hls4ml-finn-mlperftiny/release-repo-v0.5
Tiny MLPerf™ v0.1 hls4ml Xilinx PYNQ-Z2 Open Submission

By the hls4ml team

Hardware

PYNQ-Z2

Code structure

The code is structured as follows:

hls4ml
├── code
│   ├── ad
│   │   └── AD03
│   │       ├── inference
│   │       │   ├── hls
│   │       │   ├── sdk
│   │       │   ├── sys
│   │       │   └── utils
│   │       └── training
│   │           ├── convert.py
│   │           ├── model
│   │           │   └── ad03
│   │           │       └── model_ToyCar.h5
│   │           └── train.py
│   └── ic
│       └── RN06
│           ├── inference
│           │   ├── hls
│           │   ├── sdk
│           │   ├── sys
│           │   └── utils
│           └── training
│               ├── convert.py
│               ├── resnet_v1_eembc_RN06
│               │   └── model_best.h5
│               └── train.py
├── results
│   └── pynqz2
│       ├── ad
│       │   ├── accuracy
│       │   └── performance
│       └── ic
│           ├── accuracy
│           └── performance
└── systems

  • For both the anomaly detection model (AD03) and the image classification model (RN06), there are training and inference subdirectories.
  • Under training, there are scripts to train the model with QKeras (train.py) and to convert it to a Xilinx HLS/Vivado/SDK project using hls4ml (convert.py).
  • The configuration is controlled by yml files.
  • For convenience, the pretrained models in .h5 format are provided in the repository as indicated.
  • Under inference, the Xilinx HLS, Vivado, and SDK projects are automatically created in the hls, sys, and sdk folders, respectively, after successfully running convert.py.
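Once convert.py has been run (see the conversion steps below), a quick sanity check can confirm that the three project folders were generated. This is an illustrative sketch, not a script from the repository; it uses the RN06 paths from the tree above and should be run from the repository root:

```python
from pathlib import Path

# Project folders that convert.py is expected to generate (see tree above)
base = Path("code/ic/RN06/inference")
for sub in ("hls", "sys", "sdk"):
    present = (base / sub).is_dir()
    print(f"{sub}: {'found' if present else 'missing'}")
```

The same check works for AD03 by swapping in code/ad/AD03/inference.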

Setup

  • Create the conda environment:
conda env create -f environment.yml
  • Activate the environment:
conda activate tiny-mlperf-env
  • Install the PYNQ-Z2 board files under:
<path_to_Vivado>/Vivado/2019.1/data/boards/board_files
  • Set up Vivado 2019.1:
source <path_to_Vivado>/Vivado/2019.1/settings64.sh
  • Ensure the PYNQ-Z2 board is connected by USB, powered on, and visible to the host.

Running AD03 and RN06

Training with QKeras

In this step, you will download the dataset and perform quantization-aware training with QKeras.

AD03 model

  • Change directory
cd code/ad/AD03/training/
  • Download dataset for AD03:
./get_dataset.sh
  • Train AD03 (a pretrained model is provided as model/ad03/model_ToyCar.h5):
python train.py -c AD03.yml

N.b. if you don't have a GPU, comment out the `import setGPU` line (this also applies to the later Python scripts).
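Instead of editing the scripts by hand, the optional import can be guarded so the same code runs on both GPU and CPU-only machines. A minimal sketch (setGPU is the optional helper module the scripts import; everything else here is illustrative):

```python
# Optional GPU selection: setGPU picks a free GPU when it is available.
# Guarding the import avoids having to comment it out on CPU-only machines.
try:
    import setGPU  # noqa: F401  (imported for its side effect)
    have_setgpu = True
except ImportError:
    have_setgpu = False  # setGPU not installed: fall back to the default device
```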

RN06 model

  • Change directory
cd code/ic/RN06/training/
  • Train RN06 (a pretrained model is provided as resnet_v1_eembc_RN06/model_best.h5):
python train.py -c RN06_pynqz2.yml

Conversion with hls4ml

In this step, you will take the quantization-aware-trained model from the previous step and convert it to firmware using hls4ml. The hls4ml configuration file, pynqz2.yml, specifies details such as the implementation architecture.
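The exact contents of pynqz2.yml are in the repository; as an illustration only, an hls4ml-style YAML configuration typically looks roughly like the sketch below (the key names follow hls4ml's YAML conventions, but the values here are examples, not copied from this repo):

```yaml
# Illustrative hls4ml-style configuration -- values are examples, not the repo's
KerasH5: model/ad03/model_ToyCar.h5   # trained QKeras model from train.py
OutputDir: ../inference/hls           # where the HLS project is generated
XilinxPart: xc7z020clg400-1           # Zynq-7020 device on the PYNQ-Z2
ClockPeriod: 10                       # target clock period in ns
IOType: io_stream
HLSConfig:
  Model:
    Precision: ap_fixed<16,6>
    ReuseFactor: 1
```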

AD03 model

  • Change directory
cd code/ad/AD03/training/
  • Get test data:
python generate_test_data.py -c AD03.yml
  • Convert AD03:
python convert.py -c pynqz2.yml

RN06 model

  • Change directory
cd code/ic/RN06/training/
  • Get test data:
source get_test_data.sh
  • Convert RN06:
python convert.py -c RN06_pynqz2.yml

Program FPGA and run software

  • Change directory
cd code/<task>/<model_name>/inference/sdk/ (e.g., code/ic/RN06/inference/sdk/)
  • Open Xilinx SDK GUI
make gui
  • Program the FPGA with the bit file in SDK
  • Run test harness software in SDK
  • Download the EEMBC runner GUI and the AD/IC benchmark datasets (see https://github.com/eembc/ulpmark-ml)
  • Open the EEMBC runner GUI and perform measurements, following the instructions in the EEMBC README

Boot from Flash

The PYNQ-Z2 supports Quad SPI Flash. Please follow these instructions to program and boot from the Flash memory.
