
GFBS: Exploring Gradient Flow Based Saliency for DNN Model Compression [ACM MM'21]

Exploring Gradient Flow Based Saliency for DNN Model Compression
Xinyu Liu, Baopu Li, Zhen Chen, Yixuan Yuan
The Chinese University of Hong Kong, Oracle Cloud Infrastructure (OCI), Centre for Artificial Intelligence and Robotics (CAIR)

We propose GFBS, a structured pruning method for deep convolutional neural networks. It analyzes each channel's influence via a Taylor expansion and integrates the effects of the BN layer and the ReLU activation function, so channel importance can be evaluated with a single batch of forward and backward propagation.
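As a rough illustration of the single-batch idea (not the exact GFBS criterion, which also models the BN and ReLU effects as derived in the paper), a first-order Taylor saliency can be read off the BatchNorm scaling factors after one backward pass. The function below is a minimal sketch:

import torch.nn as nn

def bn_taylor_saliency(model, images, labels):
    # One forward and one backward pass on a single batch.
    criterion = nn.CrossEntropyLoss()
    model.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    scores = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            # |gamma * dL/dgamma| per channel as a first-order Taylor
            # importance proxy; low-scoring channels are pruning candidates.
            scores[name] = (m.weight * m.weight.grad).abs().detach()
    return scores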

Get Started

Install requirements

Run the following command to install the dependencies:

pip install -r requirements.txt

Data preparation

For CIFAR-10, the data will be downloaded automatically when pruning.

For ImageNet, we need to prepare the dataset from http://www.image-net.org/.

  • ImageNet-1k

ImageNet-1k contains 1.28M images for training and 50K images for validation. The images should be stored as individual files:

ImageNet/
├── train
│   ├── n01440764
│   │   ├── n01440764_10026.JPEG
│   │   ├── n01440764_10027.JPEG
...
├── val
│   ├── n01440764
│   │   ├── ILSVRC2012_val_00000293.JPEG
...
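With this layout in place, a quick sanity check is possible using torchvision's standard ImageFolder (a sketch only; the repo's own data loaders may differ):

from torchvision import datasets, transforms

tf = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
# Class subfolders (n01440764, ...) become labels automatically.
train_set = datasets.ImageFolder('ImageNet/train', transform=tf)
val_set = datasets.ImageFolder('ImageNet/val', transform=tf)
print(len(train_set), len(val_set))  # expect roughly 1.28M / 50K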

Training Baseline Models

Before pruning, we need to prepare the unpruned baseline models.

For CIFAR-10:

Run the following command to train a VGG-16BN model for 160 epochs:

python train.py --net gatevgg16

For ImageNet:

We use the torchvision ResNet-50 model instead of training it from scratch. Run the following command to download and convert it for pruning:

python prepare_torch_imagenet_models.py --data_dir <path-to-imagenet>
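We have not reproduced the script's internals here, but conceptually the conversion step amounts to fetching the pretrained torchvision checkpoint and saving it in a form the pruning code can load. A hypothetical sketch (the output filename is illustrative):

import torch
from torchvision.models import resnet50

# Downloads the pretrained torchvision ResNet-50 checkpoint.
model = resnet50(pretrained=True)
# Illustrative output path; the real script may also remap state_dict keys.
torch.save(model.state_dict(), 'resnet50_baseline.pth')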

Pruning with GFBS

For CIFAR-10:

python gfbs_cifar.py --net gatevgg16 --p <channel-pruning-ratio>

This will fine-tune the pruned model for 160 epochs. You can also add --smooth to fine-tune for 30 epochs after pruning each layer. Set the desired channel pruning ratio with --p.
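For example, an illustrative run that prunes half of the channels (the 0.5 ratio is only an example value) with layer-wise smooth fine-tuning:

python gfbs_cifar.py --net gatevgg16 --p 0.5 --smooth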

For ImageNet:

python gfbs_imagenet.py --net resnet50 --data_dir <path-to-imagenet> --p <channel-pruning-ratio> --gpu 0,1,2,3,4,5,6,7

This will fine-tune the pruned model for 120 epochs. Set the desired channel pruning ratio with --p and the GPUs to use with --gpu.

Citation

If you find our project helpful, please feel free to leave a star and cite our paper:

@inproceedings{liu2021exploring,
  title={Exploring gradient flow based saliency for dnn model compression},
  author={Liu, Xinyu and Li, Baopu and Chen, Zhen and Yuan, Yixuan},
  booktitle={Proceedings of the 29th ACM International Conference on Multimedia},
  pages={3238--3246},
  year={2021}
}
