DIANet: Dense-and-Implicit-Attention-Network


By Zhongzhan Huang, Senwei Liang, Mingfu Liang and Haizhao Yang.

This repo is the official implementation of "DIANet: Dense-and-Implicit Attention Network" [paper] on the CIFAR-100 dataset. The paper was accepted for poster presentation at the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20); you can also check the AAAI proceedings version.

Introduction

DIANet [paper] provides a universal framework that recurrently fuses information from preceding layers to enhance attention modeling at each layer. An existing network backbone can embed the DIA unit readily by sharing a single unit across all of its layers. In this implementation, an LSTM-based DIA unit is provided.

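For intuition, here is a minimal sketch of how an LSTM-based DIA unit can be shared across a stack of layers: each layer's feature map is squeezed into per-channel statistics, fused with a hidden state carried over from the preceding layers, and turned into channel-wise attention. The class name, shapes, reduction choice, and the toy backbone blocks below are illustrative assumptions, not the repo's actual code (see the model files for that).

    import torch
    import torch.nn as nn

    class DIAUnitSketch(nn.Module):
        """Illustrative LSTM-based DIA unit; names and shapes are assumptions."""

        def __init__(self, channels, reduction=4):
            super().__init__()
            hidden = channels // reduction               # "r" in the Results table is this reduction ratio
            self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze H x W into one descriptor per channel
            self.cell = nn.LSTMCell(channels, hidden)    # recurrently fuses information across layers
            self.fc = nn.Linear(hidden, channels)
            self.gate = nn.Sigmoid()                     # channel-wise attention weights in (0, 1)

        def forward(self, x, state=None):
            b, c, _, _ = x.shape
            desc = self.pool(x).view(b, c)               # (B, C) channel statistics of the current layer
            h, cell = self.cell(desc, state)             # fuse with the state carried from preceding layers
            attn = self.gate(self.fc(h)).view(b, c, 1, 1)
            return x * attn, (h, cell)                   # recalibrated features + updated shared state

    # One unit instance is reused by every block, threading its (hidden, cell)
    # state through the stack so each layer's attention sees the layers before it.
    blocks = nn.ModuleList(nn.Conv2d(64, 64, 3, padding=1) for _ in range(3))  # stand-in backbone blocks
    dia = DIAUnitSketch(channels=64)

    x = torch.randn(2, 64, 32, 32)
    state = None
    for block in blocks:
        x, state = dia(block(x), state)
    print(x.shape)  # torch.Size([2, 64, 32, 32])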

Install

  • Install PyTorch
  • Clone the DIANet repository
    git clone https://github.com/gbup-group/DIANet.git

Usage

python run_code.py

Results

Test accuracy (%) on CIFAR-100:

Model             Original   DIANet (r=4)
ResNet164         73.43      76.67
PreResNet164      76.53      78.20
WRN52-4           79.75      80.99
ResNeXt101, 8x32  81.18      82.46

Notes:

  • The table above reports the mean performance over five runs of each experiment.
  • r denotes the reduction ratio in the DIA unit.
  • For more implementation details and parameter settings, check the code or the appendix of our paper.
  • Tested on two P100 GPUs.

Citing DIANet

@inproceedings{huang2020dianet,
  title={DIANet: Dense-and-Implicit Attention Network.},
  author={Huang, Zhongzhan and Liang, Senwei and Liang, Mingfu and Yang, Haizhao},
  booktitle={AAAI},
  pages={4206--4214},
  year={2020}
}

Acknowledgments

Many thanks to bearpaw for his simple and clean PyTorch framework for image classification.
