AlexNet Attention

An implementation of attention layers added to the AlexNet model, for the course ELL881 Fundamentals of Deep Learning.

Attention Model Architecture

Attention layers were added after the 4th and 5th convolutional layers of the AlexNet model. The compatibility function can be either the ‘dot product’ (dp) or the ‘parameterised compatibility’ (pc) function, as described in the paper ‘Learn to Pay Attention’ (link). Similarly, the final classification can be done either by concatenating (concat) the attention-weighted descriptors or by averaging the classification scores from the different levels (indep).
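The attention step described above can be sketched as follows. This is a minimal PyTorch illustration, not the repository's exact code: the class name, shapes, and layer choices are assumptions, but the two compatibility functions follow the formulation in ‘Learn to Pay Attention’ (dot product: c_i = ⟨l_i, g⟩; parameterised compatibility: c_i = ⟨u, l_i + g⟩, with u learnable).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionBlock(nn.Module):
    """One attention layer in the style of 'Learn to Pay Attention'.

    Scores each local feature l_i (one spatial position of a conv feature
    map) against the global feature g, softmax-normalises the scores, and
    returns the attention-weighted descriptor. `mode` selects the
    compatibility function:
      'dp' - dot product:              c_i = <l_i, g>
      'pc' - parameterised (learnt u): c_i = <u, l_i + g>
    """
    def __init__(self, channels, mode="pc"):
        super().__init__()
        self.mode = mode
        if mode == "pc":
            # learnable vector u, realised as a 1x1 conv giving one score per position
            self.u = nn.Conv2d(channels, 1, kernel_size=1, bias=False)

    def forward(self, local_feat, global_feat):
        # local_feat:  (B, C, H, W)  feature map, e.g. after conv4 or conv5
        # global_feat: (B, C)        global feature from the network trunk
        B, C, H, W = local_feat.shape
        g = global_feat.view(B, C, 1, 1)
        if self.mode == "dp":
            scores = (local_feat * g).sum(dim=1, keepdim=True)   # (B, 1, H, W)
        else:
            scores = self.u(local_feat + g)                      # (B, 1, H, W)
        # softmax over all spatial positions -> attention map summing to 1
        attn = F.softmax(scores.view(B, -1), dim=1).view(B, 1, H, W)
        # attention-weighted descriptor: g_a = sum_i a_i * l_i
        descriptor = (attn * local_feat).sum(dim=(2, 3))         # (B, C)
        return descriptor, attn
```

For the concat variant, the descriptors from the two attended layers would be concatenated before the final classifier; for indep, each descriptor would be classified separately and the scores averaged.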

Dataset

The models were trained on a subset of the CIFAR-10 image dataset: 1000 images were randomly selected, with an equal number of samples from each category. The validation set contained 300 images and the test set 500.
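The class-balanced subsampling described above can be sketched with a small helper. The function name and seed handling are illustrative assumptions; only the idea (equal random samples per class, as for the 1000-image training subset) comes from the text.

```python
import numpy as np

def balanced_subset(labels, per_class, seed=0):
    """Return indices selecting `per_class` random samples from each class.

    `labels` is a 1-D array of integer class labels (e.g. CIFAR-10 targets).
    For the 1000-image CIFAR-10 training subset this would be called with
    per_class=100; the helper itself is a sketch, not the repository's code.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    chosen = []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)        # positions of class c
        chosen.append(rng.choice(idx, size=per_class, replace=False))
    return np.concatenate(chosen)
```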

Attention Map Visualisation

A side-by-side comparison of the original images and their attention heat maps at different layers, for each attention model trained, is shown below. Further examples are available in the ‘images’ folder.

[Images 1–4: original images alongside their layer-wise attention heat maps for each trained model]
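To produce such a heat map, the low-resolution attention map from an attention layer is upsampled to the image size and normalised before being overlaid on the image. A minimal sketch of that step, using nearest-neighbour upsampling; the repository may instead use bilinear interpolation or a matplotlib colour map:

```python
import numpy as np

def attention_heatmap(attn, image_hw):
    """Upsample a (h, w) attention map to the image size and scale to [0, 1].

    Nearest-neighbour upsampling via Kronecker expansion; assumes the image
    dimensions are integer multiples of the attention map's (true for e.g.
    a 32x32 CIFAR-10 image and a conv feature map). The normalised result
    can then be rendered as a heat map over the original image.
    """
    H, W = image_hw
    h, w = attn.shape
    up = np.kron(attn, np.ones((H // h, W // w)))  # repeat each cell
    up = up - up.min()
    rng = up.max()
    return up / rng if rng > 0 else up
```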
