Code to produce the results of the arXiv preprint "Adversarial Defense of Image Classification Using a Variational Auto-Encoder".
- Python 3.6
- TensorFlow and Keras
- CleverHans
- scikit-learn
- SciPy, imageio, Matplotlib
The MNIST dataset should be downloaded by the user and stored in .mat format under the data directory.
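A minimal sketch of loading the .mat file with SciPy (already listed in the requirements). The file name and the variable names inside the .mat file ("train_x", "train_y") are assumptions; adjust them to match your download.

```python
import numpy as np
import scipy.io

def load_mnist(path="data/mnist.mat"):
    # loadmat returns a dict mapping variable names to numpy arrays.
    mat = scipy.io.loadmat(path)
    # Scale pixel values from [0, 255] to [0, 1] for the classifier/VAE.
    x_train = mat["train_x"].astype(np.float32) / 255.0
    y_train = mat["train_y"]
    return x_train, y_train
```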
The CIFAR-10 dataset can be downloaded by running the provided script.
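Once downloaded, the batches follow CIFAR-10's documented pickle format (a dict with "data" and "labels" entries, each row a 3072-vector of 1024 R, then G, then B values). A sketch of reading one batch; the exact path the script unpacks to may differ:

```python
import pickle
import numpy as np

def load_cifar_batch(path):
    # CIFAR-10 batches were pickled under Python 2, so decode keys as bytes.
    with open(path, "rb") as f:
        batch = pickle.load(f, encoding="bytes")
    # Reshape each 3072-vector into a 32x32 RGB image (channels last).
    data = batch[b"data"].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
    labels = np.array(batch[b"labels"])
    return data, labels
```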
To train the classifiers, run
python train_classifier.py
To train the VAEs, run
python train_vae.py
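At the core of VAE training is the reparameterization trick: sampling z = mu + sigma * eps with eps ~ N(0, I) so that gradients can flow through the encoder outputs, plus a KL term pulling the latent posterior toward N(0, I). A NumPy sketch of both pieces; the shapes and names are illustrative, not taken from train_vae.py:

```python
import numpy as np

def reparameterize(mu, log_var, rng=None):
    # Sample z = mu + sigma * eps, with sigma = exp(log_var / 2).
    if rng is None:
        rng = np.random.default_rng(0)
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    # KL(q(z|x) || N(0, I)) per sample, summed over latent dimensions.
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=1)
```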
To evaluate the attacks and defenses, run
python evaluate_mnist.py
and
python evaluate_cifar.py
Download the 1000-image dataset and the pretrained Inception-V3 model checkpoint from the Kaggle competition. Store the images in a directory named images and the Inception-V3 model checkpoint in a directory named inception-v3.
To train the VAE models on the images, run
python train_vae.py
To perform FGSM and I_FGSM attacks on the images, run
python attack.py
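For reference, the two attacks are one-step and iterative signed-gradient perturbations. The repo presumably implements them with CleverHans against Inception-V3; the toy sketch below only shows the update rules themselves, with the model gradient abstracted behind a caller-supplied function:

```python
import numpy as np

def fgsm(x, grad_x, eps):
    # One-step FGSM: move each pixel by eps in the sign of the loss gradient,
    # then clip back to the valid image range [0, 1].
    return np.clip(x + eps * np.sign(grad_x), 0.0, 1.0)

def i_fgsm(x, grad_fn, eps, steps=10):
    # Iterative FGSM: repeat small signed steps, projecting back into the
    # eps-ball around the original image after every step.
    alpha = eps / steps
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(grad_fn(x_adv))
        x_adv = np.clip(x_adv, x - eps, x + eps)
        x_adv = np.clip(x_adv, 0.0, 1.0)
    return x_adv
```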
The attacked images will be stored in directories with names such as fgsm_images_0.005, where 0.005 indicates the attack hyperparameter epsilon.
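A directory name like fgsm_images_0.005 can be mapped back to (attack, epsilon) or constructed from them; the exact formatting in attack.py may differ from this sketch:

```python
def output_dir(attack, eps):
    # e.g. ("fgsm", 0.005) -> "fgsm_images_0.005"
    return f"{attack}_images_{eps}"
```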
To evaluate the defense on the attacked images, run
python evaluate.py
The results will be saved into a CSV file.
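A sketch of how such a results file can be written with the standard library; the column names here are illustrative, evaluate.py defines the real ones:

```python
import csv

def save_results(rows, path="results.csv"):
    # rows: list of dicts, e.g. {"attack": "fgsm", "epsilon": 0.005, "accuracy": 0.91}
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["attack", "epsilon", "accuracy"])
        writer.writeheader()
        writer.writerows(rows)
```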