Revisiting and Exploring Efficient Fast Adversarial Training via LAW: Lipschitz Regularization and Auto Weight Averaging (TIFS 2024)

Introduction

Figure: Generation process of adversarial examples generated by FGSM in fast adversarial training (FAT) on the maximization loss function. (a) Using a zero initialization. (b) Using a sample-dependent initialization. (c) Using the proposed regularization.

In this work, we conduct a comprehensive study of more than ten fast adversarial training methods in terms of adversarial robustness and training cost. We revisit the effectiveness and efficiency of fast adversarial training techniques in preventing catastrophic overfitting from the perspective of model local nonlinearity and propose an effective Lipschitz regularization method for fast adversarial training. Furthermore, we explore the effects of data augmentation and weight averaging in fast adversarial training and propose a simple yet effective auto weight averaging method to further improve robustness. By assembling these techniques, we propose an FGSM-based fast adversarial training method equipped with Lipschitz regularization and Auto Weight averaging, abbreviated as FGSM-LAW. Experimental evaluations on four benchmark databases demonstrate the superiority of the proposed method over state-of-the-art fast adversarial training methods and advanced standard adversarial training methods.
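
To make the two main ingredients concrete, the sketch below shows, in plain PyTorch, (1) an FGSM step whose outer loss adds a Lipschitz-style regularization term that penalizes how much the logits change under the bounded perturbation, and (2) an exponential-moving-average (EMA) copy of the model weights as a stand-in for weight averaging. This is a minimal illustration under our own assumptions, not the released implementation: the function names, the squared logit-difference penalty, and the fixed `lambda_reg` and `ema_decay` values are illustrative placeholders; see `FGSM_LAW_CIFAR10.py` for the actual regularizer and the automatic weighting rule.

```python
# Minimal sketch of the two ideas, not the released FGSM-LAW implementation.
# Assumptions: the squared logit-difference penalty, lambda_reg, and ema_decay
# are illustrative placeholders; see FGSM_LAW_CIFAR10.py for the actual method.
import copy
import torch
import torch.nn.functional as F


def fgsm_law_loss(model, x, y, epsilon=8 / 255, lambda_reg=1.0):
    """FGSM adversarial loss plus a Lipschitz-style regularization term."""
    # Single-step (FGSM) inner maximization from a random start.
    delta = torch.zeros_like(x).uniform_(-epsilon, epsilon).requires_grad_(True)
    grad = torch.autograd.grad(F.cross_entropy(model(x + delta), y), delta)[0]
    delta = (delta + epsilon * grad.sign()).clamp(-epsilon, epsilon).detach()
    x_adv = (x + delta).clamp(0.0, 1.0)

    # Outer loss: adversarial cross-entropy plus a penalty on how much the
    # logits move under the bounded perturbation (keeps the model locally smooth).
    logits_clean, logits_adv = model(x), model(x_adv)
    lipschitz_penalty = (logits_adv - logits_clean).pow(2).mean()
    return F.cross_entropy(logits_adv, y) + lambda_reg * lipschitz_penalty


def make_ema(model):
    """Create a frozen copy of the model to hold the averaged weights."""
    ema_model = copy.deepcopy(model)
    for p in ema_model.parameters():
        p.requires_grad_(False)
    return ema_model


@torch.no_grad()
def update_ema(ema_model, model, ema_decay=0.999):
    """Weight averaging: keep an exponential moving average of the online weights."""
    for p_ema, p in zip(ema_model.parameters(), model.parameters()):
        p_ema.mul_(ema_decay).add_(p, alpha=1.0 - ema_decay)


# Usage sketch: ema_model = make_ema(model); after each optimizer step on
# fgsm_law_loss(...), call update_ema(ema_model, model) and evaluate the EMA model.
```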

Requirements

  • Platform: Linux
  • Hardware: an NVIDIA V100 GPU
  • Software: Python 3 with PyTorch (plus standard dependencies)

Train

```
python3 FGSM_LAW_CIFAR10.py --out_dir ./output/ --data-dir cifar-data
```

Test

```
python3.6 test_cifar10.py --model_path model.pth --out_dir ./output/ --data-dir cifar-data
```
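
For reference, robust accuracy of a saved model is typically reported against multi-step attacks such as PGD. The snippet below is a generic evaluation sketch under our own assumptions (L-infinity budget epsilon = 8/255, step size 2/255, 10 steps, and a standard CIFAR-10 test `DataLoader` named `test_loader`); it is not the internal logic of `test_cifar10.py`.

```python
# Generic robust-accuracy sketch (assumed budget: epsilon=8/255, alpha=2/255, 10 PGD steps);
# not the logic of test_cifar10.py, just an illustration of how such an evaluation works.
import torch
import torch.nn.functional as F


def pgd_attack(model, x, y, epsilon=8 / 255, alpha=2 / 255, steps=10):
    """Multi-step projected gradient descent attack within an L-infinity ball."""
    delta = torch.empty_like(x).uniform_(-epsilon, epsilon)
    for _ in range(steps):
        delta.requires_grad_(True)
        loss = F.cross_entropy(model((x + delta).clamp(0.0, 1.0)), y)
        grad = torch.autograd.grad(loss, delta)[0]
        delta = (delta.detach() + alpha * grad.sign()).clamp(-epsilon, epsilon)
    return (x + delta).clamp(0.0, 1.0)


def robust_accuracy(model, test_loader, device="cuda"):
    """Fraction of test examples still classified correctly under the PGD attack."""
    model.eval()
    correct, total = 0, 0
    for x, y in test_loader:
        x, y = x.to(device), y.to(device)
        x_adv = pgd_attack(model, x, y)
        with torch.no_grad():
            correct += (model(x_adv).argmax(dim=1) == y).sum().item()
        total += y.size(0)
    return correct / total
```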

Trained Models

The trained models will be released soon.
