Bit Plane Feature Consistency Regularizer

This repository contains the implementation of our paper "Towards Achieving Adversarial Robustness by Enforcing Feature Consistency Across Bit Planes", accepted at CVPR 2020. Our paper is available on arXiv here.

Our proposed Bit Plane Feature Consistency (BPFC) regularizer achieves adversarial robustness without using adversarial samples during training. Prior works that attempted this have since been broken by adaptive attacks.
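As a rough illustration, the sketch below shows one way such a bit-plane consistency loss could be written in PyTorch: the clean image and a coarsely quantized, noise-perturbed version of it (retaining only the higher bit planes) are both passed through the network, and the distance between their pre-softmax outputs is penalized alongside the usual cross-entropy loss. The function name bpfc_loss, the arguments k and lambda_reg, and the exact quantization and noise scheme are illustrative assumptions; please refer to the repository code for the precise formulation used in the paper.

    import torch
    import torch.nn.functional as F

    def bpfc_loss(model, x, y, k=4, lambda_reg=1.0):
        # Illustrative sketch (not the repository's exact code):
        # cross-entropy on the clean image, plus a penalty on the distance
        # between the network's pre-softmax outputs for the clean image and
        # a coarse, noise-perturbed version that keeps only the higher bit planes.
        # x is assumed to lie in [0, 1]; k is the number of low bit planes dropped.
        step = 2 ** k                                     # quantization step on the 0-255 scale
        x_int = torch.floor(x * 255.0)                    # back to 8-bit integer scale
        x_coarse = torch.floor(x_int / step) * step       # keep only the high bit planes
        noise = (torch.rand_like(x) - 0.5) * step         # uniform noise within the dropped range
        x_quant = torch.clamp(x_coarse + noise, 0.0, 255.0) / 255.0

        logits_clean = model(x)
        logits_quant = model(x_quant)

        ce = F.cross_entropy(logits_clean, y)
        consistency = ((logits_clean - logits_quant) ** 2).sum(dim=1).mean()
        return ce + lambda_reg * consistency

In a standard training loop this loss would replace the plain cross-entropy term; the regularization weight and the number of dropped bit planes are hyperparameters.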

Robustness of BPFC Trained Models

Clean accuracy and accuracy under a 1000-step PGD attack for BPFC-trained models and PGD adversarially trained models across different datasets:

          CIFAR-10             F-MNIST              MNIST
Method    Clean    PGD-1000    Clean    PGD-1000    Clean    PGD-1000
BPFC      82.4%    34.4%       87.2%    67.7%       99.1%    85.7%
PGD       82.7%    47.0%       87.5%    79.1%       99.3%    94.1%
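For reference, a minimal sketch of the L-infinity PGD attack used in this kind of evaluation is given below. The perturbation bound, step size, and random start shown here are illustrative defaults (eps = 8/255 and alpha = 2/255 are common choices for CIFAR-10) and are not taken from this repository's evaluation scripts.

    import torch
    import torch.nn.functional as F

    def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=1000):
        # Illustrative multi-step L-inf PGD attack with a random start.
        x_adv = x + torch.empty_like(x).uniform_(-eps, eps)   # random initialization in the eps-ball
        x_adv = torch.clamp(x_adv, 0.0, 1.0).detach()
        for _ in range(steps):
            x_adv.requires_grad_(True)
            loss = F.cross_entropy(model(x_adv), y)
            grad = torch.autograd.grad(loss, x_adv)[0]
            with torch.no_grad():
                x_adv = x_adv + alpha * grad.sign()                    # ascend the loss
                x_adv = torch.min(torch.max(x_adv, x - eps), x + eps)  # project back into the eps-ball
                x_adv = torch.clamp(x_adv, 0.0, 1.0)                   # keep a valid image
            x_adv = x_adv.detach()
        return x_adv

Robust accuracy is then the fraction of test samples still classified correctly on the returned adversarial examples.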

Environment Settings

  • Python: 2.7.15
  • PyTorch: 1.0.0
  • TorchVision: 0.2.2.post3
  • Numpy: 1.17.2
