Approximate Manifold Defense Against Multiple Adversarial Perturbations (updated version)

This version removes the numba dependency for training the RBF layer. The older version of the code can be found [here].

A shorter version of this paper was accepted at the NeurIPS 2019 Workshop on Machine Learning with Guarantees [pdf], [poster], and the full version is accepted at IJCNN-2020 [Arxiv Link]. A video presentation of our paper is available at this YouTube link.

Step 1: Training of the RBF layer

Train the RBF layer using rbf_training.py. Dependencies: Keras + TensorFlow

This allows training the RBF layer with TensorFlow on a GPU. However, the required number of RBF filters must now be specified as a hyper-parameter. Follow the instructions provided inline.
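rbf_training.py is the authoritative implementation; as a rough illustration of what an RBF filter computes (a Gaussian response to the distance between an input patch and a learned center), here is a minimal NumPy sketch. The `gamma` value, the toy centers, and the patch dimensionality below are hypothetical, not taken from the repository.

```python
import numpy as np

def rbf_responses(x, centers, gamma=0.5):
    """Compute RBF filter activations exp(-gamma * ||x - c||^2)
    for a batch of flattened input patches x against filter centers."""
    # x: (batch, d), centers: (num_filters, d)
    sq_dists = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Toy usage: 2 inputs, 3 RBF filters over a 4-dimensional patch space.
x = np.array([[0.0, 0.0, 0.0, 0.0],
              [1.0, 1.0, 1.0, 1.0]])
centers = np.array([[0.0, 0.0, 0.0, 0.0],
                    [1.0, 1.0, 1.0, 1.0],
                    [0.5, 0.5, 0.5, 0.5]])
r = rbf_responses(x, centers)
print(r.shape)  # (2, 3): one response per (input, filter) pair
```

An input that coincides with a center produces the maximal response of 1.0, and the response decays toward 0 as the distance grows; the number of centers is the "number of RBF filters" hyper-parameter mentioned above.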

Step 2: Training of the CNN network in the presence of the trained RBF layer

Similar to the previous version.

Execute train_rCNN.py to train the rCNN model.

Execute train_rCNN+.py to train the rCNN+ model. Note that it requires a single set of adversarial images corresponding to the training images. We recommend first running train_rCNN.py for 50 epochs, then applying a PGD attack to create the adversarial training images (at l_inf = 0.05).
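The PGD step above refers to the standard projected-gradient-descent attack. As a generic NumPy sketch (not the repository's attack code), the l_inf-bounded update looks like the following; the step size, step count, and the toy gradient function are hypothetical stand-ins, and real use would compute the loss gradient through the trained rCNN model.

```python
import numpy as np

def pgd_linf(x, grad_fn, eps=0.05, alpha=0.01, steps=10):
    """Generate an l_inf-bounded adversarial example: repeatedly step
    in the sign of the loss gradient, then project back into the
    eps-ball around the clean input and the valid pixel range."""
    x_adv = x.copy()
    for _ in range(steps):
        g = grad_fn(x_adv)                        # loss gradient w.r.t. the input
        x_adv = x_adv + alpha * np.sign(g)        # gradient-ascent step
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project into the l_inf ball
        x_adv = np.clip(x_adv, 0.0, 1.0)          # keep a valid pixel range
    return x_adv

# Toy gradient: a fixed ascent direction standing in for a model's gradient.
direction = np.array([1.0, -1.0, 1.0])
x = np.array([0.5, 0.5, 0.5])
x_adv = pgd_linf(x, lambda z: direction, eps=0.05)
print(np.max(np.abs(x_adv - x)))  # never exceeds eps = 0.05
```

With eps = 0.05 this matches the perturbation budget recommended above for generating the rCNN+ adversarial training set.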

Citation

If our code or our results are useful in your research, please consider citing:

@inproceedings{rbfcnn_ijcnn20,
  author={Jay Nandy and Wynne Hsu and Mong{-}Li Lee},
  title={Approximate Manifold Defense Against Multiple Adversarial Perturbations},
  booktitle={International Joint Conference on Neural Networks (IJCNN)},
  year={2020},
}
