Noise is Inside Me! Generating Adversarial Perturbations with Noise Derived from Natural Filters
A. Agarwal, N. K. Ratha
Published in: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Year: 2020
Volume: 2020-June
Pages: 3354-3363
Abstract
Deep learning solutions are vulnerable to adversarial perturbations, which can cause a "frog" image to be misclassified as a "deer" or a random pattern to be classified as a "guitar". Adversarial attack generation algorithms generally utilize knowledge of the database and the CNN model to craft the noise. In this research, we present a novel scheme, termed Camera Inspired Perturbations, to generate adversarial noise. The proposed approach relies on the noise embedded in an image due to environmental factors or the camera itself. We extract these noise patterns using image filtering algorithms and incorporate them into images to generate adversarial images. Unlike most existing algorithms, which require learning the noise, the proposed adversarial noise can be applied in real time. It is model-agnostic and can be used to fool multiple deep learning classifiers on various databases. The effectiveness of the proposed approach is evaluated on five different databases with five different convolutional neural networks, including ResNet-50, VGG-16, and VGG-Face. The proposed attack reduces the classification accuracy of every network; for instance, the performance of VGG-16 on the Tiny ImageNet database is reduced by more than 33%. The robustness of the proposed adversarial noise is also evaluated against different adversarial defense algorithms. © 2020 IEEE.
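For illustration, a minimal sketch of the idea described in the abstract is given below, assuming the noise pattern is estimated as the residual between an image and a smoothed (median-filtered) version of it; the specific filter, the perturbation budget epsilon, and the function names are assumptions made for this sketch, not details taken from the paper.

```python
# Minimal sketch (assumption, not the paper's released code): estimate a
# "camera-like" noise residual with a simple smoothing filter and inject a
# scaled copy of it into a target image. The median filter, the epsilon
# budget, and the function names are illustrative choices.
import numpy as np
from scipy.ndimage import median_filter

def extract_noise_residual(image: np.ndarray, size: int = 3) -> np.ndarray:
    """Noise residual = image minus its median-filtered (denoised) version."""
    smoothed = median_filter(image, size=size)
    return image.astype(np.float32) - smoothed.astype(np.float32)

def camera_inspired_perturbation(target: np.ndarray,
                                 noise_source: np.ndarray,
                                 epsilon: float = 8.0) -> np.ndarray:
    """Add a residual extracted from `noise_source` to `target` (grayscale arrays)."""
    residual = extract_noise_residual(noise_source)
    # Rescale the residual so its largest magnitude equals the pixel budget epsilon.
    scale = epsilon / (np.abs(residual).max() + 1e-8)
    perturbed = target.astype(np.float32) + scale * residual
    return np.clip(perturbed, 0, 255).astype(np.uint8)

# Example: derive the noise from the image itself and perturb it in one call.
# adversarial = camera_inspired_perturbation(img, img)
```

Because the residual comes from a fixed filtering step rather than from gradients of a specific network, such a perturbation needs no access to the classifier, which is consistent with the real-time, model-agnostic behavior claimed in the abstract.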
About the journal
Journal: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Publisher: IEEE Computer Society
ISSN: 2160-7508