Adversarial example generation method for object detection in remote sensing images

Wanghan Jiang, Yue Zhou, Xue Jiang

IGARSS 2023 - 2023 IEEE International Geoscience and Remote Sensing Symposium (2023)

Abstract
Object detection in remote sensing images is an essential application of deep learning. However, deep learning models are vulnerable to adversarial attacks, which can undermine their reliability and accuracy. While significant progress has been made in the field of adversarial attacks, most work has focused on image classification, because object detection is a more complex task. In this paper, we propose a target camouflage method based on adversarial attacks that can mislead detectors and hide targets with minimal pixel perturbations. Experiments on the DIOR dataset demonstrate the effectiveness of our approach: the generated adversarial examples cause Faster R-CNN to fail to detect objects while the perturbations remain minimal.
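To make the idea of target camouflage concrete, here is a minimal illustrative sketch (not the paper's exact method): an FGSM-style signed-gradient step that perturbs an image so as to *lower* a detector's objectness score, while projecting the perturbation into a small L-infinity budget. The `toy_objectness` function and its gradient are hypothetical stand-ins for a real detector such as Faster R-CNN and its back-propagated gradients; the names, parameters, and the linear surrogate are all assumptions for illustration only.

```python
import numpy as np

def toy_objectness(x, w):
    # Toy surrogate for a detector's objectness score: linear in the pixels.
    # A real attack would use the detector's loss and autograd instead.
    return float(np.sum(w * x))

def toy_objectness_grad(x, w):
    # Gradient of the toy score with respect to the image.
    return w

def camouflage_step(x, w, eps=0.03, alpha=0.01):
    """One signed-gradient descent step on the objectness score,
    projected back into the eps-ball around the clean image x."""
    g = toy_objectness_grad(x, w)
    x_adv = x - alpha * np.sign(g)          # descend: suppress detection
    delta = np.clip(x_adv - x, -eps, eps)   # L-infinity projection
    return np.clip(x + delta, 0.0, 1.0)     # keep a valid pixel range

rng = np.random.default_rng(0)
x = rng.uniform(0.2, 0.8, size=(8, 8))      # toy "image"
w = rng.normal(size=(8, 8))                 # toy detector weights
x_adv = camouflage_step(x, w)
```

Iterating this step (as in PGD-style attacks) would accumulate a perturbation that stays within the eps budget while progressively suppressing the surrogate score, which is the general mechanism behind hiding targets with minimal pixel changes.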
Key words
Object detection, adversarial attacks, target camouflage