TY - JOUR
T1 - An effective adversarial attack on person re-identification in video surveillance via dispersion reduction
AU - Zheng, Yu
AU - Lu, Yantao
AU - Velipasalar, Senem
N1 - Publisher Copyright:
© 2020 Institute of Electrical and Electronics Engineers Inc. All rights reserved.
PY - 2020
Y1 - 2020
N2 - Person re-identification across a network of cameras, with disjoint views, has been studied extensively due to its importance in wide-area video surveillance. This is a challenging task for several reasons, including changes in illumination and target appearance, and variations in camera viewpoint and camera intrinsic parameters. The approaches developed to re-identify a person across different camera views need to address these challenges. More recently, neural network-based methods have been proposed to solve the person re-identification problem across different camera views, achieving state-of-the-art performance. In this paper, we present an effective and generalizable attack model that generates adversarial images of people, and results in a very significant drop in the performance of the existing state-of-the-art person re-identification models. The results demonstrate the extreme vulnerability of the existing models to adversarial examples, and draw attention to the potential security risks that might arise from this in video surveillance. Our proposed attack is developed by decreasing the dispersion of the internal feature map of a neural network to degrade the performance of several different state-of-the-art person re-identification models. We also compare our proposed attack with other state-of-the-art attack models on different person re-identification approaches, using four commonly used benchmark datasets. The experimental results show that our proposed attack outperforms the state-of-the-art attack models on the best performing person re-identification approaches by a large margin, and results in the largest drop in mean average precision values.
AB - Person re-identification across a network of cameras, with disjoint views, has been studied extensively due to its importance in wide-area video surveillance. This is a challenging task for several reasons, including changes in illumination and target appearance, and variations in camera viewpoint and camera intrinsic parameters. The approaches developed to re-identify a person across different camera views need to address these challenges. More recently, neural network-based methods have been proposed to solve the person re-identification problem across different camera views, achieving state-of-the-art performance. In this paper, we present an effective and generalizable attack model that generates adversarial images of people, and results in a very significant drop in the performance of the existing state-of-the-art person re-identification models. The results demonstrate the extreme vulnerability of the existing models to adversarial examples, and draw attention to the potential security risks that might arise from this in video surveillance. Our proposed attack is developed by decreasing the dispersion of the internal feature map of a neural network to degrade the performance of several different state-of-the-art person re-identification models. We also compare our proposed attack with other state-of-the-art attack models on different person re-identification approaches, using four commonly used benchmark datasets. The experimental results show that our proposed attack outperforms the state-of-the-art attack models on the best performing person re-identification approaches by a large margin, and results in the largest drop in mean average precision values.
KW - Adversarial attack
KW - Adversarial examples
KW - Person re-identification
UR - http://www.scopus.com/inward/record.url?scp=85102802068&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85102802068&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2020.3024149
DO - 10.1109/ACCESS.2020.3024149
M3 - Article
AN - SCOPUS:85102802068
SN - 2169-3536
VL - 8
SP - 183891
EP - 183902
JO - IEEE Access
JF - IEEE Access
ER -