TY - JOUR
T1 - PTB
T2 - Robust physical backdoor attacks against deep neural networks in real world
AU - Xue, Mingfu
AU - He, Can
AU - Wu, Yinghao
AU - Sun, Shichang
AU - Zhang, Yushu
AU - Wang, Jian
AU - Liu, Weiqiang
N1 - Publisher Copyright:
© 2022 Elsevier Ltd
PY - 2022/7
Y1 - 2022/7
N2 - Deep neural network (DNN) models have been widely applied in many tasks. However, recent research has shown that DNN models are vulnerable to backdoor attacks. A number of backdoor attacks on DNN models have been proposed, but almost all existing backdoor attacks are digital. When launched in the real physical world, their attack performance degrades severely due to a variety of physical constraints. In this paper, we propose a robust physical backdoor attack method, named physical transformations for backdoors (PTB), to implement backdoor attacks against DNN models in the real physical world. To the best of our knowledge, we are the first to propose a robust physical backdoor attack with real physical triggers that works under complex physical conditions. We use real physical objects as triggers and perform a series of physical transformations on the injected backdoor instances during model training, so as to simulate the transformations that a backdoor instance may undergo in the real physical world, thus ensuring physical robustness. Experimental results on a face recognition model demonstrate that, compared with normal backdoor attacks without PTB, the proposed method significantly improves attack performance in the real physical world. Under various complex physical conditions, by injecting only a very small ratio (0.5%) of backdoor instances, the attack success rate with the PTB method reaches 78% (Square), 82% (Triangle), and 79% (Glasses) on the YouTube Aligned Face dataset, and 78% (Square), 86% (Triangle), and 85% (Glasses) on the VGG Face dataset, while the attack success rate without PTB is only 5% (Square), 11% (Triangle), and 9% (Glasses) on the YouTube Aligned Face dataset, and 21% (Square), 20% (Triangle), and 13% (Glasses) on the VGG Face dataset. Meanwhile, the proposed method does not affect the normal performance of the DNN model. In addition, experimental results also demonstrate that the proposed robust physical backdoor attack can evade detection by three backdoor defense methods.
AB - Deep neural network (DNN) models have been widely applied in many tasks. However, recent research has shown that DNN models are vulnerable to backdoor attacks. A number of backdoor attacks on DNN models have been proposed, but almost all existing backdoor attacks are digital. When launched in the real physical world, their attack performance degrades severely due to a variety of physical constraints. In this paper, we propose a robust physical backdoor attack method, named physical transformations for backdoors (PTB), to implement backdoor attacks against DNN models in the real physical world. To the best of our knowledge, we are the first to propose a robust physical backdoor attack with real physical triggers that works under complex physical conditions. We use real physical objects as triggers and perform a series of physical transformations on the injected backdoor instances during model training, so as to simulate the transformations that a backdoor instance may undergo in the real physical world, thus ensuring physical robustness. Experimental results on a face recognition model demonstrate that, compared with normal backdoor attacks without PTB, the proposed method significantly improves attack performance in the real physical world. Under various complex physical conditions, by injecting only a very small ratio (0.5%) of backdoor instances, the attack success rate with the PTB method reaches 78% (Square), 82% (Triangle), and 79% (Glasses) on the YouTube Aligned Face dataset, and 78% (Square), 86% (Triangle), and 85% (Glasses) on the VGG Face dataset, while the attack success rate without PTB is only 5% (Square), 11% (Triangle), and 9% (Glasses) on the YouTube Aligned Face dataset, and 21% (Square), 20% (Triangle), and 13% (Glasses) on the VGG Face dataset. Meanwhile, the proposed method does not affect the normal performance of the DNN model. In addition, experimental results also demonstrate that the proposed robust physical backdoor attack can evade detection by three backdoor defense methods.
KW - Artificial intelligence security
KW - Deep neural networks
KW - Face recognition
KW - Physical backdoor attack
KW - Physical transformations
UR - https://www.scopus.com/pages/publications/85135382600
U2 - 10.1016/j.cose.2022.102726
DO - 10.1016/j.cose.2022.102726
M3 - Article
AN - SCOPUS:85135382600
SN - 0167-4048
VL - 118
JO - Computers and Security
JF - Computers and Security
M1 - 102726
ER -