Embedding backdoors as the facial features: Invisible backdoor attacks against face recognition systems

Can He, Mingfu Xue, Jian Wang, Weiqiang Liu

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review


Abstract

Deep neural network (DNN) based face recognition systems have been widely applied in various identity authentication scenarios. However, recent studies show that DNN models are vulnerable to backdoor attacks. An attacker can embed a backdoor into the neural network by modifying its internal structure or by poisoning the training set. In this way, the attacker can log in to the system as the victim, while the normal use of the system by legitimate users is not affected. However, the backdoors used in existing attacks are visually perceptible (e.g., black-frame glasses or purple sunglasses), which arouses human suspicion and thus leads to the failure of the attacks. In this paper, we propose a novel backdoor attack method, BHF2 (Backdoor Hidden as Facial Features), in which the attacker embeds the backdoor as inherent facial features. The proposed method greatly enhances the concealment of the injected backdoor, which makes the backdoor attack much more difficult to discover. Moreover, BHF2 can be launched under black-box conditions, where the attacker has no knowledge of the internals of the target face recognition system. The proposed backdoor attack can be applied in rigorous identity authentication scenarios where users are not allowed to wear any accessories. Experimental results show that BHF2 achieves a high attack success rate (up to 100%) on a state-of-the-art face recognition model, DeepID1, while the normal performance of the system is hardly affected (the recognition accuracy of the system drops by as little as 0.01%).
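
The abstract mentions two routes for embedding a backdoor: modifying the model's internal structure or poisoning the training set. The sketch below is a minimal, generic illustration of the second route only; it is not the BHF2 trigger construction, which hides the trigger inside natural facial features rather than using a visible patch. The function names, the additive patch trigger, the 39x31 input size, and the 5% poison rate are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Generic poisoning-based backdoor sketch (NOT the BHF2 construction from the
# paper). A simple additive patch stands in for the trigger so the poisoning
# pipeline itself is easy to follow; BHF2 instead hides the trigger inside
# natural-looking facial features.


def apply_trigger(image, trigger, mask, alpha=0.2):
    """Blend a trigger pattern into an image where mask == 1.

    image, trigger: float arrays in [0, 1] of shape (H, W, C).
    mask: float array of shape (H, W, 1), 1 inside the trigger region.
    alpha: blending strength; smaller values make the trigger subtler.
    """
    return np.clip(image * (1 - alpha * mask) + trigger * (alpha * mask), 0.0, 1.0)


def poison_training_set(images, labels, trigger, mask, target_label,
                        poison_rate=0.05, rng=None):
    """Copy the training set, stamp the trigger onto a random fraction of
    images, and relabel those samples as the target (victim) identity."""
    if rng is None:
        rng = np.random.default_rng(0)
    images, labels = images.copy(), labels.copy()
    n_poison = int(len(images) * poison_rate)
    idx = rng.choice(len(images), size=n_poison, replace=False)
    for i in idx:
        images[i] = apply_trigger(images[i], trigger, mask)
        labels[i] = target_label  # the backdoor maps trigger -> victim identity
    return images, labels


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Toy data: 100 random 39x31 RGB "faces", 10 identities, victim identity 3.
    # (39x31 is only an assumed DeepID1-style patch size.)
    imgs = rng.random((100, 39, 31, 3)).astype(np.float32)
    lbls = rng.integers(0, 10, size=100)
    trig = rng.random((39, 31, 3)).astype(np.float32)
    mask = np.zeros((39, 31, 1), dtype=np.float32)
    mask[15:24, 10:21] = 1.0  # hypothetical trigger region near the image center
    p_imgs, p_lbls = poison_training_set(imgs, lbls, trig, mask, target_label=3)
    print("relabeled samples:", int(np.sum(p_lbls != lbls)))
    # The poisoned set (p_imgs, p_lbls) would then be used to train or fine-tune
    # the face recognition model so that the trigger activates the backdoor.
```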

Original language: English
Title of host publication: ACM TURC 2020 - Proceedings of ACM Turing Celebration Conference - China
Publisher: Association for Computing Machinery
Pages: 231-235
Number of pages: 5
ISBN (Electronic): 9781450375344
DOIs
State: Published - 22 May 2020
Externally published: Yes
Event: 2020 ACM Turing Celebration Conference - China, ACM TURC 2020 - Hefei, China
Duration: 21 May 2021 - 23 May 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 2020 ACM Turing Celebration Conference - China, ACM TURC 2020
Country/Territory: China
City: Hefei
Period: 21/05/21 - 23/05/21

Keywords

  • Artificial intelligence security
  • Backdoor attacks
  • Deep learning
  • Face recognition systems
