TY - GEN
T1 - Masked Faces with Faced Masks
AU - Zhu, Jiayi
AU - Guo, Qing
AU - Juefei-Xu, Felix
AU - Huang, Yihao
AU - Liu, Yang
AU - Pu, Geguang
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023.
PY - 2023
Y1 - 2023
N2 - Modern face recognition systems (FRS) still fall short when the subjects are wearing facial masks. An intuitive partial remedy is to add a mask detector to flag any masked faces so that the FRS can act accordingly for those low-confidence masked faces. In this work, we investigate the potential vulnerability of such an FRS equipped with a mask detector on large-scale masked faces, which might pose a serious risk, e.g., letting a suspect simultaneously evade identification by the FRS and detection by the mask detector. We formulate this new task as the generation of a realistic and adversarial faced mask and make three main contributions: First, we study a naive Delaunay-based masking method (DM) to simulate the process of wearing a faced mask, which reveals the main challenges of this new task. Second, we equip the DM with an adversarial noise attack and propose the adversarial noise Delaunay-based masking method (AdvNoise-DM), which can effectively fool face recognition and mask detection but makes the face less natural. Third, we propose the adversarial filtering Delaunay-based masking method, denoted MF2M, which applies adversarial filtering to AdvNoise-DM and obtains more natural faces. With the above efforts, the final method not only leads to significant performance deterioration of state-of-the-art (SOTA) deep learning-based FRS, but also remains undetected by the SOTA facial mask detector.
AB - Modern face recognition systems (FRS) still fall short when the subjects are wearing facial masks. An intuitive partial remedy is to add a mask detector to flag any masked faces so that the FRS can act accordingly for those low-confidence masked faces. In this work, we investigate the potential vulnerability of such an FRS equipped with a mask detector on large-scale masked faces, which might pose a serious risk, e.g., letting a suspect simultaneously evade identification by the FRS and detection by the mask detector. We formulate this new task as the generation of a realistic and adversarial faced mask and make three main contributions: First, we study a naive Delaunay-based masking method (DM) to simulate the process of wearing a faced mask, which reveals the main challenges of this new task. Second, we equip the DM with an adversarial noise attack and propose the adversarial noise Delaunay-based masking method (AdvNoise-DM), which can effectively fool face recognition and mask detection but makes the face less natural. Third, we propose the adversarial filtering Delaunay-based masking method, denoted MF2M, which applies adversarial filtering to AdvNoise-DM and obtains more natural faces. With the above efforts, the final method not only leads to significant performance deterioration of state-of-the-art (SOTA) deep learning-based FRS, but also remains undetected by the SOTA facial mask detector.
KW - Adversarial attack
KW - Face recognition
KW - Mask detection
UR - https://www.scopus.com/pages/publications/105009471120
U2 - 10.1007/978-3-031-25056-9_24
DO - 10.1007/978-3-031-25056-9_24
M3 - Conference contribution
AN - SCOPUS:105009471120
SN - 9783031250552
T3 - Lecture Notes in Computer Science
SP - 360
EP - 377
BT - Computer Vision - ECCV 2022 Workshops, Proceedings
A2 - Karlinsky, Leonid
A2 - Michaeli, Tomer
A2 - Nishino, Ko
PB - Springer Science and Business Media Deutschland GmbH
T2 - Workshops held at the 17th European Conference on Computer Vision, ECCV 2022
Y2 - 23 October 2022 through 27 October 2022
ER -