Eliminating Backdoor Triggers for Deep Neural Networks Using Attention Relation Graph Distillation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

21 Scopus citations

Abstract

With the rapid advancement of Artificial Intelligence (AI) techniques, adversaries are designing more and more backdoor triggers to attack Deep Neural Networks (DNNs). Although the state-of-the-art method Neural Attention Distillation (NAD) can effectively erase backdoors from DNNs, it still suffers from a non-negligible Attack Success Rate (ASR) together with lowered classification ACCuracy (ACC), since NAD performs backdoor defense using only attention features (i.e., attention maps) of the same order. In this paper, we introduce a novel backdoor defense framework named Attention Relation Graph Distillation (ARGD), which fully explores the correlations among attention features of different orders using our proposed Attention Relation Graphs (ARGs). By aligning the ARGs of the teacher and student models during knowledge distillation, ARGD eradicates backdoors more effectively than NAD. Comprehensive experimental results show that, against six of the latest backdoor attacks, ARGD outperforms NAD by up to a 94.85% reduction in ASR, while ACC is improved by up to 3.23%.
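The core idea in the abstract — building a relation graph over attention features of different orders (layers) and penalizing the mismatch between the teacher's and student's graphs — can be illustrated with a minimal sketch. The function names, the use of pairwise cosine similarity as edge weights, and the mean-squared edge loss below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def cosine(u, v):
    # Cosine similarity between two flattened attention maps.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-12
    nv = math.sqrt(sum(b * b for b in v)) or 1e-12
    return dot / (nu * nv)

def relation_graph(attention_maps):
    # Hypothetical ARG: a fully connected graph whose edge weights are
    # similarities between attention features of *different* orders.
    n = len(attention_maps)
    return {(i, j): cosine(attention_maps[i], attention_maps[j])
            for i in range(n) for j in range(i + 1, n)}

def arg_alignment_loss(teacher_maps, student_maps):
    # Distillation term: mean squared difference between corresponding
    # edges of the teacher's and student's relation graphs.
    gt = relation_graph(teacher_maps)
    gs = relation_graph(student_maps)
    return sum((gt[e] - gs[e]) ** 2 for e in gt) / len(gt)
```

Under this sketch, a backdoored student whose cross-layer attention relations drift from the clean teacher's incurs a positive alignment loss, while a perfectly aligned student incurs zero; the actual ARGD objective combines such a graph-alignment term with standard distillation losses.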

Original language: English
Title of host publication: Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
Editors: Luc De Raedt
Publisher: International Joint Conferences on Artificial Intelligence
Pages: 1481-1487
Number of pages: 7
ISBN (Electronic): 9781956792003
DOIs
State: Published - 2022
Event: 31st International Joint Conference on Artificial Intelligence, IJCAI 2022 - Vienna, Austria
Duration: 23 Jul 2022 - 29 Jul 2022

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823

Conference

Conference: 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
Country/Territory: Austria
City: Vienna
Period: 23/07/22 - 29/07/22
