REDAD: A Reliable Distillation for Image Anomaly Detection

Wenrui Hu, Wei Yu, Yongqiang Tang, Yuan Xie, Wensheng Zhang

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

The vanilla knowledge distillation (KD)-based approach to image anomaly detection (AD) faces three main challenges: data shift, normality forgetting, and anomaly overgeneralization. To address these challenges, we introduce REDAD, a reliable distillation strategy for AD that avoids intricate network designs. First, we incorporate an adaptive teacher model (ATM) that dynamically adjusts the teacher's batch normalization (BN) statistics during distillation, aligning the teacher with the distribution of the target training data. Next, we introduce a normality remembering enhancement (NRE) module, which compels the student to learn the most challenging normal features, those with high distillation loss, thereby bolstering normality retention and reducing false positives. Finally, we present a novel direction-guided regularization (DGR) technique that robustly enlarges the divergence of abnormal feature pairs, preventing abnormal regions from being overlooked as false negatives. Comprehensive experiments on the MVTec, VisA, and MVTec3D datasets show that REDAD effectively resolves these three concerns, outperforming its baseline model by more than 3.5% (5.0%, 6.1%) in I-AUROC, 2.3% (1.6%, 1.5%) in P-AUROC, and 3.2% (5.4%, 3.8%) in PRO on the three datasets, respectively. In addition, two real-world industrial product inspection applications further underscore the efficacy and utility of the proposed REDAD method.
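The abstract describes two mechanisms concretely enough to sketch: the ATM's momentum-style update of the teacher's BN statistics on target data, and the NRE's focus on the normal features with the highest distillation loss. The snippet below is a minimal NumPy sketch of how these ideas might look; the function names, the cosine-distance distillation loss, and the hard-mining ratio are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def adapt_bn_stats(running_mean, running_var, batch, momentum=0.1):
    """ATM-style sketch (hypothetical): blend the teacher's stored BN
    statistics toward those of the target-domain batch (N, C)."""
    mu, var = batch.mean(axis=0), batch.var(axis=0)
    new_mean = (1.0 - momentum) * running_mean + momentum * mu
    new_var = (1.0 - momentum) * running_var + momentum * var
    return new_mean, new_var

def cosine_distance_map(t_feat, s_feat, eps=1e-8):
    """Per-location distillation loss, 1 - cosine similarity, between
    teacher and student feature maps of shape (C, H, W)."""
    num = (t_feat * s_feat).sum(axis=0)
    den = np.linalg.norm(t_feat, axis=0) * np.linalg.norm(s_feat, axis=0) + eps
    return 1.0 - num / den  # (H, W); larger means harder to imitate

def nre_hard_mining(loss_map, k_ratio=0.1):
    """NRE-style sketch (hypothetical): average the distillation loss
    over only the hardest k_ratio fraction of normal locations."""
    flat = loss_map.ravel()
    k = max(1, int(k_ratio * flat.size))
    hard_idx = np.argsort(flat)[-k:]  # indices of the highest-loss locations
    return flat[hard_idx].mean()
```

By construction, the hard-mined loss is never smaller than the plain average over all locations, which is what would push the student toward the normal features it imitates worst.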

Original language: English
Article number: 5018713
Journal: IEEE Transactions on Instrumentation and Measurement
Volume: 74
DOIs
State: Published - 2025

Keywords

  • Knowledge distillation (KD)
  • self-supervised learning
  • unsupervised anomaly detection (UAD)

