TY - GEN
T1 - TDAD
T2 - 31st IEEE International Conference on Image Processing, ICIP 2024
AU - Hu, Wenrui
AU - Xie, Yuan
AU - Yu, Wei
N1 - Publisher Copyright:
© 2024 IEEE
PY - 2024
Y1 - 2024
N2 - The problem of overgeneralization is widespread in unsupervised anomaly detection methods, especially those that rely on knowledge distillation techniques. This problem arises because the student network has a strong tendency to mimic its teacher, even for unseen anomaly patterns, resulting in erroneous predictions. To tackle this issue, we have developed Trident Distillation Anomaly Detection (TDAD), which uses a trident distillation process in a self-supervised masked training paradigm. TDAD incorporates synthetic anomalies and seamlessly blends general knowledge distillation (GKD) with novel self-consistency distillation (SCD) and discrepancy maximization distillation (DMD) techniques. The synergistic optimization of these components widens the gap between abnormal feature distributions in the teacher and student domains, while maintaining coherence within the normal distributions, thereby enhancing prediction reliability. Extensive experiments conducted on the MVTec dataset demonstrate that TDAD effectively mitigates overgeneralization, achieving superior anomaly detection performance compared to its competitors.
AB - The problem of overgeneralization is widespread in unsupervised anomaly detection methods, especially those that rely on knowledge distillation techniques. This problem arises because the student network has a strong tendency to mimic its teacher, even for unseen anomaly patterns, resulting in erroneous predictions. To tackle this issue, we have developed Trident Distillation Anomaly Detection (TDAD), which uses a trident distillation process in a self-supervised masked training paradigm. TDAD incorporates synthetic anomalies and seamlessly blends general knowledge distillation (GKD) with novel self-consistency distillation (SCD) and discrepancy maximization distillation (DMD) techniques. The synergistic optimization of these components widens the gap between abnormal feature distributions in the teacher and student domains, while maintaining coherence within the normal distributions, thereby enhancing prediction reliability. Extensive experiments conducted on the MVTec dataset demonstrate that TDAD effectively mitigates overgeneralization, achieving superior anomaly detection performance compared to its competitors.
KW - Unsupervised anomaly detection
KW - knowledge distillation
KW - self-supervised learning
UR - https://www.scopus.com/pages/publications/85216903805
U2 - 10.1109/ICIP51287.2024.10647330
DO - 10.1109/ICIP51287.2024.10647330
M3 - Conference contribution
AN - SCOPUS:85216903805
T3 - Proceedings - International Conference on Image Processing, ICIP
SP - 346
EP - 352
BT - 2024 IEEE International Conference on Image Processing, ICIP 2024 - Proceedings
PB - IEEE Computer Society
Y2 - 27 October 2024 through 30 October 2024
ER -