Autoencoder-Like Knowledge Distillation Network for Anomaly Detection

IEEE Access (2023)

Abstract
Anomaly detection is a crucial research field in computer vision with diverse practical applications. Common anomaly detection methods currently include autoencoders, generative adversarial networks, and knowledge distillation (KD) models. However, the teacher and student models in KD may not always yield representations distinct enough to signify anomalies, owing to their similar model structures and data flow. This study proposes a novel autoencoder-like KD model based on the attention mechanism for anomaly detection. The pre-trained teacher model incorporates a dual attention module as the encoder, while the student model integrates the same dual attention module as the decoder. The teacher guides the student in learning the feature knowledge of the input image. To connect the teacher-student pair, a BottleNeck module converts the features extracted by the teacher into more compact latent codes, which the student then restores precisely, thereby enabling anomaly detection. The proposed model outperforms existing anomaly detection models on the evaluated datasets. Experimental results show that it attains state-of-the-art (SOTA) performance on the public MVTec dataset, achieving an average AUC of 98.2% at the sample level and 98.0% at the pixel level.
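The detection pipeline the abstract describes (frozen teacher encoder → compact bottleneck latent code → student restoration, with the teacher-student feature discrepancy as the anomaly score) can be sketched minimally as follows. This is an illustrative sketch only: the matrices stand in for the paper's attention-based encoder/decoder, all dimensions and names are hypothetical, and the "student" here is an analytic inverse rather than a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not taken from the paper).
D_IN, D_TEACHER, D_LATENT = 32, 64, 16

# Frozen "teacher" encoder: a fixed random projection stands in for the
# pre-trained dual-attention encoder described in the abstract.
W_teacher = rng.standard_normal((D_TEACHER, D_IN))

# BottleNeck: compresses teacher features into a compact latent code.
W_bottleneck = rng.standard_normal((D_LATENT, D_TEACHER))

# "Student" decoder: maps latent codes back to teacher feature space.
# The pseudo-inverse stands in for a decoder trained to restore the
# features of normal samples accurately.
W_student = np.linalg.pinv(W_bottleneck)

def teacher_features(x):
    """Teacher's feature representation of an input vector."""
    return np.tanh(W_teacher @ x)

def anomaly_score(x):
    """Score = distance between the teacher's features and the
    student's restoration of them from the bottleneck latent code.
    Larger restoration error suggests a more anomalous input."""
    f_t = teacher_features(x)
    latent = W_bottleneck @ f_t      # compact latent code
    f_s = W_student @ latent         # student restoration
    return float(np.linalg.norm(f_t - f_s))

x = rng.standard_normal(D_IN)
score = anomaly_score(x)
assert score >= 0.0
```

In the actual model, the student would be trained so that restoration error stays small on normal data; at test time, regions whose teacher features the student fails to restore are flagged as anomalous, which is how pixel-level scores would arise.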
Keywords
Anomaly detection, dual attention, knowledge distillation, teacher-student