Feature-Domain Adaptive Contrastive Distillation for Efficient Single Image Super-Resolution

IEEE Access (2023)

Abstract
Convolutional neural network-based single image super-resolution (SISR) requires numerous parameters and high computational cost to achieve strong performance, which limits its applicability on resource-constrained devices such as mobile phones. Knowledge distillation (KD), which transfers useful knowledge from a teacher network to a student network, has been investigated as a way to improve network efficiency without sacrificing performance. To this end, feature distillation (FD) has been used in KD to minimize a Euclidean distance-based loss between the feature maps of the teacher and student networks. However, this technique does not adequately consider how to deliver knowledge from the teacher to the student effectively and meaningfully so as to improve the student's performance under given network capacity constraints. In this study, we propose a feature-domain adaptive contrastive distillation (FACD) method for efficiently training lightweight student SISR networks. We highlight the limitations of the Euclidean distance-based loss used by existing FD methods and propose a feature-domain contrastive loss that enables the student network to learn richer information from the teacher's representation in the feature domain. We also introduce adaptive distillation, which applies distillation selectively depending on the conditions of the training patches. Experimental results demonstrate that the proposed FACD scheme improves student EDSR (enhanced deep residual network) and RCAN (residual channel attention network) models over conventional FD approaches, not only in peak signal-to-noise ratio (PSNR) on all benchmark datasets and scales but also in subjective image quality. In particular, FACD achieves an average PSNR improvement of 0.07 dB over conventional FD on both networks. Code will be released at https://github.com/hcmoon0613/FACD.
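The abstract does not give the exact form of the FACD losses, but its two ingredients, a feature-domain contrastive loss and patch-conditioned adaptive distillation, can be sketched as follows. This is a minimal PyTorch sketch under assumed design choices: an InfoNCE-style contrastive loss in which each patch's teacher feature serves as the positive and the other patches in the batch supply negatives, and a per-patch PSNR-gap criterion for the adaptive selection. The function names, the temperature and margin parameters, and the selection rule are all hypothetical illustrations, not taken from the paper.

import torch
import torch.nn.functional as F

def facd_contrastive_loss(student_feat, teacher_feat, temperature=0.1):
    # Flatten spatial dimensions and L2-normalize each sample: (B, C*H*W)
    s = F.normalize(student_feat.flatten(1), dim=1)
    t = F.normalize(teacher_feat.flatten(1), dim=1)
    # Similarity of every student feature to every teacher feature: (B, B)
    logits = s @ t.t() / temperature
    # Diagonal entries are the matched (positive) teacher-student pairs;
    # off-diagonal entries act as in-batch negatives.
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)

def adaptive_facd_loss(student_feat, teacher_feat,
                       teacher_psnr, student_psnr, margin=0.0):
    # Hypothetical selection rule: distill only on patches where the
    # teacher's per-patch PSNR exceeds the student's by more than margin.
    mask = (teacher_psnr - student_psnr) > margin
    if not mask.any():
        return student_feat.new_zeros(())
    return facd_contrastive_loss(student_feat[mask], teacher_feat[mask])

In this sketch the contrastive term reduces to a cross-entropy over batch-wise similarities, so a larger batch supplies more negatives; the paper's actual positive/negative construction and patch-selection criterion may differ.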
Keywords
Training, Superresolution, Performance evaluation, Task analysis, Knowledge engineering, Residual neural networks, Knowledge transfer, Contrastive learning, efficient super-resolution, feature distillation, knowledge distillation, single image super-resolution