Distilling object detectors with mask-guided feature and relation-based knowledge.

Int. J. Comput. Sci. Eng. (2024)

Abstract
Knowledge distillation (KD) is an effective technique for network compression and model accuracy enhancement in image classification, semantic segmentation, pre-trained language models, and other tasks. However, existing KD methods are specialised for image classification and transfer poorly to object detection, owing to two limitations: the imbalance between foreground and background instances, and the neglect of relation-based knowledge during distillation. In this paper, we present a general mask-guided feature and relation-based knowledge distillation framework (MAR) consisting of two components, mask-guided distillation and relation-based distillation, to address these problems. Mask-guided distillation emphasises the student's learning of close-to-object features via multi-value masks, while relation-based distillation mimics the relational information between different feature pixels on the classification head. Extensive experiments show that our method achieves excellent AP improvements on both one-stage and two-stage detectors. Specifically, Faster R-CNN with a ResNet50 backbone reaches 40.6% mAP under the 1× schedule on the COCO dataset, which is 3.2% higher than the baseline and even surpasses the teacher detector.
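The abstract does not give the exact loss formulations, so the following is only a minimal PyTorch-style sketch of the two ideas as described: a multi-value mask that up-weights feature imitation near ground-truth objects, and a relation loss that matches pairwise pixel affinities on the classification-head features. The specific mask weights, the two-pixel dilation band, and the cosine-affinity choice are all assumptions for illustration, not the paper's method.

```python
import torch
import torch.nn.functional as F

def build_multivalue_mask(gt_boxes, feat_h, feat_w, stride,
                          fg_weight=2.0, near_weight=1.0, bg_weight=0.5):
    """Hypothetical multi-value mask: foreground pixels get the largest
    weight, a dilated band around each box a medium weight, and the
    background the smallest. All three weight values are assumptions."""
    mask = torch.full((feat_h, feat_w), bg_weight)
    for (x1, y1, x2, y2) in gt_boxes:
        # Map the box from image coordinates to feature-map coordinates.
        fx1, fy1 = int(x1 / stride), int(y1 / stride)
        fx2, fy2 = int(x2 / stride) + 1, int(y2 / stride) + 1
        # Near-object band: the box dilated by 2 feature pixels (assumed).
        mask[max(fy1 - 2, 0):fy2 + 2, max(fx1 - 2, 0):fx2 + 2] = near_weight
        # Foreground region overwrites the band.
        mask[fy1:fy2, fx1:fx2] = fg_weight
    return mask

def mask_guided_loss(f_s, f_t, mask):
    """Per-pixel L2 between student and teacher features, weighted by the
    multi-value mask. f_s, f_t: (C, H, W); mask: (H, W)."""
    diff = (f_s - f_t).pow(2).mean(dim=0)          # (H, W)
    return (diff * mask).sum() / mask.sum()

def relation_loss(f_s, f_t):
    """Match pairwise pixel-affinity matrices of classification-head
    features between student and teacher. f_s, f_t: (C, H, W)."""
    def affinity(f):
        v = F.normalize(f.flatten(1), dim=0)       # (C, H*W), unit columns
        return v.t() @ v                           # (HW, HW) cosine affinities
    return F.mse_loss(affinity(f_s), affinity(f_t))
```

Under these assumptions, the total distillation loss would be a weighted sum, e.g. `loss = a * mask_guided_loss(f_s, f_t, mask) + b * relation_loss(h_s, h_t)`, with `a`, `b` and the choice of feature levels left to the paper's experimental settings.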
Keywords
knowledge distillation,multi-value mask,object detection