Dual knowledge distillation for visual tracking with teacher–student network

Yuanyun Wang, Chuanyu Sun, Jun Wang, Bingfei Chai

Signal, Image and Video Processing (2024)

Abstract
Knowledge distillation has been used successfully in computer vision in recent years. The two main techniques in use are logit-based distillation and middle-layer feature-based distillation. Logit-based distillation does not exploit the features of the teacher network's middle layers, while middle-layer feature-based distillation mainly captures local information. In this paper, we design a backbone network based on dual knowledge distillation, which enables the student network to mimic the teacher's predictions while also learning the teacher's middle-layer features. A trained ResNet50 serves as the lightweight student network (the feature extraction network) trained by the teacher network; a lightweight feature extraction network effectively reduces memory usage and improves tracking accuracy. Since different students have different comprehension abilities, we propose feature sharing between the student networks in the template and search branches to facilitate communication between them. We then propose a new tracking algorithm built on this dual-knowledge-distillation backbone. Experimental results on five tracking datasets show that the tracking algorithm achieves good tracking performance at real-time speed.
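The abstract does not give the paper's loss formulation, but a dual distillation objective of this kind is typically a weighted sum of a logit-matching term and a middle-layer feature-matching term. Below is a minimal PyTorch sketch under that assumption; the function name, the weights alpha and beta, and the temperature are illustrative, not taken from the paper.

import torch
import torch.nn.functional as F

def dual_distillation_loss(student_logits, teacher_logits,
                           student_feats, teacher_feats,
                           temperature=4.0, alpha=0.5, beta=0.5):
    """Illustrative dual-distillation loss: logit KD + middle-layer feature KD.

    All weightings here are assumptions for illustration; the paper's
    exact formulation is not specified in the abstract.
    """
    # Logit-based distillation: KL divergence between softened
    # teacher and student class distributions.
    kd_logit = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Middle-layer feature-based distillation: match intermediate
    # representations from corresponding stages (MSE per stage).
    kd_feat = sum(F.mse_loss(s, t) for s, t in zip(student_feats, teacher_feats))

    return alpha * kd_logit + beta * kd_feat

The temperature-squared scaling on the KL term follows Hinton et al.'s standard formulation, keeping its gradient magnitude comparable to the feature term as the temperature varies; matched spatial sizes are assumed for corresponding student and teacher feature maps.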