Multiple Teacher Knowledge Distillation for Head Pose Estimation Without Keypoints

SN Computer Science (2023)

Abstract
In recent years, human head pose estimation has played a significant role in facial analysis, with a variety of practical applications such as gaze estimation, virtual reality, and driver assistance. Motivated by this importance, in this paper we propose a lightweight model that effectively handles the task of head pose estimation. First, teacher models are trained on the synthetic dataset 300W-LPA to obtain head pose pseudo labels; then an architecture with a ResNet18 backbone is adopted and trained on the ensemble of these pseudo labels via knowledge distillation. The real-world head pose datasets AFLW-2000 and BIWI are used to evaluate the efficacy of our proposed approach. Experimental results demonstrate a significant improvement in testing accuracy over other state-of-the-art head pose estimation methods. Furthermore, our model runs in real time at ∼300 FPS when performing inference on a Tesla V100. Source code and pre-trained weights are available at github.com/chientv99/headpose.
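To make the distillation scheme described in the abstract concrete, the following is a minimal PyTorch sketch of training a ResNet18 student on the averaged pseudo labels of several frozen teacher models. It is an illustration under stated assumptions, not the authors' implementation: the teacher architectures, the exact loss (MSE is a placeholder here), the pseudo-label ensembling rule, and the names `teachers` and `loader` are all hypothetical; the actual code is in the linked repository.

```python
# Sketch of multi-teacher knowledge distillation for head pose regression
# (yaw, pitch, roll). Assumptions: teachers are pre-trained on 300W-LPA,
# pseudo labels are the mean of teacher predictions, loss is MSE.
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Student: ResNet18 backbone with a 3-dim output head for (yaw, pitch, roll).
student = resnet18(num_classes=3)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)
criterion = nn.MSELoss()  # placeholder; the paper's loss may differ

def ensemble_pseudo_labels(teachers, images):
    """Average the head pose predictions of all frozen teachers."""
    with torch.no_grad():
        preds = torch.stack([t(images) for t in teachers])  # (T, B, 3)
    return preds.mean(dim=0)  # (B, 3) pseudo labels

def distill_epoch(student, teachers, loader):
    student.train()
    for t in teachers:
        t.eval()
    for images, _ in loader:  # ground-truth labels unused: pseudo labels only
        targets = ensemble_pseudo_labels(teachers, images)
        loss = criterion(student(images), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The student sees only the teachers' ensembled pseudo labels, which is what lets it be trained on synthetic data while staying lightweight enough for the reported real-time inference speed.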
Keywords
Head pose estimation, Knowledge distillation, Facial analysis, Convolutional neural network