Supervised vs. Self-supervised Pre-trained models for Hand Pose Estimation.

ICTC (2022)

Abstract
Fully-supervised learning and self-supervised learning are two standard frameworks for training visual representations. While the relative merits of the two frameworks are evident during pre-training itself, this paper compares their transfer performance on the hand pose estimation task. We conduct experiments on one supervised pre-trained model and five self-supervised pre-trained models. From these experiments, we conclude that self-supervised pre-trained models do not necessarily outperform their supervised pre-trained counterparts, although they do lead to faster convergence of the neural network.
Keywords
hand, models, self-supervised, pre-trained