Controlling Dependency: Selectively Resetting Channels for Pre-trained CNN Backbone Network on Hand Pose Estimation

ICTC (2022)

Abstract
Transfer learning is the de facto standard for numerous downstream tasks in computer vision. Previous works have explored diverse architectural approaches to improve transferability to downstream tasks. However, due to the complexity of downstream tasks, the importance of the pre-trained feature extractor is often overlooked. In this paper, we propose a simple method, termed selective channel resetting (SCR), to enhance transfer learning performance on hand pose estimation with a convolutional neural network backbone. In this scheme, we apply a channel resetting module to the feature extractor that resets a portion of the features, so that they depend solely on the downstream task and do not overfit to the upstream task. The reset channels are then trained freshly to fit the target task. Despite its simplicity, the method achieves a +2.6% AUC gain on the EgoDexter dataset. Empirical evidence demonstrates that the proposed method outperforms conventional pretrain-finetune transfer learning on the hand pose estimation task. We also define training stability to identify the benefits of pre-training.
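The abstract describes resetting a portion of a pre-trained backbone's channels so they can be retrained from scratch on the downstream task. The paper's exact module is not given here, so the following is only a minimal NumPy sketch of one plausible reading: re-initializing a random fraction of the output channels of a convolutional weight tensor (He-style init) while keeping the remaining pre-trained channels intact. The function name, the `reset_ratio` parameter, and the choice of initializer are all assumptions for illustration.

```python
import numpy as np

def selective_channel_reset(weight, reset_ratio=0.25, rng=None):
    """Re-initialize a random fraction of output channels in a conv weight.

    `weight` has shape (out_channels, in_channels, kH, kW), as in a
    pre-trained CNN backbone layer. The chosen channels are replaced with
    a fresh He-style normal init (an assumed choice, not necessarily the
    paper's) so they can be trained anew on the downstream task; all
    other channels keep their pre-trained values.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    out_c, in_c, kh, kw = weight.shape
    n_reset = int(round(out_c * reset_ratio))
    # Pick which output channels to reset, without replacement.
    reset_idx = rng.choice(out_c, size=n_reset, replace=False)
    fan_in = in_c * kh * kw
    new_w = weight.copy()
    # He-style normal init: std = sqrt(2 / fan_in).
    new_w[reset_idx] = rng.normal(
        0.0, np.sqrt(2.0 / fan_in), size=(n_reset, in_c, kh, kw)
    )
    return new_w, reset_idx
```

In a full pipeline this would be applied to selected layers of the backbone before fine-tuning, so the reset channels adapt purely to the target task while the kept channels retain upstream knowledge.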
Keywords
Transfer Learning, Deep Learning, Hand Pose Estimation, Pre-training