Multi-Target Cross-Dataset Palmprint Recognition via Distilling From Multi-Teacher.

IEEE Trans. Instrum. Meas. (2023)

Abstract
Cross-dataset palmprint recognition is an important and popular topic that has attracted increasing attention. Previous studies have mainly focused on single-target or multisource cross-dataset recognition. In practical applications, however, query images may be collected from multiple devices and environments, a setting called multi-target cross-dataset palmprint recognition, which is much more challenging. In this article, an approach named distilling from multi-teacher (DFMT) is presented for multi-target cross-dataset palmprint recognition using knowledge distillation and domain adaptation. The source dataset is first paired with each of the multiple target datasets. A teacher feature extractor is then constructed for each pair to extract its adaptive knowledge via domain adaptation, and a student feature extractor is established to learn the adaptive knowledge from the teacher feature extractors. In particular, multilevel distillation losses are constructed to transfer the adaptive knowledge more effectively. Experiments are carried out on multiple palmprint databases, including contact, contactless, constrained, and unconstrained databases. Experimental results demonstrate the superiority of DFMT over other competitive algorithms.
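The abstract does not include code, and the authors' multilevel distillation losses are not specified here. As a rough illustration of the core multi-teacher distillation idea only, the sketch below averages a temperature-softened KL divergence between a student's output distribution and each teacher's; the function names and the simple averaging scheme are assumptions for illustration, not the DFMT method itself.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_distill_loss(student_logits, teacher_logits_list, T=4.0):
    """Average KL(teacher || student) over all teachers, with the
    usual T^2 scaling used in knowledge distillation. This is a
    generic multi-teacher objective, not the paper's exact loss."""
    p_s = softmax(student_logits, T)
    losses = []
    for t_logits in teacher_logits_list:
        p_t = softmax(t_logits, T)
        kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
        losses.append(kl.mean() * T * T)
    return float(np.mean(losses))
```

A student whose logits already match every teacher incurs (near-)zero loss, while disagreement with any teacher raises the average, which is the signal that drives the student toward the teachers' adapted knowledge.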
Keywords
Domain adaptation, knowledge distillation, palmprint recognition, transfer learning