Improving Face Recognition from Hard Samples via Distribution Distillation Loss

European Conference on Computer Vision (2020)

Abstract
Large facial variations are the main challenge in face recognition. To address this, previous variation-specific methods make full use of task-related prior knowledge to design specialized network losses, which typically do not generalize across different tasks and scenarios. In contrast, existing generic methods focus on improving feature discriminability to minimize the intra-class distance while maximizing the inter-class distance; these methods perform well on easy samples but fail on hard samples. To improve performance on hard samples, we propose a novel Distribution Distillation Loss to narrow the performance gap between easy and hard samples, which is simple, effective and generic for various types of facial variations. Specifically, we first adopt state-of-the-art classifiers such as Arcface to construct two similarity distributions: a teacher distribution from easy samples and a student distribution from hard samples. Then, we propose a novel distribution-driven loss to constrain the student distribution to approximate the teacher distribution, which leads to a smaller overlap between the positive and negative pairs in the student distribution. We have conducted extensive experiments on both generic large-scale face benchmarks and benchmarks with diverse variations in race, resolution and pose. The quantitative results demonstrate the superiority of our method over strong baselines, e.g., Arcface and Cosface. Code will be available at https://github.com/HuangYG123/DDL.
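
The core mechanism described in the abstract can be illustrated with a small sketch: cosine similarities of positive and negative pairs are collected separately for easy (teacher) and hard (student) samples, each set is turned into a similarity distribution, and a divergence term pulls the student distributions toward the teacher's. This is a minimal illustrative sketch under assumptions, not the authors' released implementation: the names soft_histogram and distribution_distillation_loss, the differentiable-histogram estimator, the bin count and bandwidth, and the KL-only formulation are all assumptions for illustration; the published loss additionally uses an order term on the expected positive/negative margins, which is omitted here.

```python
import torch
import torch.nn.functional as F

def soft_histogram(sims, bins=50, lo=-1.0, hi=1.0, sigma=0.02):
    """Differentiable histogram of cosine similarities (Gaussian kernel per bin).

    Illustrative estimator; bin count and bandwidth are arbitrary choices.
    """
    centers = torch.linspace(lo, hi, bins, device=sims.device)
    # Gaussian weight of every similarity w.r.t. every bin centre, shape (N, bins).
    weights = torch.exp(-((sims.unsqueeze(1) - centers.unsqueeze(0)) ** 2) / (2 * sigma ** 2))
    hist = weights.sum(dim=0)
    return hist / (hist.sum() + 1e-8)  # normalise to a probability distribution

def distribution_distillation_loss(teacher_pos, teacher_neg, student_pos, student_neg):
    """Push the student (hard-sample) similarity distributions toward the teacher's."""
    loss = 0.0
    for t_sims, s_sims in ((teacher_pos, student_pos), (teacher_neg, student_neg)):
        p_t = soft_histogram(t_sims).detach()   # teacher distribution is the target
        p_s = soft_histogram(s_sims)
        # KL(teacher || student); F.kl_div expects log-probabilities as the first argument.
        loss = loss + F.kl_div(p_s.clamp_min(1e-8).log(), p_t, reduction="sum")
    return loss

# Toy usage with synthetic cosine similarities. In practice these would come from a
# pretrained embedding network (e.g., an ArcFace-style model) applied to face pairs.
teacher_pos = torch.rand(512) * 0.4 + 0.6   # easy positive pairs: high similarity
teacher_neg = torch.rand(512) * 0.4         # easy negative pairs: low similarity
student_pos = torch.rand(512) * 0.6 + 0.3   # hard positive pairs: lower, overlapping
student_neg = torch.rand(512) * 0.6         # hard negative pairs: higher, overlapping
print(distribution_distillation_loss(teacher_pos, teacher_neg, student_pos, student_neg))
```

Matching the two positive-pair distributions and the two negative-pair distributions separately is what narrows the positive/negative overlap on hard samples while leaving the behaviour on easy samples unchanged.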
Keywords
Face recognition, Loss function, Distribution distillation