We Are Not All Equal: Personalizing Models For Facial Expression Analysis With Transductive Parameter Transfer
MM '14: 2014 ACM Multimedia Conference, Orlando, Florida, USA, November 2014
Abstract
Previous work on facial expression analysis has shown that person-specific models outperform generic ones when recognizing the facial expressions of new users added to the gallery set. This finding is not surprising, given the often significant inter-individual variability: different persons have different morphological traits and express their emotions in different ways. However, acquiring person-specific labeled data for learning such models is very time consuming. In this work we propose a new transfer learning method that computes personalized models without any labeled target data. Our approach learns multiple person-specific classifiers for a set of source subjects and then directly transfers knowledge about the parameters of these classifiers to the target individual. The transfer is achieved by learning a regression function that maps the data distribution associated with each source subject to the corresponding classifier's parameters. We tested our approach on two application domains, Action Unit (AU) detection and spontaneous pain recognition, using publicly available datasets, and show its advantages over the state of the art in terms of both accuracy and computational cost.
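The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-subject classifiers are plain closed-form ridge classifiers rather than SVMs, the distribution descriptor is just the mean feature vector (the paper maps full data distributions via a kernel), and all data is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, y, lam=1e-2):
    """Closed-form ridge regression: solves (X^T X + lam*I) w = X^T y.
    Works for vector or matrix targets y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def with_bias(X):
    """Append a constant column so the classifier learns an intercept."""
    return np.hstack([X, np.ones((X.shape[0], 1))])

# Hypothetical data: each "subject" is 40 samples of 5-dim features,
# with a binary label driven by the first feature plus noise.
def make_subject():
    X = rng.normal(size=(40, 5))
    y = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))
    return X, y

sources = [make_subject() for _ in range(15)]
X_target, y_target = make_subject()   # target labels used only for checking

# Step 1: learn one person-specific linear classifier per source subject.
W = np.array([ridge_fit(with_bias(X), y) for X, y in sources])   # (15, 6)

# Step 2: compute a distribution descriptor per source subject
# (here simply the mean feature vector, as a stand-in).
D = np.array([X.mean(axis=0) for X, _ in sources])               # (15, 5)

# Step 3: learn a regression from distribution descriptors to
# classifier parameters (the core of the parameter-transfer idea).
M = ridge_fit(with_bias(D), W)                                   # (6, 6)

# Step 4: transfer — map the unlabeled target's descriptor through the
# regression to obtain its personalized classifier, with no target labels.
w_t = (with_bias(X_target.mean(axis=0)[None, :]) @ M).ravel()
pred = np.sign(with_bias(X_target) @ w_t)
acc = (pred == y_target).mean()
print(pred.shape, acc > 0.5)
```

The key property this sketch shares with the paper's method is in Step 4: the target subject contributes only unlabeled data (its distribution descriptor), yet receives a personalized set of classifier parameters.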
Keywords
Facial Expression Recognition, Action Unit Detection, Transductive Transfer Learning, Learning from Distributions