Estimation of FAPs and intensities of AUs based on real-time face tracking

FAA '12: Proceedings of the 3rd Symposium on Facial Analysis and Animation (2012)

Cited by 6
Abstract
Imitating natural facial behavior in real time remains challenging, particularly for behaviors such as laughter and other nonverbal expressions. This paper describes our ongoing work on methodologies and tools for estimating Facial Animation Parameters (FAPs) and intensities of Action Units (AUs) in order to imitate lifelike facial expressions with an MPEG-4 compliant Embodied Conversational Agent (ECA) -- the GRETA agent (Bevacqua et al. 2007). First, we investigate available open source tools for accurate facial landmark localization. Second, FAPs and AU intensities are estimated from facial landmarks computed with an open source face tracker. Finally, the paper discusses our ongoing work comparing FAP-based and AU-based re-synthesis technologies through perceptual studies of: (i) the naturalness of synthesized facial expressions; (ii) the similarity perceived by subjects when the synthesis is compared to the original user's behavior.
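The estimation step the abstract describes -- converting tracked landmark positions into MPEG-4 FAP values -- can be illustrated with a minimal sketch. In MPEG-4 facial animation, a FAP is expressed as a landmark displacement from the neutral face, normalized by a Facial Animation Parameter Unit (FAPU) derived from a face-specific distance (e.g. mouth-nose separation). The function names and the 1/1024 FAPU scaling shown here are an assumption for illustration, not the authors' implementation:

```python
import numpy as np

def estimate_fap(current, neutral, fapu_distance):
    """Illustrative sketch (not the paper's actual method):
    express a tracked landmark's displacement from the neutral
    face as an MPEG-4-style FAP value.

    current, neutral : (x, y) landmark positions in pixels
    fapu_distance    : reference face distance in pixels
                       (e.g. mouth-nose separation, MNS0)
    FAPUs are conventionally defined as the reference distance
    divided by 1024, so one FAP unit = fapu_distance / 1024.
    """
    displacement = np.asarray(current, dtype=float) - np.asarray(neutral, dtype=float)
    return 1024.0 * displacement / fapu_distance

# Hypothetical example: a lower-lip landmark moves 6 px downward,
# with a mouth-nose separation of 30 px on this face.
fap = estimate_fap(current=[100.0, 206.0], neutral=[100.0, 200.0],
                   fapu_distance=30.0)
# fap[1] is the vertical displacement in FAP units: 6 / 30 * 1024 = 204.8
```

Per-frame normalization by a FAPU measured on the tracked face is what makes the resulting FAP stream transferable to an agent with different facial proportions, which is the point of driving an MPEG-4 compliant ECA such as GRETA from tracker output.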
Keywords
AU-based synthesis technology, synthesized facial expression, natural facial behavior, facial landmark, facial landmark localization, real-time face tracking, lifelike facial expression, open source tools, natural behavior, ongoing work, open source face tracker