Real-time classification of evoked emotions using facial feature tracking and physiological responses

International Journal of Human-Computer Studies (2008)

Abstract
We present automated, real-time models built with machine learning algorithms which use videotapes of subjects' faces in conjunction with physiological measurements to predict rated emotion (trained coders' second-by-second assessments of sadness or amusement). Input consisted of videotapes of 41 subjects watching emotionally evocative films along with measures of their cardiovascular activity, somatic activity, and electrodermal responding. We built algorithms based on points extracted from the subjects' faces as well as their physiological responses. Strengths of the current approach are that (1) we assess real behavior of subjects watching emotional videos rather than actors making facial poses, (2) the training data allow us to predict both emotion type (amusement versus sadness) and the intensity level of each emotion, and (3) we provide a direct comparison among person-specific, gender-specific, and general models. Results demonstrated good fits for the models overall, with better performance for emotion categories than for emotion intensity, for amusement ratings than sadness ratings, for a full model using both physiological measures and facial tracking than for either cue alone, and for person-specific models than for gender-specific or general models.
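The abstract's "full model" combines facial feature tracking with physiological measures into a single classifier. The sketch below is a hypothetical illustration (not the authors' code): it uses synthetic placeholder data, an arbitrary feature layout, and a generic scikit-learn classifier to show how per-second facial points and physiological signals might be concatenated and used to predict a coder-rated emotion category.

```python
# Hypothetical sketch, not the paper's implementation: a "full model" that
# concatenates facial feature points and physiological measures per second
# and classifies emotion category (amusement vs. sadness vs. neutral).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: in the paper, facial features are tracked points from
# video, and physiological features include cardiovascular activity, somatic
# activity, and electrodermal responding, sampled second by second.
n_seconds = 1000
facial_feats = rng.normal(size=(n_seconds, 20))   # assumed: tracked point coordinates
physio_feats = rng.normal(size=(n_seconds, 4))    # assumed: heart rate, skin conductance, etc.
labels = rng.integers(0, 3, size=n_seconds)       # 0=neutral, 1=amusement, 2=sadness (coder ratings)

# Full model: one feature vector per second combining both cue types.
X = np.hstack([facial_feats, physio_feats])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Person-specific, gender-specific, or general models, as compared in the paper, would correspond to training this kind of classifier on data from one subject, one gender group, or all subjects, respectively.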
Keywords
general model,cardiovascular activity,evoked emotion,physiological measure,physiological measurement,amusement rating,real-time classification,emotion category,emotion type,sadness rating,emotion intensity,physiological response,facial feature tracking,affective computing,machine learning,emotion,computer vision,real time