Multimodal affect recognition in spontaneous HCI environment

Signal Processing, Communication and Computing (2012)

Cited by 16
Abstract
Human Computer Interaction (HCI) is known to be a multimodal process. In this paper we present results of affect recognition experiments on non-acted, affective multimodal data from the new Last Minute Corpus (LMC). This corpus is closer to real HCI applications than other known data sets, in which affective behavior is elicited in ways untypical for HCI. We utilize features from three modalities: facial expressions, prosody and gesture. The results show that even simple fusion architectures can reach respectable results compared to other approaches. Furthermore, we could show that probably not all features and modalities contribute substantially to the classification process; prosody and eye blink frequency appear to contribute most in the analyzed dataset.
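The "simple fusion architectures" mentioned in the abstract can be illustrated with a weighted late-fusion scheme, where each modality's classifier produces class scores that are combined into one decision. The scores, weights, and two-class setup below are purely hypothetical placeholders; the paper's actual features, classifiers, and weighting are not reproduced here.

```python
import numpy as np

# Hypothetical per-modality posterior scores for one sample over two
# affect classes; the paper's real classifiers are not shown here.
scores = {
    "face":    np.array([0.60, 0.40]),
    "prosody": np.array([0.80, 0.20]),
    "gesture": np.array([0.55, 0.45]),
}

# Illustrative modality weights; prosody is weighted higher, echoing the
# abstract's observation that prosody contributes most in this dataset.
weights = {"face": 1.0, "prosody": 2.0, "gesture": 0.5}

def late_fusion(scores, weights):
    """Weighted-sum late fusion: combine per-modality class scores
    into a single normalized score vector."""
    total = sum(weights[m] * s for m, s in scores.items())
    return total / sum(weights.values())

fused = late_fusion(scores, weights)
predicted_class = int(np.argmax(fused))  # index of the winning class
```

Late fusion of this kind keeps each modality's classifier independent, so a weak or missing modality degrades the decision gracefully rather than breaking a joint feature vector.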
Keywords
gesture recognition,human computer interaction,affective multimodal data,classification process,eye blink frequency,facial expression,last minute corpus,multimodal affect recognition,nonacted multimodal data,prosody recognition,spontaneous hci environment,affect recognition,hci,multimodal