Classification of Five Emotions from EEG and Eye Movement Signals: Complementary Representation Properties

2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), 2019

Cited by 23
Abstract
Recently, various multimodal approaches have been developed to enhance the performance of affective models. In this paper, we investigate the complementary representation properties of EEG and eye movement signals for the classification of five human emotions: happy, sad, fear, disgust, and neutral. We compare the performance of single-modality approaches with two different modality fusion approaches. The results indicate that EEG is superior to eye movements in classifying happy, sad, and disgust emotions, whereas eye movements outperform EEG in recognizing fear and neutral emotions. Over all five emotions, EEG has the advantage, achieving a mean accuracy of 69.50% versus 59.81% for eye movements. Owing to the complementary representation properties of the two modalities, modality fusion with a bimodal deep auto-encoder significantly improves the classification accuracy, to 79.71%. Furthermore, we study the neural patterns of the five emotion states and the recognition performance of different eye movement features. The results reveal that the five emotions have distinguishable neural patterns and that pupil diameter has higher discrimination ability than the other eye movement features.
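The bimodal deep auto-encoder (BDAE) fusion mentioned above can be outlined as two modality-specific encoders whose hidden codes are merged into one shared representation, which is then decoded back to both modalities (and, in practice, fed to a downstream classifier). The following is a minimal forward-pass sketch of that architecture; the feature dimensions, layer sizes, and function names are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper's exact setup):
# EEG feature vector, eye-movement feature vector, per-modality hidden
# size, and the shared fused representation size.
D_EEG, D_EYE, D_HID, D_SHARED = 310, 33, 64, 32

def layer(d_in, d_out):
    """Randomly initialised affine layer: (weights, bias)."""
    return rng.standard_normal((d_in, d_out)) * 0.1, np.zeros(d_out)

def forward(x, params):
    """Affine transform followed by tanh non-linearity."""
    W, b = params
    return np.tanh(x @ W + b)

# Modality-specific encoders, a fusion layer over the concatenated
# hidden codes, and decoders reconstructing each modality: the
# bimodal auto-encoder pattern in outline.
enc_eeg, enc_eye = layer(D_EEG, D_HID), layer(D_EYE, D_HID)
fuse = layer(2 * D_HID, D_SHARED)
dec_eeg, dec_eye = layer(D_SHARED, D_EEG), layer(D_SHARED, D_EYE)

def bdae_forward(x_eeg, x_eye):
    """Encode both modalities, fuse, and reconstruct both inputs."""
    h = np.concatenate(
        [forward(x_eeg, enc_eeg), forward(x_eye, enc_eye)], axis=1
    )
    z = forward(h, fuse)  # shared multimodal representation
    return z, forward(z, dec_eeg), forward(z, dec_eye)

# A batch of 8 hypothetical samples per modality.
x_eeg = rng.standard_normal((8, D_EEG))
x_eye = rng.standard_normal((8, D_EYE))
z, recon_eeg, recon_eye = bdae_forward(x_eeg, x_eye)
print(z.shape, recon_eeg.shape, recon_eye.shape)
```

In a full pipeline the network would be trained to minimise the reconstruction error of both modalities, and the shared code `z` would serve as the fused feature for the emotion classifier; the weights here are random and untrained.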
Keywords
disgust emotions,eye movements outperform EEG,neutral emotions,complementary representation properties,classification accuracy,emotion states,recognition performance,multimodal approaches,eye movement signals,human emotions,single modality,modality fusion approaches,eye movement features