Exploring Multimodal Biosignal Features For Stress Detection During Indoor Mobility

ICMI-MLMI (2016)

Cited 43 | Viewed 63
Abstract
This paper presents a multimodal framework for assessing the emotional and cognitive experience of blind and visually impaired people when navigating in unfamiliar indoor environments based on mobile monitoring and fusion of electroencephalography (EEG) and electrodermal activity (EDA) signals. The overall goal is to understand which environmental factors increase stress and cognitive load in order to help design emotionally intelligent mobility technologies that are able to adapt to stressful environments from real-time biosensor data. We propose a model based on a random forest classifier which successfully infers in an automatic way (weighted AUROC 79.3%) the correct environment among five predefined categories expressing generic everyday situations of varying complexity and difficulty, where different levels of stress are likely to occur. Time-locating the most predictive multimodal features that relate to cognitive load and stress, we provide further insights into the relationship of specific biomarkers with the environmental/situational factors that evoked them.
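The abstract describes training a random forest classifier on fused EEG/EDA features to infer one of five environment categories, evaluated by weighted AUROC. A minimal sketch of that setup, using scikit-learn and synthetic data in place of the authors' real biosignal features (the feature counts and separability here are purely hypothetical, not taken from the paper):

```python
# Illustrative sketch only, not the authors' pipeline: five-class
# environment prediction from fused EEG/EDA feature vectors with a
# random forest, scored by weighted AUROC as reported in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_eeg, n_eda = 500, 10, 4   # hypothetical feature counts
X = rng.normal(size=(n_samples, n_eeg + n_eda))  # fused feature vectors
y = rng.integers(0, 5, size=n_samples)           # five environment labels
X[:, 0] += 0.8 * y  # inject weak class signal so the demo is non-trivial

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)

# One-vs-rest AUROC, weighted by class support (the paper's metric family)
auroc = roc_auc_score(y_te, proba, multi_class="ovr", average="weighted")
print(f"weighted AUROC: {auroc:.3f}")
```

Random forests also expose `feature_importances_`, which is one plausible route to the kind of "most predictive multimodal features" analysis the abstract mentions, though the paper's exact feature-attribution method is not stated here.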
Keywords
Multimodal classification, data fusion, multimodal interaction, biosignals, mobile EEG, electrodermal activity, affective computing, visually impaired mobility