PEM360: A dataset of 360 videos with continuous Physiological measurements, subjective Emotional ratings and Motion traces

Proceedings of the ACM International Conference on Interactive Media Experiences Workshops (IMXW 2023)

Abstract
From a user perspective, immersive content can elicit more intense emotions than flat-screen presentations. From a system perspective, efficient storage and distribution remain challenging and must take user attention into account. Understanding the connection between user attention, user emotions, and immersive content is therefore key. In this work, we present the PEM360 dataset of user head movements and gaze recordings in 360° videos, along with self-reported emotional ratings of valence and arousal and continuous physiological measurements of electrodermal activity and heart rate. The stimuli are selected to enable spatiotemporal analysis of the connection between content, user motion, and emotion. We then present findings on the tripartite connection between user attention, user emotion, and visual content in immersive environments. This involves analyzing the low-level and high-level saliency of the video content in connection with the data on the user's state from the PEM360 dataset. This work was published and presented at the ACM MMSys 2022 Open Dataset and Software track [2] and at ICIP 2022 [1].