Neural correlates of affective content: application to perceptual tagging of video

Neural Computing & Applications (2021)

Abstract
In recent years, a digital multimedia upsurge has been experienced in every walk of life, and the resulting un-annotated, unstructured multimedia content has remained a key research issue. Multimedia content is usually created with intended emotions that the creator wants to induce in viewers, and its affectiveness can be measured by analyzing the emotions elicited in those viewers. In this paper, we present a rigorous study of human cognition using EEG signals recorded while watching videos, in order to analyze the affectiveness of video content. The analysis is carried out to establish an effective relationship between video content and the viewer's emotional state. To this end, the most effective scalp locations and frequency ranges are identified for two categories of videos, i.e., excited and sad. Furthermore, a common affective response (CAR) is extracted to find distinguishing features for the aforementioned video categories. The CAR is computed and tested on the publicly available dataset "AMIGOS," and the results presented here show the utility of cognitive features at the extracted scalp locations and frequency ranges for automatic tagging of video content. This work explores the novel applicability of neuro-signals for mouse-free video tagging based on the viewer's excitement level, augmenting a range of brain–computer interface (BCI)-based devices. It can further aid the automatic retrieval of video content that viewers find exciting and interesting. With this analysis, we aim to provide a thorough basis for customizing a low-cost, mobile EEG system for automatic analysis and retrieval of videos.
Keywords
Affective content, Video tagging, EEG signals, Brain-computer interface (BCI)
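
The pipeline summarized in the abstract (band-limited EEG features from selected scalp locations, used to tag videos as excited or sad) can be illustrated with a simple band-power sketch. The code below is only an illustration of that idea, not the authors' implementation: it assumes AMIGOS-style preprocessed EEG (14 channels at 128 Hz), and the frequency bands, epoch length, and logistic-regression classifier are placeholder choices.

```python
# Minimal sketch (not the authors' method): log band-power features from EEG epochs
# for excited-vs-sad video tagging. Sampling rate, channel count, bands, and the
# classifier are assumptions for illustration only.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 128                     # assumed sampling rate of preprocessed AMIGOS EEG (Hz)
BANDS = {                    # assumed frequency ranges of interest
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta":  (13, 30),
    "gamma": (30, 45),
}

def band_powers(eeg, fs=FS):
    """eeg: (channels, samples) array -> flat vector of log band powers per channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(np.log(psd[:, idx].mean(axis=-1) + 1e-12))
    return np.concatenate(feats)

# Toy usage with synthetic data standing in for per-trial EEG epochs.
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 14, FS * 20
X = np.stack([band_powers(rng.standard_normal((n_channels, n_samples)))
              for _ in range(n_trials)])
y = rng.integers(0, 2, size=n_trials)        # 0 = sad, 1 = excited (placeholder labels)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In practice, the feature vector would be restricted to the scalp locations and frequency ranges the paper identifies as most effective, and the CAR would be derived from responses aggregated across viewers rather than from single trials.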