Detecting Engagement in Egocentric Video

Lecture Notes in Computer Science (2016)

Cited by 64
Abstract
In a wearable camera video, we see what the camera wearer sees. While this makes it easy to know roughly what he chose to look at, it does not immediately reveal when he was engaged with the environment. Specifically, at what moments did his focus linger, as he paused to gather more information about something he saw? Knowing this answer would benefit various applications in video summarization and augmented reality, yet prior work focuses solely on the "what" question (estimating saliency, gaze) without considering the "when" (engagement). We propose a learning-based approach that uses long-term egomotion cues to detect engagement, specifically in browsing scenarios where one frequently takes in new visual information (e.g., shopping, touring). We introduce a large, richly annotated dataset for ego-engagement that is the first of its kind. Our approach outperforms a wide array of existing methods. We show engagement can be detected well independent of both scene appearance and the camera wearer's identity.
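The abstract does not spell out the model, so the following is only a minimal illustrative sketch of the general idea of classifying engagement from long-term egomotion cues: per-frame egomotion is approximated by dense optical-flow magnitude, summarized over a long sliding window, and fed to a simple classifier. The function names, the window size, the video path, and the placeholder labels are all assumptions for illustration, not the authors' actual method.

```python
# Illustrative sketch only; the paper's actual features and model differ.
# Assumes OpenCV (cv2), NumPy, and scikit-learn are installed.
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def egomotion_magnitudes(video_path):
    """Per-frame mean optical-flow magnitude as a crude egomotion cue."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    mags = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mags.append(np.linalg.norm(flow, axis=2).mean())  # mean motion per frame
        prev = gray
    cap.release()
    return np.array(mags)

def windowed_features(mags, win=60):
    """Long-term temporal context: flow statistics over a trailing window."""
    feats = []
    for t in range(len(mags)):
        w = mags[max(0, t - win):t + 1]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Hypothetical inputs: "walk_tour.mp4" and the labels are placeholders.
X = windowed_features(egomotion_magnitudes("walk_tour.mp4"))
y = np.random.randint(0, 2, len(X))  # stand-in for real engagement annotations
clf = LogisticRegression(max_iter=1000).fit(X, y)
engaged = clf.predict(X)  # per-frame engagement predictions
```

The intuition behind the windowed features matches the abstract's premise: moments when the wearer's focus lingers tend to show sustained low head/body motion, which long-term flow statistics can capture better than any single frame.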
Keywords
egocentric video