Measuring Social Modulation of Gaze in Autism Spectrum Condition With Virtual Reality Interviews

Saygin Artiran, Raghav Ravisankar, Sarah Luo, Leanne Chukoskie, Pamela Cosman

IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING (2022)

Abstract
Gaze behavior in dyadic conversations can indicate active listening and attention. However, gaze behavior that differs from the engagement cues expected in neurotypical social interaction may be interpreted as disinterest or inattention, which can be problematic in both personal and professional situations. Neurodivergent individuals, such as those with autism spectrum conditions, often exhibit differences in social communication, including in gaze behavior. This project aims to support situational social gaze practice through a virtual reality (VR) mock job interview using the HTC Vive Pro Eye VR headset. We show how gaze behavior in the mock job interview differs between neurodivergent and neurotypical participants, and we investigate the social modulation of gaze behavior by conversational role (speaking versus listening). Our three main contributions are: (i) a system for fully automatic analysis of social modulation of gaze behavior using a portable VR headset with a novel, realistic mock job interview; (ii) a signal processing pipeline, employing Kalman filtering and spatial-temporal density-based clustering, that improves the accuracy of the headset's built-in eye tracker; and (iii) the first investigation of social modulation of gaze behavior among neurotypical and neurodivergent individuals in immersive VR.
Keywords
Behavioral sciences, Interviews, Modulation, Headphones, Autism, Oral communication, Signal processing algorithms, Gaze behavior, job interview practice, signal processing, neurodivergence, virtual reality, social modulation