Exploring Eye-Tracking-Driven Sonification for the Visually Impaired

AH 2016

Abstract
Most existing sonification approaches for the visually impaired restrict the user to the perception of static scenes by performing sequential scans and transformations of visual information into acoustic signals. This takes away the user's freedom to explore the environment and to decide which information is relevant at a given point in time. As a solution, we propose an eye-tracking system that allows the user to choose which elements of the field of view should be sonified. More specifically, we enhance the sonification approaches for color, text, and facial expressions with eye-tracking mechanisms. To find out how visually impaired people might react to such a system, we applied a user-centered design approach. Finally, we explored the effectiveness of our concept in a user study with seven visually impaired persons. The results show that eye tracking is a very promising input method for controlling the sonification, but the large variety of visual impairment conditions restricts the applicability of the technology.
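The color sonification the abstract mentions could work by sampling the color under the current gaze point and mapping it to a tone. The sketch below is a minimal illustration of one plausible mapping (hue to pitch); the function name, parameter choices, and frequency range are assumptions for illustration, not taken from the paper.

```python
import colorsys

def gaze_color_to_pitch(rgb, f_min=220.0, f_max=880.0):
    """Map the RGB color under the gaze point to a tone frequency in Hz.

    Hue (0..1) is mapped linearly onto [f_min, f_max]. In a fuller
    system, lightness or saturation could drive loudness or timbre.
    All choices here are illustrative, not from the paper.
    """
    r, g, b = (c / 255.0 for c in rgb)
    hue, _lightness, _saturation = colorsys.rgb_to_hls(r, g, b)
    return f_min + hue * (f_max - f_min)

# Pure red (hue 0) maps to the lowest tone in the range:
print(gaze_color_to_pitch((255, 0, 0)))  # 220.0
```

In an eye-tracking setting, the `rgb` argument would come from the camera pixel (or a small averaged patch) at the estimated gaze position, so the user hears a tone only for what they are currently looking at.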
Keywords
Sonification, Eye Tracking, Visually Impaired, Sound Synthesis, Signal Processing