A.Eye Drive: Gaze-based semi-autonomous wheelchair interface

2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2019

Cited by 15
Abstract
Existing wheelchair control interfaces, such as sip-and-puff or screen-based gaze-controlled cursors, are challenging for severely disabled users to operate safely and independently, because the user must continuously interact with the interface while navigating. This imposes a significant cognitive load and prevents users from engaging with the environment in other ways during navigation. We combine eye-tracking and gaze-contingent intention decoding with context-aware computer vision and autonomous navigation algorithms drawn from self-driving vehicles to allow paralysed users to drive by eye, simply by decoding natural gaze about where the user wants to go: A.Eye Drive. Our "Zero UI" driving platform lets users look at and visually interact with an object or destination of interest in their visual scene; the wheelchair then autonomously takes the user to the intended destination, continuously updating the computed path to account for static and dynamic obstacles. This intention-decoding technology empowers end-users by promising greater independence through their own agency.
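The abstract describes two coupled steps: decoding an intended destination from natural gaze, and autonomously replanning a collision-free path as obstacles change. A minimal sketch of that pipeline, on a toy grid world, might look as follows. All names, thresholds, and the dwell-based decoder here are illustrative assumptions, not the authors' actual implementation.

```python
# Illustrative sketch only: dwell-based gaze target decoding plus
# grid replanning. Thresholds and grid model are assumptions, not
# the paper's method.
from collections import deque

DWELL_THRESHOLD = 5  # consecutive samples on one cell treated as an intentional fixation (assumed)

def decode_target(gaze_samples):
    """Return the first grid cell fixated for DWELL_THRESHOLD consecutive samples, else None."""
    run_cell, run_len = None, 0
    for cell in gaze_samples:
        run_cell, run_len = (cell, run_len + 1) if cell == run_cell else (cell, 1)
        if run_len >= DWELL_THRESHOLD:
            return run_cell
    return None

def plan_path(start, goal, obstacles, size):
    """Breadth-first search on a size x size grid; re-run whenever `obstacles` updates."""
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < size and 0 <= ny < size and nxt not in seen and nxt not in obstacles:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # no collision-free path to the decoded target

# Usage: a noisy gaze trace that settles on cell (4, 4)
gaze = [(1, 2), (3, 3)] + [(4, 4)] * 5
target = decode_target(gaze)
path = plan_path((0, 0), target, obstacles={(1, 1), (2, 2)}, size=5)
```

In the real system, the decoded target would come from 3D gaze estimation over a camera scene rather than a grid, and replanning would run continuously as the obstacle map updates, but the decode-then-replan loop is the same shape.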
Keywords
Algorithms,Disabled Persons,Equipment Design,Fixation, Ocular,Humans,Robotics,User-Computer Interface,Wheelchairs