Exploring Eye-Gaze Wheelchair Control

ETRA '20: 2020 Symposium on Eye Tracking Research and Applications, Stuttgart, Germany, June 2020

Abstract
Eye gaze may be used for steering wheelchairs or robots and thereby support independence in choosing where to move. This paper investigates the feasibility of gaze-controlled interfaces. We present a wheelchair-control experiment in a simulated virtual reality (VR) driving environment and a field study with five people using wheelchairs. In the VR experiment, three control interfaces were tested by 18 able-bodied subjects: (i) dwell buttons for direction commands on an overlay display, (ii) steering by continuous gaze-point assessment on the ground plane in front of the driver, and (iii) waypoint navigation to targets placed on the ground plane. Results indicate that the waypoint method had superior performance and was also most preferred by the users, closely followed by the continuous-control interface. However, the field study revealed that our wheelchair users felt uncomfortable and excluded when they had to look down at the floor to steer a vehicle. Hence, our VR testing had a simplified representation of the steering task and ignored an important part of the use context. In the discussion, we suggest potential improvements to simulation-based design of wheelchair gaze-control interfaces.
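As a rough illustration of how a continuous-control interface like (ii) could translate a gaze point on the ground plane into motion, the sketch below maps a gaze hit-point (in the wheelchair's frame) to differential-drive velocity commands. This is a minimal sketch under stated assumptions: the function name, dead-zone radius, and speed limits are hypothetical and are not taken from the paper.

```python
import math

def gaze_point_to_velocity(gaze_x, gaze_y, max_linear=1.0, max_angular=1.5):
    """Map a gaze hit-point on the ground plane (x forward, y left, in metres,
    wheelchair frame) to (linear, angular) velocity commands.

    Hypothetical illustration only; the paper's actual controller and
    parameters are not specified in the abstract."""
    distance = math.hypot(gaze_x, gaze_y)
    if distance < 0.5:
        # Dead zone: gazing just in front of the chair stops it.
        return 0.0, 0.0
    heading = math.atan2(gaze_y, gaze_x)  # angle from the forward axis to the gazed point
    # Drive faster for farther gaze points, slower when turning sharply.
    linear = max_linear * min(distance / 3.0, 1.0) * math.cos(heading)
    angular = max_angular * (heading / (math.pi / 2))
    # Clamp to the vehicle's limits.
    linear = max(-max_linear, min(max_linear, linear))
    angular = max(-max_angular, min(max_angular, angular))
    return linear, angular

if __name__ == "__main__":
    # Gazing 2 m ahead and 0.5 m to the left yields a gentle forward-left command.
    print(gaze_point_to_velocity(2.0, 0.5))
```

In practice such a controller would also need gaze-data filtering and a safety layer (e.g. obstacle-based velocity limiting), which the sketch omits.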