Asynchronous Motor Imagery BCI and LiDAR-Based Shared Control System for Intuitive Wheelchair Navigation

IEEE Sensors Journal (2023)

Abstract
Mapping drivers' thoughts directly to mobility system control would make driving more intuitive, as if the mobility system were an extension of their own body. Such a system would allow patients with motor disabilities to drive, as it would not require any physical movement. In this article, we therefore propose a brain-controlled mobility system that analyzes real-time neural signals elicited by motor imagery, the imagination of different body movements. Because such asynchronous brain-computer interfaces (BCIs) are prone to error, our system includes shared control capabilities that combine continuously updated information about the surrounding environment with electroencephalogram (EEG) signals to improve navigation performance without requiring precise and accurate control from the driver. With our shared control method, which uses a wheelchair equipped with light detection and ranging (LiDAR) and inertial measurement unit (IMU) sensors, we conducted a comparative study in which participants drove the wheelchair with and without shared control, using either our brain-controlled system or a keyboard, in a physical environment. The experimental results show that, among the five participants, the three who failed the driving task with the asynchronous BCI-based system alone could complete it successfully with our shared control approach. Furthermore, our approach narrows the gap between driving with neural signals and driving with a widely used interface in terms of both elapsed time and safety. These results demonstrate not only the potential of brain signals for driving but also the applicability of BCIs to real-life situations.
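To illustrate the shared control idea described above, the sketch below shows one minimal way a discrete motor-imagery command could be blended with a planar LiDAR scan before being sent to the wheelchair. This is not the paper's implementation; the function, parameter names (`safety_radius`, the 20° forward cone, the velocity values), and the veto-and-steer policy are illustrative assumptions only.

```python
import numpy as np


def shared_control(bci_command, lidar_ranges, angle_min, angle_increment,
                   safety_radius=0.6):
    """Blend a discrete BCI command with LiDAR obstacle information.

    bci_command      : "forward", "left", "right", or "stop", as decoded from
                       motor-imagery EEG (the decoding itself is out of scope).
    lidar_ranges     : 1-D array of range readings in metres from a planar LiDAR.
    angle_min,
    angle_increment  : scan geometry in radians.
    Returns (linear_velocity, angular_velocity) for the wheelchair base.
    """
    angles = angle_min + angle_increment * np.arange(len(lidar_ranges))

    # Check a forward-facing cone for obstacles closer than the safety radius.
    front = np.abs(angles) < np.deg2rad(20)
    blocked = np.any(lidar_ranges[front] < safety_radius)

    if bci_command == "forward":
        if blocked:
            # Veto the user's command and steer toward the side with more free space.
            left_clear = np.mean(lidar_ranges[angles > 0])
            right_clear = np.mean(lidar_ranges[angles < 0])
            return 0.0, 0.4 if left_clear > right_clear else -0.4
        return 0.3, 0.0
    if bci_command == "left":
        return 0.0, 0.4
    if bci_command == "right":
        return 0.0, -0.4
    return 0.0, 0.0  # "stop" or unrecognized command
```

In this sketch the error-prone BCI output is only ever overridden when the environment model indicates an imminent collision, which matches the general goal stated in the abstract: the driver retains intent-level control while the sensors handle precision.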
Keywords
Brain-computer interface (BCI), electroencephalogram (EEG), light detection and ranging (LiDAR), motor imagery, shared control