Cyber-Physical Mobile Arm Gesture Recognition using Ultrasound and Motion Data

2020 IEEE Conference on Industrial Cyberphysical Systems (ICPS), 2020

Abstract
Alternative and more intuitive input methods for the control of consumer devices (e.g. tablets, smartphones and home automation systems), vehicle entertainment systems, and industrial machines and robots are on the rise. In this contribution, we propose a body-worn setup, which supplements the omnipresent 3 DoF motion sensors with a set of ultrasound transceivers. This allows for an active time-of-flight based measurement of distances between independent nodes on the upper limbs and torso without requiring calibration or remote reference sensors such as video cameras. A body-worn demonstrator is introduced, which consists of three independent but synchronized nodes, equipped with motion sensors and Capacitive Micromachined Ultrasonic Transceivers. Results of an initial test series using eight wide arm gestures show that combining ultrasound distance measurements and motion data improves the classification accuracy.
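The abstract does not include code; as an illustrative aside, the following Python sketch shows how time-of-flight readings between synchronized nodes might be converted to inter-node distances and fused at the feature level with motion-sensor data for gesture classification. All names (e.g. `tof_to_distance`, `SPEED_OF_SOUND`), the data shapes, and the choice of a k-nearest-neighbors classifier are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only; not the authors' implementation.
# Assumes one-way time-of-flight between synchronized nodes and a simple
# feature-level fusion of ultrasound distances with motion-sensor features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed constant)


def tof_to_distance(tof_seconds: np.ndarray) -> np.ndarray:
    """Convert one-way time-of-flight readings to inter-node distances (metres)."""
    return SPEED_OF_SOUND * tof_seconds


def fuse_features(distances: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Concatenate per-window ultrasound distance features with motion features."""
    return np.concatenate([distances, motion], axis=-1)


# Hypothetical training data: one row per gesture window.
rng = np.random.default_rng(0)
tof_train = rng.uniform(1e-4, 3e-3, size=(80, 3))  # 3 node pairs, seconds
imu_train = rng.normal(size=(80, 18))              # e.g. 3 nodes x 6 motion stats
labels = rng.integers(0, 8, size=80)               # eight gesture classes

X_train = fuse_features(tof_to_distance(tof_train), imu_train)
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, labels)

# Classify a new gesture window.
tof_new = rng.uniform(1e-4, 3e-3, size=(1, 3))
imu_new = rng.normal(size=(1, 18))
print(clf.predict(fuse_features(tof_to_distance(tof_new), imu_new)))
```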
Keywords
Gesture recognition, pattern recognition, sensor fusion, wearables, human machine interface