Robust Hand Gestural Interaction for Smartphone Based AR/VR Applications

2017 IEEE Winter Conference on Applications of Computer Vision (WACV)

Abstract
The future of user interfaces will be dominated by hand gestures. In this paper, we explore an intuitive hand-gesture-based interaction for smartphones with limited computational capability. To this end, we present an efficient algorithm for gesture recognition from a First Person View (FPV), which focuses on recognizing a four-swipe model (Left, Right, Up, and Down) on smartphones using a single monocular camera. This can be used with frugal AR/VR devices such as Google Cardboard and Wearality in building AR/VR-based automation systems for large-scale deployments, providing a touch-less interface and real-time performance. We take into account multiple cues, including palm color, hand contour segmentation, and motion tracking, which together effectively deal with the FPV constraints imposed by a wearable. We also compare our swipe detection with existing methods under the same constraints, and demonstrate that our method outperforms them in both gesture recognition accuracy and computational time.
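To make the four-swipe model concrete, the sketch below illustrates one plausible pipeline consistent with the cues named in the abstract: segment the palm by a skin-color threshold in HSV space, track its centroid across frames, and classify the net displacement as one of the four swipes. This is an illustrative sketch, not the authors' implementation; the HSV bounds, the `min_disp` threshold, and all function names are assumptions.

```python
import numpy as np

def skin_mask(hsv):
    """Binary mask of skin-colored pixels in an HSV image.

    The bounds are illustrative, not the paper's calibrated values.
    """
    lower = np.array([0, 40, 60])
    upper = np.array([25, 180, 255])
    return np.all((hsv >= lower) & (hsv <= upper), axis=-1)

def palm_centroid(mask):
    """Centroid (x, y) of the mask, or None if no skin pixels were found."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def classify_swipe(centroids, min_disp=30.0):
    """Map the net centroid displacement over a frame window to a swipe.

    Returns "Left", "Right", "Up", "Down", or None if the motion is too
    small to count as a gesture (min_disp is in pixels, an assumed value).
    """
    (x0, y0), (x1, y1) = centroids[0], centroids[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_disp:
        return None
    if abs(dx) >= abs(dy):
        return "Right" if dx > 0 else "Left"
    return "Down" if dy > 0 else "Up"  # image y grows downward
```

In practice, a real-time FPV system would smooth the centroid track and gate on contour size before classifying, but the dominant-axis displacement test above captures the essence of a four-direction swipe model.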
Keywords
robust hand gestural interaction, smartphone, AR/VR applications, user interfaces, computational capability, gesture recognition, first person view, FPV, camera vision, touch-less interface, real-time performance, palm color, hand contour segmentation, motion tracking