HaptMR: Smart Haptic Feedback for Mixed Reality Based on Computer Vision Semantic.

HCI (9)(2021)

Abstract
This paper focuses on tactile feedback based on semantic analysis with deep learning algorithms on a mobile Mixed Reality (MR) device; we call the system HaptMR. In this way, we improve MR's immersive experience and achieve better interaction between the user and real/virtual objects. Building on the Mixed Reality device HoloLens 2nd generation (HL2), we realize a haptic feedback system that combines the hand tracking system of HL2 with fine haptic modules worn on the hands. Furthermore, we adapt the deep learning model Inception V3 to recognize the rigidity of objects. According to the scene's semantic analysis, when users make gestures or actions, their hands receive force feedback similar to a real haptic sense. We conduct a within-subject user study to test the feasibility and usability of HaptMR. In the user study, we design two tasks, covering hand tracking and spatial awareness, and then evaluate the objective interaction experience (Interaction Accuracy, Algorithm Accuracy, Temporal Efficiency) and the subjective MR experience (Intuitiveness, Engagement, Satisfaction). After visualizing and analyzing the results of the user study, we conclude that the HaptMR system improves the immersive experience in MR. With HaptMR, we can compensate for the sense of inauthenticity that MR produces for users. HaptMR can support applications in industrial use, spatial anchoring, virtual barriers, and 3D semantic interpretation, and serve as a foundation for other implementations.
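To illustrate the kind of adaptation the abstract describes, the sketch below shows a common transfer-learning setup in which an ImageNet-pretrained Inception V3 backbone is reused as a feature extractor and a small classification head is trained to predict an object's rigidity class from an image. The class count, input size, layer sizes, and dataset names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed setup): Inception V3 backbone + small head for
# rigidity classification. Not the authors' code; shapes and class names
# are placeholders.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_RIGIDITY_CLASSES = 2  # assumption: e.g. "rigid" vs. "soft"

# Load the ImageNet-pretrained backbone without its original classifier.
base = InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False  # freeze the backbone; train only the new head first

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_RIGIDITY_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown here
```

After the head converges, the backbone's top layers are often unfrozen and fine-tuned with a small learning rate; the predicted rigidity class can then drive the intensity of the haptic modules' force feedback.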
Keywords
Mixed reality, Interacted force feedback, Deep learning for computer vision