Concurrent Crossmodal Feedback Assists Target-searching: Displaying Distance Information Through Visual, Auditory and Haptic Modalities

arXiv (2020)

Abstract
Humans' sense of distance depends on the integration of multisensory cues. Incoming visual luminance, auditory pitch, and tactile vibration can all contribute to distance judgement. This ability is enhanced when multimodal cues are associated in a congruent manner, a phenomenon referred to as crossmodal correspondence. In the context of multisensory interaction, whether and how such correspondences influence information processing under continuous motor engagement, particularly during target-searching activities, has rarely been investigated. This paper presents an experimental user study to address this question. We built a target-searching application on a tabletop, displayed unimodal and crossmodal distance cues concurrently in response to people's searching movements, and measured task performance through kinematic evaluation. We find that the crossmodal display and the audio display lead to improved searching efficiency and accuracy. More interestingly, this improvement is confirmed by kinematic analysis, which also unveils the underlying movement features that could account for it. We discuss how these findings could shed light on the design of assistive technology and other multisensory interaction.
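As a concrete illustration of the kind of congruent crossmodal display the abstract describes, the sketch below maps a normalized hand-to-target distance to three concurrent cue intensities. The specific ranges, the linear mapping, and all function and parameter names are assumptions for illustration, not the paper's actual design.

```python
def crossmodal_cues(distance: float, max_distance: float) -> dict:
    """Hypothetical congruent mapping: all three cues intensify
    as the searching hand approaches the target."""
    # Normalized proximity: 0.0 when far away, 1.0 when on the target.
    proximity = 1.0 - min(max(distance / max_distance, 0.0), 1.0)
    return {
        "luminance": proximity,                           # 0..1 visual brightness
        "pitch_hz": 220.0 + proximity * (880.0 - 220.0),  # rising tone, A3..A5
        "vibration": proximity,                           # 0..1 haptic amplitude
    }
```

Because all three channels grow together with proximity, the cues stay congruent in the sense of crossmodal correspondence; an incongruent display would invert one channel's direction relative to the others.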
Keywords
haptic modalities,distance information,auditory,visual,target-searching