Crossmodal congruence: the look, feel and sound of touchscreen widgets.

ICMI-MLMI 2008

Abstract

Our research considers the following question: how can visual, audio and tactile feedback be combined in a congruent manner for touchscreen graphical widgets? For example, if a touchscreen display presents different styles of visual buttons, what should each of those buttons feel and sound like? This paper presents the results of an experiment investigating methods of congruently combining visual feedback with combined audio/tactile feedback by manipulating the parameters of each modality. The results indicate trends in which individual visual parameters, such as shape, size and height, combine congruently with audio/tactile parameters such as texture, duration and actuator technology. We draw further on the experiment's individual quality ratings to evaluate the perceived quality of our touchscreen buttons, and reveal a correlation between perceived quality and crossmodal congruence. The results of this research will enable mobile touchscreen UI designers to create realistic, congruent buttons by selecting the most appropriate audio and tactile counterparts for each visual button style.