vTrack: Virtual Trackpad Interface Using mm-Level Sound Source Localization for Mobile Interaction

UbiComp '16: The 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Heidelberg, Germany, September 2016

Abstract
Touchscreens on mobile devices allow intuitive interaction through haptic communication, but their limited workspace constrains the user experience. In this extended abstract, we introduce vTrack, a virtual trackpad interface that tracks user input on any surface near the mobile device and extends the range of interaction beyond the touchscreen. The system adopts the acoustic signal as the main medium for interaction, which can be handled without expensive sensors or additional resources on mobile devices. By leveraging a multi-channel microphone array on the receiving device, we build a fingerprint-based localization model using various cues, such as the time difference of arrival, angle of arrival, and spectral power level of the audio signal. The technique integrates the frequency difference of arrival induced by the Doppler shift to track the sound source in motion. Our experiments show that vTrack achieves millimeter-level granularity: an average localization error of 1.5 mm in the moving sound source scenario.
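The abstract does not detail how the localization fingerprint is computed, so the sketch below is only an illustration of the general ingredients it names, not the authors' vTrack implementation: a GCC-PHAT time-difference-of-arrival estimate between microphone pairs, a per-channel spectral power term, and a nearest-neighbor lookup against a pre-recorded fingerprint database. The function names, the fingerprint layout, and the omission of the angle-of-arrival and Doppler (FDoA) cues are all assumptions made for brevity.

```python
import numpy as np

def gcc_phat_tdoa(sig_a, sig_b, fs, max_tau=None):
    """Estimate the time difference of arrival between two microphone
    channels using GCC-PHAT cross-correlation (illustrative, not vTrack's code)."""
    n = sig_a.size + sig_b.size
    A = np.fft.rfft(sig_a, n=n)
    B = np.fft.rfft(sig_b, n=n)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12                # PHAT weighting: keep phase information only
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs                     # TDoA in seconds

def build_fingerprint(frame, fs, mic_pairs):
    """Hypothetical fingerprint: one TDoA per microphone pair plus the
    per-channel spectral power level in dB. `frame` is (num_mics, num_samples)."""
    tdoas = [gcc_phat_tdoa(frame[i], frame[j], fs) for i, j in mic_pairs]
    power = 10 * np.log10(np.mean(frame ** 2, axis=1) + 1e-12)
    return np.concatenate([tdoas, power])

def localize(query_fp, db_fps, db_positions):
    """Nearest-neighbor match of a query fingerprint against a database of
    fingerprints recorded at known surface positions."""
    dists = np.linalg.norm(db_fps - query_fp, axis=1)
    return db_positions[np.argmin(dists)]
```

In such a fingerprint-based scheme, localization accuracy depends on how densely the surface positions are sampled when the database is built; the paper's reported 1.5 mm average error for a moving source additionally relies on the Doppler-based FDoA cue, which is not modeled in this sketch.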
Keywords
Mobile Systems, Interaction Interface, Sound Source Localization