vTrack: Envisioning a Virtual Trackpad Interface through mm-level Sound Source Localization for Mobile Interaction

MobiCASE (2016)

Abstract
Touchscreens on mobile devices allow intuitive interaction through haptic communication, but their limited workspace confines the user experience. In this paper, we envision a virtual trackpad interface that tracks user input on any surface near the mobile device. We adopt acoustic signals as the sole medium for the interaction, which can be handled with lightweight signal processing using the inexpensive sensors already found on mobile devices. In our vTrack prototype, the peripheral device simply emits inaudible acoustic signals through a loudspeaker, while the receiving device performs sound source localization using a multi-channel microphone array. We build a fingerprint-based localization model from several cues, such as the time difference of arrival, the angle of arrival, and the power spectral density of the audio signal. The vTrack system also integrates the frequency difference of arrival induced by the Doppler shift to track the sound source in motion. Finally, the position estimates are fed into an extended Kalman filter to reduce errors and smooth the output. We implement our system on Android devices and validate its feasibility. Our extensive experiments show that vTrack achieves millimeter-level accuracy in the moving-sound-source scenario.
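The abstract names standard signal-processing building blocks rather than spelling out the estimators. As a rough illustration (not the paper's actual implementation; function names and parameters are ours), the time difference of arrival between two microphone channels is commonly estimated from the cross-correlation peak, and the Doppler-shifted frequency of a moving source follows the usual kinematic relation:

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Estimate the time difference of arrival (seconds) between two
    microphone channels via the cross-correlation peak. A positive
    result means sig_a is a delayed copy of sig_b. This is a generic
    TDOA baseline, not necessarily vTrack's exact estimator."""
    cc = np.correlate(sig_a, sig_b, mode="full")
    # Index len(sig_b) - 1 corresponds to zero lag.
    lag = np.argmax(cc) - (len(sig_b) - 1)
    return lag / fs

def doppler_shift(f_emit, v_radial, c=343.0):
    """Frequency observed at a fixed receiver when the source moves
    toward it at radial speed v_radial (m/s); c is the speed of
    sound in air."""
    return f_emit * c / (c - v_radial)
```

With an inaudible ~20 kHz carrier, even a slow fingertip motion of a few cm/s produces a shift of a few hertz, which is what the frequency-difference-of-arrival cue exploits; the resulting position estimates would then be smoothed by the extended Kalman filter mentioned above.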