EarBender: Enabling Rich IMU-based Natural Hand-to-Ear Interaction in Commodity Earables

UbiComp/ISWC '23 Adjunct: Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing (2023)

Abstract
Earables have been gaining popularity over the past few years for their ease of use and convenience compared to wired earphones. However, modern earables usually offer a limited interface, inhibiting their potential as an accessible input medium. To this end, we present EarBender: a real-time, ear-based system that bridges the gap between earables and on-body interaction, providing a more diverse and natural form of interaction with devices. EarBender enables touch-based hand-to-ear gestures on mobile devices by leveraging the inertial sensors in commercially available earable devices. Our proposed system detects the slight deformation of a user's ear caused by different ear-based actions, including swiping and tapping, and classifies the action performed. EarBender is designed to be energy-efficient, easy to deploy, and robust across users, requiring little to no calibration. We implement a prototype of EarBender using eSense, a multi-sensory earable platform, and evaluate it under different scenarios and parameter settings. Results show that the system detects the occurrence of gestures with 96.8% accuracy and classifies seven different hand-to-ear gestures with up to 97.4% accuracy, maintained across four subjects.
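The abstract does not describe the processing pipeline in detail, so the following is only a minimal sketch of how an IMU-based detect-then-classify system of this kind might be structured: an energy threshold over windowed accelerometer magnitude flags candidate gesture windows, and a lightweight classifier labels each detected window. The window size, threshold, feature set, and the random-forest choice are all illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 50           # samples per window (assumed ~0.5 s at a 100 Hz IMU rate)
ENERGY_THRESH = 0.15  # hypothetical activation threshold, tuned on training data

def detect_gesture_windows(accel, window=WINDOW, thresh=ENERGY_THRESH):
    """Flag windows whose mean-removed acceleration energy crosses a threshold."""
    mag = np.linalg.norm(accel, axis=1)   # magnitude of each 3-axis sample
    mag = mag - mag.mean()                # crude gravity/offset removal
    n = len(mag) // window
    energy = (mag[: n * window].reshape(n, window) ** 2).mean(axis=1)
    return np.where(energy > thresh)[0]   # indices of "gesture present" windows

def extract_features(accel_win, gyro_win):
    """Per-axis mean, std, and peak-to-peak over one detected window."""
    feats = []
    for sig in (accel_win, gyro_win):     # each: (window, 3) array
        feats += [sig.mean(axis=0), sig.std(axis=0), np.ptp(sig, axis=0)]
    return np.concatenate(feats)          # 18-dimensional feature vector

# A lightweight classifier trained offline on labeled windows, with labels
# such as "tap", "swipe_up", "swipe_down" from segmented IMU recordings:
clf = RandomForestClassifier(n_estimators=50, random_state=0)
# clf.fit(X_train, y_train)                          # labeled training windows
# label = clf.predict(extract_features(a, g)[None])  # classify one new window
```

Statistical features over short windows keep the per-gesture compute small, which is consistent with the paper's stated goal of an energy-efficient system running against commodity earable sensors.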