EmoLeak: Smartphone Motions Reveal Emotions

2023 IEEE 43rd International Conference on Distributed Computing Systems (ICDCS)

Abstract
Emotional state leakage attracts increasing concern because it reveals rich sensitive information, such as intent, demographics, personality, and health conditions. Existing emotion recognition techniques rely on vision and audio data, which pose a limited threat because they require access to restricted sensors (e.g., cameras and microphones). In this work, we first investigate the feasibility of detecting a person's emotional state in the vibration domain via zero-permission motion sensors. We find that when voice is played through a smartphone's loudspeaker or ear speaker, it generates vibration signals on the smartphone surface that encode rich emotional information. As the smartphone is the go-to device for almost everyone nowadays, our attack, which relies only on motion sensors, raises severe concerns about emotional state leakage. We comprehensively study the relationship between vibration data and human emotion using several publicly available emotion datasets (e.g., SAVEE, TESS). Time-frequency features and machine learning techniques are developed to determine the victim's emotion from speech-induced vibrations. We evaluate our attack on both the ear speakers and loudspeakers of a diverse set of smartphones. The results demonstrate that our attack achieves high accuracy: around 95.3% (random guess 14.3%) in the loudspeaker setting and 60.52% (random guess 14.3%) in the ear speaker setting.
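
The abstract describes a pipeline of time-frequency features plus a machine-learning classifier over speech-induced vibration signals. The sketch below is a minimal, hypothetical illustration of such a pipeline, assuming a 7-class emotion set (matching the 14.3% random-guess baseline), accelerometer traces sampled at roughly 500 Hz, log-spectrogram statistics as features, and a random-forest classifier; the paper's actual features, model, and sensor sampling rate are not specified here.

# Hypothetical sketch: classify emotion from smartphone accelerometer traces
# captured while speech plays through the loudspeaker. Feature choice
# (log-spectrogram statistics) and classifier (random forest) are assumptions,
# not the paper's exact method.
import numpy as np
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Assumed 7 emotion classes, giving a 1/7 ~ 14.3% random-guess baseline.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]

def time_frequency_features(accel_z, fs=500):
    """Summarize one accelerometer trace (single axis, sampled at fs Hz)
    with simple log-spectrogram statistics per frequency bin."""
    f, t, Sxx = spectrogram(accel_z, fs=fs, nperseg=256, noverlap=192)
    log_S = np.log(Sxx + 1e-12)
    # Mean and standard deviation over time for each frequency bin.
    return np.concatenate([log_S.mean(axis=1), log_S.std(axis=1)])

def evaluate(traces, labels):
    """traces: list of 1-D accelerometer arrays; labels: emotion-class indices."""
    X = np.stack([time_frequency_features(x) for x in traces])
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    return cross_val_score(clf, X, labels, cv=5).mean()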
Keywords
emotion recognition,speech privacy,motion sensor,side channel