Gender Biases in Tone Analysis: A Case Study of a Commercial Wearable.

EAAMO (2023)

Abstract
In addition to being a health and fitness band, the Amazon Halo offers users information about how their voices sound, i.e., their ‘tones’. The Halo’s tone analysis capability leverages machine learning, which can lead to potentially biased inferences. We develop an auditing framework to evaluate the Amazon Halo’s tone analysis capabilities for gender biases. Our results show that the Halo exhibits statistically significant gender biases when the same emotion is conveyed by professional female and male actors through their recorded voices. For example, we find that over 75% of the words the Halo uses to describe men’s voices are positive, whereas fewer than 50% of the words it uses to describe women’s voices are positive. The Halo describes women as ‘angry’, ‘disappointed’, ‘uncomfortable’, and ‘annoyed’ (adjectives with negative valence) more often than men. The Halo describes men as ‘knowledgeable’, ‘confident’, and ‘focused’ (adjectives with positive valence) more often than women. Overall, our findings underscore that even commercially deployed ML models intended for day-to-day consumer use can exhibit strong biases.
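The abstract reports a statistically significant gap in the share of positive descriptors across genders, though the specific test is not stated here. As a minimal sketch, one common way to test such a gap is a two-proportion z-test; the descriptor counts below are purely illustrative placeholders, not the paper's data.

```python
import math

def two_proportion_z_test(pos_a, n_a, pos_b, n_b):
    """Two-sided z-test for a difference in proportions.

    pos_a / n_a: positive descriptors out of total descriptors for group A.
    pos_b / n_b: same for group B.
    Returns the z statistic and a two-sided p-value.
    """
    p_a, p_b = pos_a / n_a, pos_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (pos_a + pos_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 80/100 positive descriptors for men,
# 45/100 for women (illustrative only, consistent in spirit
# with the >75% vs <50% figures in the abstract).
z, p = two_proportion_z_test(80, 100, 45, 100)
print(f"z = {z:.2f}, p = {p:.2e}")
```

With these illustrative counts the gap is large relative to its standard error, so the p-value falls well below conventional significance thresholds.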