"Garbage In, Garbage Out": Mitigating Human Biases in Data Entry by Means of Artificial Intelligence.

INTERACT (3) (2023)

Abstract
Current HCI research often focuses on mitigating algorithmic biases. While such algorithmic fairness during model training is worthwhile, we see fit to mitigate human cognitive biases earlier, namely during data entry. We developed a conversational agent with voice-based data entry and visualization to support financial consultations, which are human-human settings with information asymmetries. In a pre-study, we reveal data-entry biases in advisors by a quantitative analysis of 5 advisors consulting 15 clients in total. Our main study evaluates the conversational agent with 12 advisors and 24 clients. A thematic analysis of interviews shows that advisors introduce biases by “feeling” and “forgetting” data. Additionally, the conversational agent makes financial consultations more transparent and automates data entry. These findings may be transferred to various dyads, such as doctor visits. Finally, we stress that AI not only poses a risk of becoming a mirror of human biases but also has the potential to intervene in the early stages of data entry.
Keywords
human biases, data entry, artificial intelligence