Human decision-making biases in the moral dilemmas of autonomous vehicles

Scientific Reports (2019)

Abstract
The development of artificial intelligence has led researchers to study the ethical principles that should guide machine behavior. The challenge in building machine morality based on people's moral decisions, however, is accounting for the biases in human moral decision-making. In seven studies, this paper investigates how people's personal perspectives and decision-making modes affect their decisions in the moral dilemmas faced by autonomous vehicles. Moreover, it determines the variations in people's moral decisions that can be attributed to the situational factors of the dilemmas. The reported studies demonstrate that people's moral decisions, regardless of the presented dilemma, are biased by their decision-making mode and personal perspective. Under intuitive moral decision-making, participants shift towards a more deontological doctrine, sacrificing the passenger instead of the pedestrian. In addition, once a personal perspective is made salient, participants preserve the lives associated with that perspective: those taking the passenger's perspective shift towards sacrificing the pedestrian, and vice versa. These biases in people's moral decisions underline the social challenge in designing a universal moral code for autonomous vehicles. We discuss the implications of our findings and provide directions for future research.
Keywords
Human behaviour, Social behaviour, Science, Humanities and Social Sciences, multidisciplinary