Extended Analysis of "How Child Welfare Workers Reduce Racial Disparities in Algorithmic Decisions"

Conference on Human Factors in Computing Systems (2022)

Abstract
This is an extended analysis of our paper "How Child Welfare Workers Reduce Racial Disparities in Algorithmic Decisions," which examines racial disparities in the Allegheny Family Screening Tool (AFST), an algorithm used to help child welfare workers decide which families the Allegheny County child welfare agency (CYF) should investigate. On April 27, 2022, Allegheny County CYF sent us an updated dataset and updated pre-processing steps. In this extended analysis, we report the results of re-running all quantitative analyses in our paper with the new data and pre-processing. We find that the main findings of our paper are robust to these changes: in particular, the AFST on its own would have made more racially disparate decisions than workers did, and workers used the tool to decrease those algorithmic disparities. Some minor results changed, including a slight increase in the screen-in rate from before to after the implementation of the AFST reported in our paper.
Keywords
human-centered AI, machine learning, algorithmic biases, algorithm-assisted decision-making, child welfare