Understanding the Mechanism of Racial Bias in Predictive Risk Models of Child Welfare

Date
2021
Authors
Dinh, Huyen
Supervisor
Ryan, Matthew
Vaithianathan, Rhema
Item type
Dissertation
Degree name
Master of Business
Publisher
Auckland University of Technology
Abstract

Each year approximately 3.6 million children in the US are referred to Child Protective Services (CPS), yet despite this high level of surveillance, child maltreatment deaths have not fallen. Moreover, many children who are victims of abuse and neglect come to the attention of CPS only when it is too late, when earlier intervention might have helped them. Predictive Risk Modelling (PRM), a type of statistical algorithm that uses linked administrative data to predict the likelihood of future adverse events, offers one response to this problem. A PRM tool typically estimates a child's risk of abuse and neglect at the time of birth, and its predictions are then used to support decision-making about connecting families to prevention services before incidents of abuse and neglect occur. However, there are growing concerns about racial disparity in the use of PRM in the child maltreatment context: whether it will reproduce, or even exacerbate, human bias. This study focuses on understanding one of the causes of machine bias, namely measurement error or target variable bias. In particular, the research investigates whether the use of a proxy variable, which is foster care placement in our context, can potentially lead to racial disparity in child maltreatment predictions.
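
To make the mechanism concrete, the following minimal Python sketch (not from the dissertation; the variable names, rates, and use of scikit-learn are all illustrative assumptions) simulates how a proxy label that is recorded at different rates across groups can lead a model to assign different risks to groups whose underlying need is identical by construction.

```python
# Hypothetical sketch of proxy (target variable) bias.
# All rates and names here are illustrative assumptions,
# not figures or methods from the dissertation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100_000

# Group membership (0/1) and a true underlying need that is,
# by construction, identical across groups.
group = rng.integers(0, 2, n)
true_need = rng.binomial(1, 0.10, n)

# Proxy label: foster care placement. Assume group 1 is placed at a
# higher rate *conditional on the same true need* (surveillance bias).
placement_rate = np.where(group == 1, 0.60, 0.30)
proxy_label = rng.binomial(1, true_need * placement_rate)

# A model trained on the proxy label learns group as a "risk factor".
X = group.reshape(-1, 1)
model = LogisticRegression().fit(X, proxy_label)
pred = model.predict_proba(X)[:, 1]

for g in (0, 1):
    print(f"group {g}: true need {true_need[group == g].mean():.3f}, "
          f"predicted risk {pred[group == g].mean():.3f}")
```

Because group 1's need is observed (via placement) roughly twice as often as group 0's in this simulation, the model trained on the proxy scores group 1 about twice as risky, even though the true underlying need is the same for both groups.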

Keywords
Machine bias, Racial bias, Machine learning, Predictive risk modelling, Child welfare, Proxy variable bias, Measurement error in proxy variable