Why Understanding Racial Bias is Crucial for the Responsible Use of Predictive Analytics

As big data tools like predictive analytics become more prevalent, child welfare agencies must grapple with implicit racial bias if they want to ensure that it does not cause harm, according to a new white paper published by the Kirwan Institute at Ohio State University.

This week, the paper’s author, Kirwan Research Associate Kelly Capatosto, joined The Chronicle of Social Change’s Publisher Daniel Heimpel for a webinar about her research, which examines how individual, cognitive-level barriers and broader historical- and societal-level barriers can stack the deck of predictive analytics against families and communities of color.

The Kirwan Institute at OSU focuses on education, equity and sustainable communities; public and community health; criminal justice; and how structural racialization and race in cognition create barriers to opportunities in each of these areas.

“Predictive analytics has often been described as a way to predict the future using data from the past,” Capatosto said in the webinar, highlighting its key characteristics: the use of large data sets and the assignment of risk levels to various outcomes in certain situations. She explained that predictive analytics is “increasingly sought after to guide decision making in child welfare fields.”
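To make that description concrete, the following is a minimal sketch of the kind of risk-scoring model the paper describes: a classifier trained on past case records that assigns a risk score to a new referral. The file name and column names are hypothetical, invented here for illustration; they are not drawn from the paper or from any real system.

```python
# A minimal, hypothetical risk-scoring model: train on historical case
# records, then score a new referral. All names are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

cases = pd.read_csv("historical_cases.csv")          # assumed data file
features = cases[["prior_referrals", "household_size", "caregiver_age"]]
labels = cases["substantiated_maltreatment"]         # past outcomes, 0 or 1

model = LogisticRegression().fit(features, labels)

# "Predicting the future using data from the past": score a new referral.
new_referral = pd.DataFrame(
    [{"prior_referrals": 2, "household_size": 5, "caregiver_age": 24}]
)
risk_score = model.predict_proba(new_referral)[0, 1]
print(f"Estimated risk of maltreatment: {risk_score:.2f}")
```

Everything such a model learns comes from those historical records, which is precisely where the biases discussed below enter.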

The white paper examines how implicit racial biases must be accounted for in order to protect communities of color, who may be disproportionately represented in the data inputs that are plugged into predictive analytics models. Those models are employed to find hidden patterns, streamline service delivery and reduce budgets, Capatosto told webinar attendees.

“Of particular interest for our institute is demonstrating how seemingly neutral data is actually deeply entrenched within our history of racial segregation and inequity,” she said.

Key highlights include considering the ways in which the input data to predictive analytics models may already contain human bias, since it is based on the notes and perceptions of individual service workers.

Certain data points that may be perceived as neutral are also shaped by historical practices informed by race, such as the way redlining (the systematic denial of mortgages and other services to residents of predominantly Black neighborhoods, beginning in the 1930s) laid the groundwork for neighborhood-level inequity that is still felt in many communities today.

Thus, “using ZIP codes … as an input for other dimensions of these [predictive analytics] models … demonstrates more problems than we may readily perceive at the surface,” Capatosto said, noting that such place-based approaches have a particular impact on Black and Latino families.
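One way to surface this problem in practice, assuming access to the training data, is to test how well a “neutral” input such as ZIP code predicts race on its own; if it does so reliably, a model can effectively learn race through the ZIP code even when race itself is excluded as a feature. The sketch below reuses the hypothetical data file and column names from the earlier example.

```python
# Check whether ZIP code acts as a racial proxy in the training data.
# Data file and column names are hypothetical, as above.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

cases = pd.read_csv("historical_cases.csv")

# Try to predict race from ZIP code alone.
zip_onehot = pd.get_dummies(cases["zip_code"].astype(str))
proxy_model = DecisionTreeClassifier(max_depth=5)
accuracy = cross_val_score(proxy_model, zip_onehot, cases["race"]).mean()

# Accuracy well above the majority-class base rate means ZIP code carries
# strong racial information, echoing redlining-era segregation patterns.
base_rate = cases["race"].value_counts(normalize=True).max()
print(f"ZIP -> race accuracy: {accuracy:.2f} (base rate {base_rate:.2f})")
```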

Additionally, users must consider data that doesn’t exist. For example, families with four or more children, which is a data point that could increase a risk of maltreatment score. Capatosto explained, however, “we don’t have data on the families with four or more children that don’t have any incidents of maltreatment.”
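A toy simulation can show how this missing denominator distorts risk estimates. In the invented numbers below, large and small families maltreat at exactly the same rate, but incidents in large families are recorded more often, and families with no recorded incident never enter the database at all. The recorded data then makes large families look twice as risky.

```python
# Toy simulation of the missing-data problem. All rates are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
large = rng.random(n) < 0.20       # families with four or more children
incident = rng.random(n) < 0.05    # same 5% true rate for every family

# Surveillance bias: incidents in large families are recorded at 0.8,
# in small families at 0.4; everything unrecorded is invisible.
record_prob = np.where(large, 0.8, 0.4)
recorded = incident & (rng.random(n) < record_prob)

print(f"true rate, large: {incident[large].mean():.3f}, "
      f"small: {incident[~large].mean():.3f}")      # ~0.050 and ~0.050
print(f"apparent rate, large: {recorded[large].mean():.3f}, "
      f"small: {recorded[~large].mean():.3f}")      # ~0.040 vs ~0.020
```

A model trained on the recorded cases alone would learn family size as a risk factor even though the underlying rates are identical.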

To protect against these potential hazards, the white paper suggests creating a comprehensive code of ethics, developed by an interdisciplinary group that includes experts from a variety of fields connected to child welfare and that works at both the national and local levels.

It also points to the need for predictive analytics models to be more accountable both to the families they serve and to the child welfare agencies employing them, so that those groups better understand the factors that are plugged into the models.

Assessing the “equity impact” of the models is another recommendation.

“Those who determine the benefits of predictive analytics should be trained to look for existing structures of inequity that may limit the effectiveness of the resulting interventions. For example, racial groups have different experiences when encountering health and social services professionals,” Capatosto wrote in the paper.
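One simple check along these lines, not prescribed by the paper itself, is to compare how often the model flags families as high risk across racial groups. The sketch below assumes a hypothetical file of scored referrals with a recorded race column and an arbitrary 0.7 risk threshold.

```python
# Compare high-risk flag rates across racial groups. The file, columns
# and the 0.7 threshold are all assumptions for illustration.
import pandas as pd

scored = pd.read_csv("scored_referrals.csv")   # model scores + demographics
scored["flagged"] = scored["risk_score"] >= 0.7

flag_rates = scored.groupby("race")["flagged"].mean()
print(flag_rates)

# Disparate-impact style ratio: each group's flag rate relative to the
# lowest-flagged group; values far above 1 signal an inequitable impact.
print(flag_rates / flag_rates.min())
```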

Finally, the paper recommends “broadening the scope” of how predictive analytics models are applied, suggesting tools like opportunity maps so that a model can target interventions at a neighborhood rather than at a specific family.

Read the white paper here.

By Elizabeth Green

Why Understanding Racial Bias is Crucial for the Responsible Use of Predictive Analytics was originally published @ The Chronicle of Social Change and has been syndicated with permission.
