When someone is charged with a crime, bail is determined in part based on a risk assessment. The court decides whether the defendant, if allowed to go free, will fail to appear for trial or will commit another crime.
Unfortunately, defendants of color have historically been assessed as riskier than their white peers. That leaves people of color to languish in pretrial detention at greater rates than whites, putting their lives and livelihoods at risk.
With the rise of Big Data, it seemed a more race-neutral risk assessment method could be devised. Groups have developed risk-assessment algorithms meant to weigh only factual data, although demographic information is often used as well. For example, these tools might consider the defendant's history of arrests, charges and convictions, along with the seriousness of the crime they're currently charged with.
Hopes were high that these risk assessment tools would create fairer outcomes in the setting and denial of bail. However, there is reason to believe that these tools simply reinforce past racism.
Even a strictly race-neutral assessment tool produced racist outcomes
African-Americans, Latinos and Native Americans are unjustly arrested at greater rates than whites. Therefore, including past arrests could produce unjust results for people of color.
"There's no way to square the circle there, taking the bias out of the system by using data generated by a system shot through with racial bias," says a spokesperson for the Center for Court Innovation, a criminal justice reform agency that recently released a study on risk-assessment algorithms.
The Center was following up on earlier research by the independent newsroom ProPublica. In that study, reporters examined bail decisions in Broward County, Florida, that had been made with the help of a commercial risk-assessment tool called COMPAS.
The reporters found that African-Americans were much more likely to have been deemed high risk but then show up for court and commit no additional crimes. Whites were much more likely to be labeled low risk but then go on to commit another crime or skip court.
The Center created what seemed a truly race-neutral risk assessment tool. They applied the tool to over 175,000 people in New York City in 2015, estimating a risk for each person. They then compared their estimated risk with the actual outcomes.
Much to their surprise, their risk-assessment tool had the same problem as COMPAS. Blacks were more likely to be falsely deemed high risk, while whites were more likely to be falsely deemed low risk.
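The disparity both studies describe is a gap in false positive rates: among people who went on to appear in court and commit no new crime, Black defendants were labeled high risk more often than white defendants. A minimal sketch of how such a rate might be computed is below; the data, field names, and the `false_positive_rate` helper are invented for illustration and are not the Center's or ProPublica's actual dataset or methodology.

```python
def false_positive_rate(records):
    """Among people with NO bad outcome (they appeared in court and
    committed no new crime), what share had been labeled high risk?"""
    no_bad_outcome = [r for r in records if not r["bad_outcome"]]
    if not no_bad_outcome:
        return 0.0
    return sum(r["high_risk"] for r in no_bad_outcome) / len(no_bad_outcome)

# Toy records, invented purely to show the calculation.
records = [
    {"group": "black", "high_risk": True,  "bad_outcome": False},
    {"group": "black", "high_risk": True,  "bad_outcome": False},
    {"group": "black", "high_risk": False, "bad_outcome": False},
    {"group": "white", "high_risk": True,  "bad_outcome": False},
    {"group": "white", "high_risk": False, "bad_outcome": False},
    {"group": "white", "high_risk": False, "bad_outcome": False},
]

by_group = {
    g: false_positive_rate([r for r in records if r["group"] == g])
    for g in ("black", "white")
}
# In this toy data, the rate for "black" (2/3) exceeds that for
# "white" (1/3) -- the kind of gap the studies reported.
```

A tool can be well calibrated overall and still show this kind of gap between groups, which is why the studies compared error rates by race rather than only overall accuracy.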
The Center did find a way to all but eliminate the racial disparities: have judges release anyone charged with a non-violent crime. When the Center applied the risk-assessment tool only to those charged with violent crimes, the assessments began to match the actual outcomes.