Criminal Risk Assessment Algorithms Are as Biased as Their Creators

The United States, which has about 5 percent of the world’s population, accounts for 21 percent of incarcerated people globally. The racial distribution of people behind bars looks quite a bit different from the racial distribution of the country as a whole. While Black people make up 13.4 percent of the US population, according to the US Census, they account for 40 percent of the incarcerated population. According to the National Association for the Advancement of Colored People (NAACP), Black people are incarcerated at five times the rate of white people in the United States. It’s safe to say that our criminal justice system is not just, at least when it comes to race.

In recent years, a new solution has been introduced to make courts more objective: computer algorithms. These “criminal risk assessment algorithms” help overburdened courts predict whether a defendant is likely to reoffend. The court or a screener feeds in a series of data points about the defendant, and the algorithm returns a single-number recidivism score.
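Commercial tools like these are proprietary, so their internals aren’t public, but a heavily simplified, hypothetical sketch of how a questionnaire gets collapsed into one number might look like the following. Every feature name, weight, and cutoff here is invented for illustration and does not come from Northpointe or any real product:

```python
# Hypothetical sketch only: every feature name, weight, and cutoff here is
# invented for illustration, not taken from Northpointe or any real tool.

def recidivism_score(answers: dict) -> int:
    """Collapse questionnaire answers into a single 1-10 risk score."""
    weights = {
        "prior_arrests": 0.6,         # count of prior arrests
        "age_at_first_arrest": -0.1,  # older at first arrest -> lower score
        "friends_arrested": 0.4,      # "how many of your friends have been arrested?"
        "employed": -0.5,             # 1 if employed, 0 if not
    }
    raw = sum(weight * answers.get(feature, 0) for feature, weight in weights.items())
    # Clamp the weighted sum into a 1-10 bucket, the kind of scale such tools report.
    return max(1, min(10, round(raw)))

print(recidivism_score({
    "prior_arrests": 2,
    "age_at_first_arrest": 19,
    "friends_arrested": 3,
    "employed": 1,
}))  # -> 1 (a low score under these made-up weights)
```

The point of the sketch is that questions like “how many of your friends have been arrested?” go straight into the arithmetic, which is exactly where critics say bias can creep in.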

In addition to helping judges decide how long a sentence should be or whether a defendant should receive rehabilitation services, the algorithms are supposed to make the criminal justice system more racially equitable. The idea is that, while judges and prosecutors may not be able to overcome their own conscious or unconscious biases against Black people and other people of color, a computer does not carry the same baggage. A computer program lacks trauma from our country’s history of slavery; an algorithm cannot make assumptions based on a defendant’s hairstyle or dress; a series of ones and zeros does not know the implications of the word “thug.” These criminal risk assessment algorithms have been introduced in courts across the country.

While they don’t ask for the defendant’s race outright, they do ask “dog whistle” questions that are clearly designed to get at it.

For example, a 2016 investigation by ProPublica examined the 137-question assessment used by the for-profit company Northpointe, whose criminal risk assessment software is among the most widely used. ProPublica identified 11 potentially racially biased questions, including: “Based on the screener’s observations, is this person a suspected or admitted gang member?”, “How many of your friends/acquaintances have ever been arrested?”, and “In your neighborhood, have some of your friends or family been crime victims?”

“These calculations can ultimately lead to people being punished for what they might do rather than what they have actually done, which would seem to violate our standard conception of due process,” Robert Werth, a senior lecturer in sociology in Rice University’s School of Social Sciences, told Futurity.

ProPublica’s investigation also found that the algorithm flagged Black defendants as being at higher risk of committing a future violent crime 77 percent more often than white defendants, and as likely to commit a future crime of any kind 45 percent more often. It also found that the algorithm was “somewhat more accurate than a coin flip” at predicting recidivism: only 61 percent of the defendants it pegged as likely to reoffend actually did so over a two-year period.
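To make that figure concrete: the 61 percent is a precision-style measure, the share of people the tool flagged as likely to reoffend who actually did. A minimal sketch with made-up counts (the 1,000 below is purely illustrative):

```python
# Made-up counts, chosen only to show what the 61 percent figure measures.
flagged_high_risk = 1000             # defendants the algorithm pegged as likely to reoffend
reoffended_within_two_years = 610    # of those, how many actually reoffended

precision = reoffended_within_two_years / flagged_high_risk
print(f"{precision:.0%} of flagged defendants actually reoffended")  # 61%
```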

So what’s going on here? What happened to those objective computers, those non-racist ones and zeros? The answer comes down to the simple fact that computer algorithms aren’t written in a vacuum. They’re created by people, most often white men: 71.3 percent of computer programmers are white and 78.6 percent are male. Because people always carry history, prejudice, and conscious and unconscious bias within them, those biases get passed on to the algorithms they build. Computer algorithms aren’t magic, and as long as people are prejudiced, they will be too. It looks like it’s time to find a new solution for fixing our criminal justice system.


About the Author

Emma McGowan is a veteran blogger, SFSI-endorsed sex educator, and Bustle's sex advice columnist at Sex IDK. Her work has appeared in Bustle, Startups.co, Unbound, Mashable, Broadly, The Daily Dot's The Kernel, Mic, Bedsider, and The Bold Italic. Follow her on Twitter @MissEmmaMcG.