U.S. Courts Are Using Algorithms Riddled With Racism to Hand Out Sentences

Source: Flickr

For years, the criminal justice community has been worried. Courts across the country are setting bond amounts and sentencing the accused based on algorithms, and both lawyers and data scientists warn that these algorithms could be poisoned by the very prejudices the systems were designed to escape.

Until now, that concern was pure speculation. Now, we know the truth.

An investigation published Monday morning by ProPublica analyzed the results of thousands of sentences informed by these algorithms, and found that the formulas go easier on white defendants, even when race is isolated as a factor.

"The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants," the investigative team wrote.

The algorithms don't take race into account directly, but they rely on data points that can serve as proxies for it. The Florida algorithm evaluated in the report is based on 137 questions, such as "Was one of your parents ever sent to jail or prison?" and "How many of your friends/acquaintances are taking drugs illegally?"

Those two questions, for example, can appear to evaluate someone's empirical risk of criminality, but instead, they target those already living under institutionalized poverty and over-policing. Predominantly, those people are people of color.

"[Punishment] profiling sends the toxic message that the state considers certain groups of people dangerous based on their identity," University of Michigan law professor Sonja Starr wrote in the New York Times in 2014. "It also confirms the widespread impression that the criminal justice system is rigged against the poor."

The algorithm itself, of course, was not available for audit. Algorithms that inform decisions in the public sector are often developed and guarded by private companies. Northpointe, the for-profit company that created the algorithm examined by ProPublica, told ProPublica that it does not agree with the results of the analysis, saying the report does not "accurately reflect the outcomes" of its product.

White defendants were routinely given lower threat scores than black defendants. Source: ProPublica

But the controversy over sentencing is just one early instance of a growing conversation about bias in the algorithms that decide everything from what news we see to how and where we travel.

It's time to talk about algorithms: Algorithms seem impervious to the insidious influence of racism and prejudice, the human failings that unconsciously creep into our fallible decision-making. Evaluations that come from algorithms imply that the results are scientific, spat out by a cold computer working only with evidence. The process of sentencing by algorithms is even formally referred to as "evidence-based sentencing."

"Scores give us simplistic ways of thinking that are very hard to resist," Cathy O'Neil, a data scientist and author of the upcoming book Weapons of Math Destruction, said by phone. "If you assign people scores and someone has a low score, it's human nature to assign blame to that person, even if that score just means they were born in a poor neighborhood."

But just because algorithms are mathematical in nature doesn't mean they're free from human bias. Algorithms spot and amplify patterns in human behavior, and they do it by looking at data created by human behavior. Predictive policing algorithms that help police chiefs assign their patrols rely on crime statistics and records generated by police behavior, eventually amplifying the prejudicial behaviors that produced that data in the first place.
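That amplification can be sketched as a toy simulation (a hypothetical model, not any vendor's actual system): two neighborhoods with identical true crime rates, where each day's patrol follows the recorded data, and crime can only enter the data where a patrol happens to be.

```python
# Toy feedback-loop model: both areas have the SAME true crime rate,
# but area "A" starts with one more recorded incident than "B".
recorded = {"A": 11, "B": 10}   # near-identical historical records
TRUE_RATE = 1                    # one incident per day, in either area

for day in range(30):
    # Send the single available patrol wherever the data says crime is worst.
    target = max(recorded, key=recorded.get)
    # A crime is only *recorded* where the patrol is present,
    # so only the patrolled area's numbers ever grow.
    recorded[target] += TRUE_RATE

print(recorded)   # {'A': 41, 'B': 10}: a one-incident gap became a chasm
```

Area "B" is not actually safer; it simply stops generating data, and the algorithm reads that silence as safety. Real systems are more complicated, but this is the basic shape of the feedback loop researchers warn about.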

As more news emerges of bias in algorithms, whether it's the potential anti-conservative bias of Facebook's news algorithm or pricing schemes that charge Asian communities more for SAT tutoring, the world is further disabused of the idea that algorithms can't be as skewed as human reasoning.

Often, they are skewed in precisely the same way we are.


Jack Smith IV is a senior writer covering technology and inequality. Send tips, comments and feedback to jack@mic.com.
