State Governments Use Racially Biased Software to Predict Crimes

Image: Flickr / joegratz

We all know there’s a racial disparity in US criminal prosecutions—but imagine if a computer algorithm were being used to insert even more bias into the criminal justice system. According to a damning new report from ProPublica, that’s exactly what’s happening in many states around the country.


In a common practice known as “risk assessment,” computer software is used to predict the likelihood of a future crime by a specific individual. The only problem is that the computer algorithm that law enforcement agencies are using appears to have a severe racial bias.

The implications of the accusations are huge. Risk assessment software is used to make many decisions in the criminal justice system, including things like bond amounts. In states like Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin, risk assessment scores can be used by judges during criminal sentencing.


As part of its research, ProPublica obtained the risk scores of more than 7,000 people arrested in Broward County, Florida, during a two-year period (2013-2014), and tracked those individuals over the following two years. What ProPublica found was that the risk scores were “remarkably unreliable” at predicting violent crime. The results were also weak when the full range of crimes was taken into account: only 61 percent of those predicted to reoffend were arrested for any subsequent crime within two years.

The research found that the risk assessment algorithm used by Broward County was more likely to falsely flag black defendants as future criminals, wrongly labeling them at almost twice the rate of white defendants. White defendants, meanwhile, were mislabeled as low risk more often than black defendants.
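The disparity ProPublica describes can be expressed as a per-group false positive rate: among defendants who did not go on to reoffend, what share were nonetheless flagged as high risk? Here is a minimal sketch of that computation — the data below is purely hypothetical toy data, not ProPublica's figures, and the function name is our own illustration.

```python
# Sketch of the false-positive-rate comparison at the heart of the report.
# A "false positive" here is a defendant flagged high risk who did NOT reoffend.

def false_positive_rate(records):
    """records: list of (flagged_high_risk: bool, reoffended: bool) pairs."""
    non_reoffenders = [r for r in records if not r[1]]
    if not non_reoffenders:
        return 0.0
    flagged = sum(1 for r in non_reoffenders if r[0])
    return flagged / len(non_reoffenders)

# Hypothetical toy data for two demographic groups (NOT real figures):
group_a = [(True, False), (True, False), (False, False), (True, True)]
group_b = [(True, False), (False, False), (False, False), (False, True)]

print(false_positive_rate(group_a))  # 2 of 3 non-reoffenders flagged: ~0.67
print(false_positive_rate(group_b))  # 1 of 3 non-reoffenders flagged: ~0.33
```

A gap like the one between these two toy rates — the same outcome, flagged at twice the rate for one group — is the shape of the disparity ProPublica reported.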

In one example from the article, an 18-year-old black female was arrested for stealing a kid’s Huffy bicycle and Razor scooter. She had a light criminal record, and the total value of the stolen products was $80. In a similar case, a 41-year-old white male was caught stealing $86 worth of goods from a Home Depot store. The white male had previously been convicted of armed robbery and served five years in prison. The risk assessment software determined that the black female (who had only previously committed misdemeanors as a juvenile) was more likely to commit a future crime than the white male (who was a seasoned criminal).

In the end, we know the computer algorithm got it exactly wrong. The black female was not charged with any new crimes, and the white male is now serving an eight-year prison term for breaking into a warehouse to steal thousands of dollars in electronics.


[ProPublica]

Technology editor at Gizmodo.

DISCUSSION

ConLawHero

Ah yes, the secretive racism of objective data.

So... if, let’s just say, white people committed a large number of crimes and had a high recidivism rate, it’d be racist to state those terms and use them as factors in sentencing?

Is it sexist that women pay less than men for car insurance because on average women get into fewer accidents (or less costly accidents) than men? For example, having driven for nearly 18 years, I’ve never gotten a speeding ticket, never gotten a DUI, and I’ve never been in an accident. My individual example is evidence that I, as an individual, am a good driver. However, that doesn’t mean ALL MEN are good drivers. In fact, the evidence points to the contrary. Do you think insurance companies are in the business of losing money? If the data showed that women cost insurers more, women would pay more. QED.

How is data, objective data mind you, racist?

Jesus, I consider myself pretty damn liberal, but people who pull this shit make me want to donate to Donald Trump. I’m loath to be affiliated with people who don’t understand basic statistics. (Though, the same could be said for the average Trump fan, probably Trump himself, but at least we know they’re idiots.)