State Governments Use Biased Software to Predict Crimes

We all know there’s a racial disparity in US criminal prosecutions—but imagine if a computer algorithm were being used to insert even more bias into the criminal justice system. According to a damning new report from ProPublica, that’s exactly what’s happening in many states around the country.

In a common practice known as “risk assessment,” computer software is used to predict the likelihood of a future crime by a specific individual. The only problem is that the computer algorithm that law enforcement agencies are using appears to have a severe racial bias.

The implications of the accusations are huge. Risk assessment software is used to make many decisions in the criminal justice system, including things like bond amounts. In states like Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin, risk assessment programs can be used by judges during a criminal sentencing.

As part of its research, ProPublica obtained the risk scores of more than 7,000 people arrested in Broward County, Florida, over a two-year period (2013–2014), and tracked those individuals for the two years that followed. What ProPublica found was that the risk scores were “remarkably unreliable” at predicting violent crime. The results were also weak when the full range of crimes was taken into account: only 61 percent of those people were arrested for any subsequent crime within two years.

The research found that the risk assessment algorithm used in Broward County was more likely to flag black defendants as future criminals, labeling them at almost twice the rate of white defendants. In addition, white defendants were mislabeled as low risk more often than black defendants were.
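The disparity described here is, at bottom, a difference in error rates between groups: one group is wrongly flagged high risk (false positives) more often, while the other is wrongly labeled low risk (false negatives) more often. A minimal sketch of how such rates are computed, using entirely made-up records (this is not ProPublica's data or methodology):

```python
# Illustrative only: per-group false positive / false negative rates.
# Records are (flagged_high_risk, actually_reoffended) pairs -- all
# hypothetical, invented for this example.

def error_rates(records):
    """Return (false_positive_rate, false_negative_rate) for a group."""
    fp = sum(1 for high, re in records if high and not re)      # flagged, didn't reoffend
    fn = sum(1 for high, re in records if not high and re)      # not flagged, did reoffend
    negatives = sum(1 for _, re in records if not re)           # all who didn't reoffend
    positives = sum(1 for _, re in records if re)               # all who did reoffend
    return fp / negatives, fn / positives

# Hypothetical data for two groups of defendants
group_a = [(True, False), (True, False), (True, True), (False, False)]
group_b = [(False, True), (False, True), (True, True), (False, False)]

fpr_a, fnr_a = error_rates(group_a)  # group A: high false positive rate
fpr_b, fnr_b = error_rates(group_b)  # group B: high false negative rate
```

Even when overall accuracy is similar across groups, these two error rates can diverge sharply, which is exactly the pattern ProPublica reported.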

In one example from the article, an 18-year-old black female was arrested for stealing a kid’s Huffy bicycle and razor scooter. She had a light criminal record, and the total value of the stolen products was $80. In a similar case, a 41-year-old white male was caught stealing $86 worth of goods from a Home Depot store. The white male had been convicted of armed robbery and served five years in prison. The risk assessment software determined that the black female (who had only previously committed misdemeanors as a juvenile) was more likely to commit a future crime than the white male (who was a seasoned criminal).

In the end, we know the computer algorithm got it exactly wrong. The black female was not charged with any new crimes, and the white male is now serving an eight-year prison term for breaking into a warehouse to steal thousands of dollars in electronics.

Source: Michael Nunez

