Excerpt from KTVB News, Feb 14, 2019
BOISE, Idaho — Two Treasure Valley lawmakers from opposite sides of the aisle are teaming up on a bill they say will keep defendants from being unfairly labeled by a computer algorithm as likely future criminals.
House Bill 118, introduced last week by Rep. Greg Chaney, R-Caldwell, and Sen. Cherie Buckner-Webb, D-Boise, takes aim at the pretrial risk-assessment algorithms used in Idaho courtrooms to give judges an idea of how likely a person is to commit more crimes in the future.
A defendant’s risk level – low, medium, or high – can affect everything from how high their bond is set to whether they receive probation or prison time, and even whether they could be a candidate for early release.
But a 2016 investigation by ProPublica found that some computer programs used to determine those scores can have a racial bias, routinely scoring minority defendants as a higher risk to re-offend than white defendants.
The report, which looked at a sample of 7,000 people arrested in Florida and risk-assessed by Northpointe’s COMPAS computer algorithm, found that black defendants were incorrectly flagged as future criminals almost twice as often as white defendants, and that white defendants were mislabeled as low-risk – someone unlikely to commit another crime – more often than black defendants.
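The disparity ProPublica measured is a gap in false positive rates: among defendants who did not go on to re-offend, what fraction were flagged high-risk in each group. A minimal sketch of that calculation is below; the records and group labels are hypothetical stand-ins, not ProPublica's data or code.

```python
# Illustrative sketch of a false-positive-rate comparison between groups.
# All records here are hypothetical; this is not ProPublica's data.

def false_positive_rate(records):
    """FPR = defendants flagged high-risk who did NOT re-offend,
    divided by all defendants who did not re-offend."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    if not non_reoffenders:
        return 0.0
    flagged = [r for r in non_reoffenders if r["high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Each hypothetical record pairs the algorithm's flag with the observed outcome.
records = [
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": False},
    {"group": "A", "high_risk": False, "reoffended": False},
    {"group": "A", "high_risk": True,  "reoffended": True},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": True,  "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": False},
    {"group": "B", "high_risk": False, "reoffended": True},
]

by_group = {
    g: false_positive_rate([r for r in records if r["group"] == g])
    for g in ("A", "B")
}
```

With these made-up records, group A's false positive rate is twice group B's, the same kind of "incorrectly flagged almost twice as often" gap the report describes.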
Even when the study controlled for other factors, isolating the effect of race from age, gender, criminal history, and recidivism, black defendants were still 45 percent more likely to be predicted to commit any crime in the future, and 77 percent more likely to be rated as a risk to commit a future violent crime.
The risk-assessment algorithms are not wrong every time. But when they are, a mistaken designation as someone likely to commit more crimes can mean a defendant is unjustly handed a lengthy sentence on the weight of what they might do in the future.