As bail reformers continue their push to implement group-data-based criminal justice algorithms to decide bail and conditions of release, the pushback against them continues to gain momentum. This time, in conservative Idaho, a bipartisan group of legislators has introduced legislation to end “black-box” algorithms and to require that any such algorithm be determined to be free of bias before it may be used in the State of Idaho.
Representative Greg Chaney and Senator Cherie Buckner-Webb introduced House Bill 118 last week to stop what Representative Chaney called “computerized discrimination,” which is embedded in computer code and baked into such algorithms.
RELATED: New bill would ban racist algorithms from setting bail
In addition, these algorithms raise transparency issues: large proprietors of algorithms, such as the now for-profit Arnold Foundation, can assert trade secret and contractual protections that prevent defendants, judges, and prosecutors from ever seeing behind the curtain.
RELATED: Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System
Buckner-Webb and Chaney’s legislation, however, forces such algorithms to be transparent so that independent third parties can validate the work and test for biases.
“The most sacred responsibility of any lawmaker is to ensure that we protect the people we serve. That means everybody, not just certain select groups. Unfortunately, when it comes to criminal justice in Idaho, existing practices in the state threaten the very concept of fairness guaranteed to all of its citizens. The current use of risk assessment to determine the fate of persons who have been arrested and charged with committing a crime, represents a fundamental failing of our system.” – Representative Greg Chaney (R-Caldwell)
RELATED: IDAHO MUST ELIMINATE COMPUTERIZED DISCRIMINATION IN ITS CRIMINAL JUSTICE SYSTEM
Of course, merely requiring that algorithms be tested for bias would be a big step, since almost none of them have been so tested. Yet Buckner-Webb and Chaney apparently did not think that was good enough: their bill puts the burden on those who want to use an algorithm to demonstrate that it is free of bias before putting it into action.
As other states and local jurisdictions continue down the slippery slope of computerized justice, they might want to start asking similar questions.