Pretrial Risk Assessment Is Biased And Indefensible
By Jeffrey Clayton | August 30, 2020, 8:02 PM EDT
(excerpt from Law360)
The Conference of Chief Justices, an organization of the highest-ranking judges and justices in the nation, sits at the center of a looming fight over the continued use of pretrial risk assessment tools in our criminal justice system. The organization's stated policy supporting their use is perplexing and has become increasingly difficult to defend.
Evidence is mounting that these algorithms actually harm defendants, even as chief justices in more than 20 states aggressively advocate for their adoption. Many county jurisdictions in nearly every state already use risk assessment in some form. At the federal level, risk assessment has been implemented in nearly every district across the country.
In the early 2010s, the group adopted a policy supporting the use of algorithms designed to make decisions on the release or detention of defendants. The tools also determined appropriate bail and conditions of release, including supervision by pretrial service agencies or private entities.
These computational factors have had a huge presence, telling police where to go, prosecutors when to divert, and judges how to sentence defendants and set bail and release conditions. They have also directed parole and probation officers on how to supervise defendants and when to grant early termination.
At the time, it was believed that the algorithms would reduce racial and other bias by making the system more objective, in addition to significantly reducing generational mass incarceration. According to the Conference of Chief Justices, the goal was to support "evidence-based risk assessment" in all bail decisions throughout the United States.
However, by the midpoint of the decade, algorithms began to face greater scrutiny when it became apparent that they were based on factors beyond the control of defendants, but for which these individuals were facing severe consequences.
Former U.S. Attorney General Eric Holder was the first to raise alarm, questioning the specific use of demographic factors that correlated with race and poverty. Holder was concerned that certain characteristics of a defendant, such as their education level, socioeconomic background or even the neighborhood from which they were raised might "exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society."
Some algorithms register scores for things like whether one owns or rents their housing, whether one has a home phone, or the length of time one has lived at an address. Others consider prior mental health treatment, or rehabilitation for alcohol or substance use disorders. This is problematic because it may violate the Americans with Disabilities Act and companion state statutes.
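To make concrete how this kind of factor-based scoring penalizes circumstances rather than conduct, here is a minimal sketch of a points-based instrument. Every factor and weight below is invented for illustration; none is drawn from any actual assessment tool.

```python
# Hypothetical points-based pretrial risk score.
# All factors and point values are invented for illustration and do not
# reflect any real assessment instrument.

def risk_score(defendant: dict) -> int:
    """Sum points for socioeconomic factors like those the article describes."""
    score = 0
    if not defendant.get("owns_home", False):       # renting adds points
        score += 1
    if not defendant.get("has_home_phone", False):  # no landline adds points
        score += 1
    if defendant.get("years_at_address", 0) < 1:    # short residency adds points
        score += 1
    if defendant.get("prior_mental_health_treatment", False):
        score += 2
    if defendant.get("prior_substance_treatment", False):
        score += 2
    return score

# A renter with no home phone who recently moved scores 3; a long-term
# homeowner with a landline and no treatment history scores 0 -- identical
# conduct, different "risk."
high = risk_score({"owns_home": False, "has_home_phone": False,
                   "years_at_address": 0})
low = risk_score({"owns_home": True, "has_home_phone": True,
                  "years_at_address": 10})
print(high, low)  # → 3 0
```

Note that none of the scored inputs in this sketch describes the defendant's alleged offense or behavior, which is precisely the objection raised by critics of these tools.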