The Use of AI in the Criminal Justice System Should Alarm All of Us
Many people are likely unaware of a phenomenon known as criminal sentencing AI, in which an algorithm is used in the criminal justice system to estimate the likelihood that a criminal defendant will commit future crimes. In a number of cases, prosecutors rely on reports generated by these algorithms to determine what counts as a fair punishment; for example, whether probation is acceptable instead of juvenile detention.
Unfortunately, the underlying methodology behind these reports is extremely concerning: it assigns heightened risk based on factors that are unquestionably racially biased, such as having a negative attitude toward the police or living in government-subsidized housing. In addition, the assessment technology has never been properly validated by any scientific or judicial body. In effect, for more than a decade, juvenile defendants have been judged and committed to detention facilities on the strength of a tool based on a graduate student's unpublished thesis.
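To see why inputs like these matter, consider a simplified sketch of how a checklist-style risk score might work. The factor names, weights, and threshold below are invented purely for illustration; the actual formulas used by commercial tools are secret. The point is only that when biased factors are baked into the inputs, bias flows straight through to the output label a judge or prosecutor sees.

```python
# Hypothetical sketch of a checklist-style recidivism risk score.
# Factor names and weights are invented for illustration only; the
# actual formulas used by commercial tools are proprietary.

# Each flagged factor adds points to a defendant's score. Note that
# two of these "risk factors" are proxies for race and poverty
# rather than measures of criminal conduct.
RISK_WEIGHTS = {
    "prior_arrests": 2.0,
    "negative_attitude_toward_police": 1.5,   # subjective, racially skewed
    "lives_in_subsidized_housing": 1.0,       # a proxy for poverty
    "school_suspensions": 0.5,
}

def risk_score(defendant: dict) -> float:
    """Sum the weights of every factor the defendant is flagged for."""
    return sum(weight for factor, weight in RISK_WEIGHTS.items()
               if defendant.get(factor))

def risk_label(score: float, threshold: float = 2.5) -> str:
    """Convert the numeric score into the label a report would show."""
    return "HIGH RISK" if score >= threshold else "LOW RISK"

# Two defendants with identical criminal conduct receive different
# labels solely because of where one lives and a subjective rating.
a = {"negative_attitude_toward_police": True,
     "lives_in_subsidized_housing": True}
b = {}
print(risk_label(risk_score(a)))  # HIGH RISK
print(risk_label(risk_score(b)))  # LOW RISK
```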
“Machine Bias” & Opacity
In reality, risk assessment tools like this are still in use all over the country, and they play a devastating role in our criminal justice system: a number of cities and states are placing citizens' lives in the hands of algorithms that are nothing more than mathematical expressions of bias.
Take, for example, the risk assessment tool known as COMPAS, which judges have consulted to reject plea deals in connection with minor crimes and, as a result, to justify imposing new sentences that doubled a defendant's time in prison. When ProPublica compared the COMPAS scores of 7,000 people arrested in Broward County, Florida, with those same people's criminal records in the years following their arrests, the scores proved remarkably unreliable at forecasting violent crime: only 20 percent of the people predicted to commit violent crimes actually went on to do so. The algorithm was also twice as likely to falsely flag black defendants as future criminals as it was to falsely flag white defendants.
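The "twice as likely to falsely flag" finding is a statement about false positive rates: among the people who did not go on to reoffend, how often did the tool nonetheless label them high risk? A minimal sketch of that calculation, using invented toy records rather than the actual Broward County data, might look like this:

```python
# Minimal sketch of the false-positive-rate comparison behind the
# "twice as likely to falsely flag" finding. The records below are
# invented toy data, not the actual Broward County dataset.

def false_positive_rate(records: list[dict]) -> float:
    """Among people who did NOT reoffend, the share labeled high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = [r for r in non_reoffenders if r["labeled_high_risk"]]
    return len(flagged) / len(non_reoffenders)

# Toy records: group, the tool's label, and what actually happened.
records = [
    {"group": "black", "labeled_high_risk": True,  "reoffended": False},
    {"group": "black", "labeled_high_risk": True,  "reoffended": False},
    {"group": "black", "labeled_high_risk": False, "reoffended": False},
    {"group": "black", "labeled_high_risk": False, "reoffended": False},
    {"group": "black", "labeled_high_risk": True,  "reoffended": True},
    {"group": "white", "labeled_high_risk": True,  "reoffended": False},
    {"group": "white", "labeled_high_risk": False, "reoffended": False},
    {"group": "white", "labeled_high_risk": False, "reoffended": False},
    {"group": "white", "labeled_high_risk": False, "reoffended": False},
    {"group": "white", "labeled_high_risk": True,  "reoffended": True},
]

for group in ("black", "white"):
    subset = [r for r in records if r["group"] == group]
    print(group, f"{false_positive_rate(subset):.0%}")
# Prints: black 50%, white 25% -- in this toy sample, the tool
# wrongly brands non-reoffenders "high risk" twice as often in one
# group as in the other, which is the shape of ProPublica's finding.
```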
What is perhaps even more concerning is that the companies behind these tools are not required to share any information with the courts about how the technology works. As a result, no one has any idea how the scores are computed.
Positive Potential?
Still, some argue that these algorithms can play a positive role. Some states, for example, have been willing to eliminate cash bail only if judges are allowed to consult an algorithm that determines whether a defendant poses a high risk of committing future crimes. Proponents contend that these algorithms can be part of a plan to decrease both incarceration and overall crime by identifying the defendants most likely to commit violent crimes again.
If You Have Been Accused of a Crime, Contact Our Florida Criminal Defense & Civil Rights Attorneys
Working with an experienced, reliable criminal defense attorney is the best protection against flawed tools and biases like these. If you have been arrested for a crime, contact our experienced Orlando & Miami criminal attorneys at the Baez Law Firm today to find out how we can help.
Resources:
https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
https://www.theatlantic.com/ideas/archive/2019/06/should-we-be-afraid-of-ai-in-the-criminal-justice-system/592084/
https://www.baezlawfirm.com/time-for-additional-criminal-justice-reform/