Lady Justice and the Machines: An Algorithmic Approach to Criminal Justice Reform

California Senate Bill 10 (SB 10), which Governor Jerry Brown signed into law in August 2018, fundamentally transforms California’s approach to pretrial detention.[1] As in most states, California previously operated a “cash bail” system, which allowed individuals suspected of having committed a crime to pay a certain sum of money (bail) and await trial outside the confines of court custody.[2] Critics of cash bail, however, point to fairness issues underlying a system that “makes justice an uneven playing field, incarcerating the poor while allowing those with money or assets to avoid jail time.”[3] A co-author of SB 10 articulated this position clearly, asserting that “[f]reedom and liberty should never be pay to play,” and that “[the existing] system has allowed the wealthy to purchase their freedom regardless of their risk, while the poor who pose no danger languish in jail.”[4]

In response, California state legislators developed an alternative to cash bail that draws heavily upon the first megatrend identified in the TOM Challenge: machine learning. Beginning in October 2019, criminal suspects will undergo algorithmic risk assessments whose outputs will help determine pretrial detention outcomes. These assessments estimate the likelihood that a detained individual would, if released, commit another offense or flee before trial. Individuals will be classified as low-, medium-, or high-risk, and pretrial detention decisions will follow from these algorithmic outputs.[5] SB 10 therefore shifts the pretrial detention calculus on two dimensions: ability to pay is supplanted by an assessment of risk profile, and machine learning sidelines some (but not all) aspects of human discretion, since the bill allows judges and prosecutors, at times, to diverge from the algorithm’s recommendation.
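To make the mechanics concrete, here is a minimal sketch of how such a pipeline could work in principle: a model score is bucketed into the three statutory tiers, each tier implies a default detention outcome, and a judicial override can displace that default. The tier thresholds, field names, and override rule are illustrative assumptions, not details of any real tool; SB 10 leaves the choice of instrument to each jurisdiction.

```python
# Hypothetical sketch of the SB 10 decision flow: a supervised model scores
# each suspect, the score is bucketed into a risk tier, and the tier implies
# a default outcome that a judge or prosecutor may override. All thresholds
# and rules below are assumptions made for illustration.
from dataclasses import dataclass
from typing import Optional

LOW_CUTOFF = 0.33    # assumed tier boundaries, not taken from any real tool
HIGH_CUTOFF = 0.66

@dataclass
class Assessment:
    score: float   # model's estimated probability of flight or re-offense
    tier: str      # "low", "medium", or "high"
    detain: bool   # default pretrial outcome implied by the tier

def assess(score: float) -> Assessment:
    """Map a model score onto the three statutory risk tiers."""
    if score < LOW_CUTOFF:
        tier, detain = "low", False      # released, possibly with conditions
    elif score < HIGH_CUTOFF:
        tier, detain = "medium", False   # local rules decide; assume release here
    else:
        tier, detain = "high", True      # presumption of detention
    return Assessment(score, tier, detain)

def final_outcome(a: Assessment, judicial_override: Optional[bool] = None) -> bool:
    """Judges and prosecutors may, at times, diverge from the algorithmic default."""
    return a.detain if judicial_override is None else judicial_override

print(final_outcome(assess(0.72)))                           # True: detained by default
print(final_outcome(assess(0.72), judicial_override=False))  # judge releases anyway
```

Even this toy version makes the policy question visible: everything turns on where the cutoffs sit and on what the score actually measures.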

While advocates of the change applaud improved outcomes for economic justice, others are concerned that the introduction of machine learning will create new challenges. Three specific vulnerabilities are salient given our in-class debates about machine learning. First, because these risk assessment algorithms are likely to rely on supervised learning,[6] their “training data” may contain “systemic biases.”[7] In this way, example inputs and desired outputs may themselves reflect racial biases and other (implicit or explicit) discriminatory effects.[8] Second, because California allows localities to choose between engaging a third-party contractor and building their own algorithms, contractors may limit access to their risk-assessment formulas under cover of intellectual property law, making it harder to discern whether (and how) such algorithms are biased.[9] Finally, beyond these potential systemic biases, there may also be community-specific biases if the training data used is, for example, “not representative of the [specific] community that will eventually use the risk assessment.”[10]
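That third concern, representativeness, lends itself to a simple illustration. The hypothetical check below compares the demographic composition of a tool’s training data against that of the local population it will score, and flags large gaps. The group labels, counts, and 10% tolerance are invented for the example.

```python
# A minimal sketch of the "representativeness" worry: if the data a county's
# tool was trained on looks nothing like that county's arrestee population,
# tier assignments may not transfer. Labels and tolerance are assumptions.
from collections import Counter

def distribution(groups):
    """Return each group's share of the given population."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}

def representativeness_gaps(train_groups, local_groups, tolerance=0.10):
    """Flag groups whose share of the training data differs from their
    share of the local community by more than `tolerance`."""
    train, local = distribution(train_groups), distribution(local_groups)
    return {
        g: (train.get(g, 0.0), local.get(g, 0.0))
        for g in set(train) | set(local)
        if abs(train.get(g, 0.0) - local.get(g, 0.0)) > tolerance
    }

train = ["A"] * 700 + ["B"] * 300   # composition of the vendor's training set
local = ["A"] * 450 + ["B"] * 550   # composition of the county's population
print(representativeness_gaps(train, local))
# both groups flagged, e.g. {'A': (0.7, 0.45), 'B': (0.3, 0.55)}
```

A check like this is only feasible, of course, if the vendor discloses what data the tool was trained on, which loops back to the transparency problem.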

Legislators have vowed to take steps, in the short and medium term, to mitigate at least some of these issues. A comprehensive review of the system, intended to identify bias, is already planned for 2023, and one state senator has said he will draft legislation pushing for transparency in risk assessment algorithms.[11] Yet critics suggest that vulnerabilities remain. The Electronic Frontier Foundation has been especially vocal, arguing that public servants and individual citizens should demand greater visibility into the source code and more input as to what criteria factor into these risk assessment tools, rather than deferring to third-party contractors.[12] Similarly, the introduction of “regular independent audits” could allay concerns about waiting until the planned 2023 review.[13] Such audits, coupled with transparency, might enable more iterative development of the machine learning algorithms at the heart of SB 10, allowing targeted improvement at a more rapid pace. While streamlined and transparent iteration might address operational issues in the near term, however, it remains to be seen whether these interventions could account for more structural biases in the underlying datasets. Needless to say, the stakes of this machine learning experiment are high. Individuals deemed high-risk are detained in jail; ensuring that machine learning produces appropriately proportionate pretrial outcomes is critical.
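What might a “regular independent audit” actually examine? One common fairness diagnostic, sketched below with invented records, compares false positive rates across demographic groups: the share of people in each group who were flagged high-risk but did not, in fact, re-offend or flee. A persistent gap between groups would be evidence of exactly the kind of bias critics fear. The data fields and numbers here are hypothetical; a real audit would draw on court outcome records.

```python
# Hypothetical audit: compare the tool's false positive rate (flagged
# high-risk but did not re-offend or flee) across demographic groups.
# Records below are invented for illustration only.
def false_positive_rate(records, group):
    """FPR within `group`: share of non-reoffenders flagged high-risk."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    if not negatives:
        return float("nan")
    flagged = [r for r in negatives if r["tier"] == "high"]
    return len(flagged) / len(negatives)

audit_log = [
    {"group": "A", "tier": "high", "reoffended": False},
    {"group": "A", "tier": "low",  "reoffended": False},
    {"group": "B", "tier": "high", "reoffended": False},
    {"group": "B", "tier": "high", "reoffended": True},
    {"group": "B", "tier": "high", "reoffended": False},
    {"group": "B", "tier": "low",  "reoffended": False},
]

for g in ("A", "B"):
    print(g, round(false_positive_rate(audit_log, g), 2))
# A 0.5, B 0.67 -> a persistent gap like this would trigger review
```

Notably, an auditor needs only the tool’s inputs and outputs to run this kind of check, not its source code, which is one reason audits are often proposed as a complement to (rather than a substitute for) full transparency.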

As applied to pretrial detention and criminal justice reform more broadly, therefore, machine learning may address certain problems while exacerbating others. Are the latter more significant than the former? Are there other considerations, unacknowledged here, that accompany the introduction of machine learning into criminal justice reform and decision-making?

(711 words)

[1] Bryan Anderson and Alexei Koseff, “Vacant governor’s mansion + Bail measure has the votes + California Priorities summit today,” Sacramento Bee, November 9, 2018, https://www.sacbee.com/news/politics-government/capitol-alert/article221389490.html.

[2] American Bar Association, “Steps in a Trial,” December 2, 2013, https://www.americanbar.org/groups/public_education/resources/law_related_education_network/how_courts_work/bail/.

[3] Dave Gershgorn, “California just replaced cash bail with algorithms,” Quartz, September 4, 2018, https://qz.com/1375820/california-just-replaced-cash-bail-with-algorithms/.

[4] “Gov. Brown signs bill eliminating money bail in California,” The Mercury News, August 28, 2018, https://www.mercurynews.com/2018/08/28/gov-brown-signs-bill-eliminating-money-bail-in-california/.

[5] Alexei Koseff, “Jerry Brown signs bill eliminating money bail in California,” Sacramento Bee, August 28, 2018, https://www.sacbee.com/news/politics-government/capitol-alert/article217461380.html.

[6] Building Watson: Not So Elementary, My Dear! (Abridged), “Appendix.”

[7] Hayley Tsukayama and Jamie Williams, “If a Pre-trial Risk Assessment Tool Does Not Satisfy These Criteria, It Needs to Stay Out of the Courtroom,” Electronic Frontier Foundation, November 6, 2018, https://www.eff.org/deeplinks/2018/11/if-pre-trial-risk-assessment-tool-does-not-satisfy-these-criteria-it-needs-stay.

[8] Ibid.

[9] Dave Gershgorn, “California just replaced cash bail with algorithms,” Quartz, September 4, 2018, https://qz.com/1375820/california-just-replaced-cash-bail-with-algorithms/.

[10] Hayley Tsukayama and Jamie Williams, “If a Pre-trial Risk Assessment Tool Does Not Satisfy These Criteria, It Needs to Stay Out of the Courtroom,” Electronic Frontier Foundation, November 6, 2018, https://www.eff.org/deeplinks/2018/11/if-pre-trial-risk-assessment-tool-does-not-satisfy-these-criteria-it-needs-stay.

[11] Dave Gershgorn, “California just replaced cash bail with algorithms,” Quartz, September 4, 2018, https://qz.com/1375820/california-just-replaced-cash-bail-with-algorithms/.

[12] Hayley Tsukayama and Jamie Williams, “If a Pre-trial Risk Assessment Tool Does Not Satisfy These Criteria, It Needs to Stay Out of the Courtroom,” Electronic Frontier Foundation, November 6, 2018, https://www.eff.org/deeplinks/2018/11/if-pre-trial-risk-assessment-tool-does-not-satisfy-these-criteria-it-needs-stay.

[13] Ibid.

 


Student comments on Lady Justice and the Machines: An Algorithmic Approach to Criminal Justice Reform

  1. Criminal justice reform is badly needed in the United States, particularly reform of our current cash bail system. Detaining people of lesser socioeconomic means before trial simply because they cannot pay bail is unjust; it fuels a two-tiered justice system, one for the wealthy and one for the poor. California and other states are taking bold steps to address this problem by introducing algorithms that evaluate risk. As the author acknowledged, these algorithms might have a dark side. Unconscious biases might be “programmed” into the algorithm, which must weigh various factors to determine the risk that someone will not appear for trial or will be re-arrested. For example, people of color living in communities with elevated crime levels could be deemed high-risk and jailed before trial as a result. Local authorities may have too much discretion to decide what is considered high risk, and prosecutors and judges may then keep more people in jail. This technology could intensify racial biases and enable an increase in pretrial incarceration. While the current cash bail system certainly needs to be replaced, I have my doubts about whether algorithms are the best option. Machine learning may simply reinforce many of the problems that already plague the criminal justice system, like widespread social biases against communities of color and people from lower socioeconomic backgrounds.
