Predicting Crime

Recidivism risk algorithms – Should AI decide who goes free and who goes to jail?

Berten Verbeeck & Cem Pektas


The prevalence of algorithmic assessment tools has been growing rapidly in recent years. Across industries, organizations have been developing use cases for predictive analytical models, ranging from determining your car insurance premium to setting your credit score.


Many of these practices are now well established. Yet, in 2016, major controversy was stirred when ProPublica published an investigation into the use of recidivism risk algorithms in America’s criminal justice system. In their article, ProPublica’s researchers lay out how they analyzed data from a widely used recidivism risk assessment tool called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions).


The idea behind COMPAS is simple: Have artificial intelligence process enormous quantities of data on issues such as age, sex, employment, current criminal charge, number of past convictions, etc. to provide advice on whether or not a specific individual will commit another offense in the future. Proponents of the system have argued that it helps judges and parole officers make better-informed, data-driven decisions. However, ProPublica’s investigation claims that COMPAS is biased against African Americans. In their article, the authors state: “Black defendants were often predicted to be at a higher risk of recidivism than they actually were. Our analysis found that black defendants who did not recidivate over a two-year period were nearly twice as likely to be misclassified as higher risk compared to their white counterparts (45 percent vs. 23 percent).”
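The disparity ProPublica describes is, at its core, a gap in false positive rates between groups: among people who did not reoffend, how many were nonetheless labeled high risk? As an illustrative sketch only – using small synthetic records, not the actual COMPAS data – the metric can be computed per group like this:

```python
# Illustrative sketch of the fairness metric at issue: the false positive
# rate (non-recidivists misclassified as high risk), computed per group.
# The records below are synthetic, not drawn from the COMPAS dataset.

def false_positive_rate(records):
    """Share of people who did NOT reoffend but were still flagged high risk."""
    non_recidivists = [r for r in records if not r["reoffended"]]
    if not non_recidivists:
        return 0.0
    false_positives = [r for r in non_recidivists if r["predicted_high_risk"]]
    return len(false_positives) / len(non_recidivists)

# Synthetic example: identical outcomes, different error rates by group.
data = [
    {"group": "A", "predicted_high_risk": True,  "reoffended": False},
    {"group": "A", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
    {"group": "B", "predicted_high_risk": False, "reoffended": False},
]

for g in ("A", "B"):
    subset = [r for r in data if r["group"] == g]
    print(g, false_positive_rate(subset))  # A → 0.5, B → 0.0
```

The point of the per-group comparison is that an algorithm can look accurate overall while its errors fall disproportionately on one group – which is exactly the 45% vs. 23% gap ProPublica reported.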


While some later researchers have criticized ProPublica’s methodology – and have argued that algorithm-based tools vastly outperform humans in predicting recidivism – there is no doubt that systems such as COMPAS risk perpetuating established biases. When considering the context of America’s criminal justice system, this is particularly worrisome for two reasons. First, the stakes are incredibly high, with individuals’ most fundamental possession – their freedom – in the balance. Second, America’s criminal justice system has a well-documented history of racial bias, and it is not unthinkable that African Americans – when evaluated based on an algorithm – may find themselves discriminated against once again due to overrepresentation in past arrest data.


This does not mean we have to do away with recidivism risk algorithms altogether. Tools such as COMPAS can be valuable in highly complex cases, where vast numbers of factors – more than the human mind can process – need to be taken into account. Yet, two fundamental conditions need to be put in place before authorizing their use. First, the algorithm should never replace judges. It can be leveraged as one input into a judge’s overall decision making – a complement to existing methods – but can never be allowed to make decisions independent of any human. Second, any judge who uses the algorithm must be able to adequately interpret and explain its workings. A judge must be able to determine when certain situational factors are not considered by the algorithm, and then be capable of overriding the algorithm’s recommendations. When a judge does decide to follow the algorithm’s advice, he/she must be able to spell out in clear language what drove the decision-making. As per the Fifth Amendment, any defendant has the right to “due process of law”. This includes the right to understand how one’s algorithmic score was calculated and the right to challenge the score’s accuracy. Blindly following an algorithm would mean depriving defendants of their right to due process.


Student comments on Predicting Crime

  1. Bert and Cem, thank you for covering a fascinating topic. I appreciate your evaluation of the possible ways to counter potential flaws of this tool.

    My first reaction to this risk assessment tool was mixed. First of all, it immediately brought to mind anecdotal evidence of recidivists “being back at it” and causing harm again, which frustrates society. That makes any tool that would strengthen judges’ ability to make the “right” decision appealing and seemingly justified (given they use it as a supplement, along with the ‘checklist’ you described). On the other hand, the mindset shift I experienced watching the recent cases of Archie Williams or Robert DuBois is truly terrifying, especially given the racial biases you mentioned. Either way, it is a high-stakes decision with no margin for error. Bias is a very sensitive and current topic.

    DNA tests really transformed the justice system, largely thanks to trust in the science behind them. I found those statistics fascinating. I wonder if one way COMPAS could build trust and strengthen its position is by exploring risk assessment of wrongful convictions as well. That type of data is hard to come by, but could be interesting for them to explore.

    A judge equipped with a set of tools, checklists, and regulations for data analytics could also be equipped with a balanced score of the potential risk of an erroneous conviction in the first place. Maybe it would help – behavioral economists suggest balancing the hope of the “benefits of change” with the fear of the “cost of no change” – not for an individual case, as we cannot connect them, but on a systemic level.
