As part of a bold effort at bail reform, the state of New Jersey replaced bail hearings with algorithmically informed risk assessments this year. Anyone’s eligible for release, no money down, if they meet certain criteria. To ensure unbiased, scientific decisions, judges use a machine-generated score.
The automated recommendation serves as a guide and doesn’t replace judicial discretion. Still, the program raises questions about the claimed neutrality of machine reasoning and the wisdom of relying on mechanical judgment.
While innocent until proven guilty is the mantra of US criminal law, defendants frequently stay in jail awaiting trial when judges decide they’re a flight risk or a threat, or they just can’t afford bail. It’s a process rife with human biases.
The Public Safety Assessment (PSA) formula that New Jersey is now using for bail—along with about 20 other jurisdictions that employ it less extensively—aims to make risk calculations neutral and purely evidence-based. It compares risks and outcomes in a database of 1.5 million cases from 300 jurisdictions nationwide, and produces a score of one to six for the defendant.
The PSA looks at three outcomes—appearing in court, committing a new crime, and committing a violent crime. It assesses the likelihood of these outcomes by weighing nine factors covering age, prior convictions, prior court appearances, and current charges, and calculates a score. The program ignores factors like race, ethnicity, and geography to ensure neutral algorithmic assessments, according to its nonprofit sponsors at the Laura and John Arnold Foundation.
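The foundation has not published its exact weighting here, but a point-based assessment of this general shape can be sketched in a few lines. Everything below—the factor names, weights, and thresholds—is hypothetical, meant only to illustrate how yes/no risk factors might be collapsed into a one-to-six score; it does not reproduce the PSA's actual formula:

```python
# Purely illustrative sketch of a point-based pretrial risk score.
# Factor names, weights, and the clamping scheme are hypothetical;
# the PSA's real nine factors and weighting are not reproduced here.

def risk_score(factors: dict) -> int:
    """Map yes/no risk factors to a 1-6 scale, like the score a judge sees."""
    # Hypothetical weights: more serious factors contribute more points.
    weights = {
        "pending_charge_at_arrest": 2,
        "prior_violent_conviction": 3,
        "prior_failure_to_appear": 2,
        "under_23_years_old": 1,
    }
    raw = sum(w for name, w in weights.items() if factors.get(name))
    # Clamp the raw point total into the one-to-six range.
    return min(6, max(1, 1 + raw))

score = risk_score({"prior_failure_to_appear": True, "under_23_years_old": True})
print(score)  # 4 under these hypothetical weights
```

The design choice worth noting is that a scheme like this is entirely determined by its inputs and weights: whatever biases are embedded in the underlying records pass straight through to the score.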
Algorithms are not exempt from accusations of bias, however. ProPublica reported in May, for example, that its review of risk scores assigned to defendants in Florida—by another program, not the PSA—found racial bias and criteria that didn’t accurately predict risk.
That may be because programs are socially constructed, built by implicitly biased people who may be mechanizing prevailing world views, reinforcing norms rather than disrupting them. And even if an algorithm’s designers are unbiased and the formula’s objectivity unquestionable, the decisions behind the data aren’t necessarily neutral, leading to potentially flawed calculations premised on records from an imperfect system.
Harvard Law School’s criminal justice policy program published a primer on bail reform (pdf) in 2016, laying out the legal and social dangers of economically premised release and calling for its widespread abolition. Among other problems, bail systems ultimately result in more criminal convictions for the indigent than the monied by forcing poor defendants to plead guilty and take probation deals just to get out of jail, rather than insisting that states prove guilt. Supervised release comes with many conditions, including fee payments, and violations land defendants back in jail, creating a vicious cycle of incarceration with widespread societal consequences.
The Eighth Amendment of the US Constitution bans excessive bail, but some jurisdictions have gone further. Federal courts and Washington DC mostly don’t demand bail, and many states have tried various reforms.
New Jersey’s start down the slippery slope of algorithmic justice raises questions about whether the formulas can ever be sufficiently nuanced to contend with the many complexities that criminal court judges are forced to consider when making difficult decisions every day. But so far, the state seems to be accomplishing at least its short-term goals. Fewer people are detained for lack of money today than in the past. A 2013 study found that almost 40% of defendants ordered out on bail languished in jail, unable to pay for their freedom. Now, they’re released, regardless of finances, based on their risk assessment.
According to the New York Times, of nearly 3,400 new criminal cases in New Jersey in January, fewer than 10% of defendants remained detained and under 1% got bail, while the rest were free to go.
Correction: A previous version of this story reported that the PSA considers prior charges; it considers only current charges. The story previously stated that ProPublica reviewed PSA scores assigned to defendants—those scores were assigned by other automated systems, not the PSA.