Wisconsin v. Loomis:
The Continuing Saga Of John Henry v. the Steam-Powered Hammer?
Julie Blackman, Ph.D.
John Henry was an African American folk hero and a “steel-driving man.” He hammered steel drills into rock to make holes for explosives that cleared the way for railroad tunnels. According to legend and song, he competed in a race against a steam-powered hammer. He won the race but died with his hammer in his hand. The stress of his exertion stilled his heart.
While attention to the role of artificial intelligence in courtroom decision-making is new, the value-laden competition between humans and machines has a long history. Although the Supreme Court has not yet decided whether to grant certiorari, we should pay close attention to the issues raised by Wisconsin v. Loomis.
Eric L. Loomis was sentenced to six years following a conviction for fleeing the police in a car. The sentencing judge was informed by a proprietary risk-assessment algorithm called Compas, sold by Northpointe Inc. The algorithm generated bar charts indicating that Mr. Loomis posed a high risk of violence, a high risk of recidivism, and a high pretrial risk. On appeal, Loomis asked that the factors that drove the algorithm be disclosed. Northpointe refused, citing its intellectual property rights. Loomis’ conviction was upheld.
Justice Ann Walsh Bradley, writing for the Wisconsin Supreme Court, said that the Compas report added valuable information and that Mr. Loomis’ sentence would have been the same without it. Even so, the court seemed concerned about relying on an undisclosed algorithm to send a man to prison. Justice Bradley discussed a report from ProPublica about Compas that revealed that black defendants in Broward County, Florida, “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism.”
In the end, though, Justice Bradley allowed sentencing judges to use Compas. She wrote that the software could help by “providing the sentencing court with as much information as possible in order to arrive at an individualized sentence.” (See New York Times article by Adam Liptak, “Sent to Prison by a Software Program’s Secret Algorithm”.)
The Loomis case has reached the door of the Supreme Court. It raises issues not only about the reach of non-disclosure rights – here in a non-commercial context, since Loomis is no business competitor – but also about the replacement of human reasoning by a software algorithm. Are computers better than humans at setting prison sentences that comport with justice? Or is attention to race, for example, inherently unjust – and, contrary to Justice Bradley’s opinion, de-individualizing rather than individualizing? How can a reliance on aggregate data be individualizing, and how can such data be relied on, especially when the stakes are so high?
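The tension between aggregate data and individualized judgment can be made concrete with a toy actuarial risk score. Compas’ actual method is undisclosed, so everything below – the factors, the weights, the cutoffs – is a hypothetical illustration of how such tools generally work, not a reconstruction of Northpointe’s software:

```python
# A hypothetical actuarial risk score -- NOT Compas' actual (undisclosed)
# algorithm. It illustrates the point in the text: the weights come from
# aggregate historical data, so the "individualized" score is really a
# group statistic applied to one person. Two defendants with identical
# inputs receive identical scores, no matter what the feature list missed.

# Illustrative weights, as if fit on aggregate reoffense data.
WEIGHTS = {
    "prior_arrests": 0.5,
    "age_under_25": 1.0,
    "unemployed": 0.4,
}

def risk_score(defendant: dict) -> float:
    """Weighted sum of group-derived factors for one individual."""
    return sum(WEIGHTS[k] * defendant.get(k, 0) for k in WEIGHTS)

def risk_band(score: float) -> str:
    """Cut the continuous score into the bar-chart bands a judge sees."""
    if score < 1.0:
        return "low"
    if score < 2.0:
        return "medium"
    return "high"

a = {"prior_arrests": 3, "age_under_25": 1, "unemployed": 0}
b = {"prior_arrests": 3, "age_under_25": 1, "unemployed": 0}

# Identical inputs yield identical outputs: the assessment cannot see
# anything about either person beyond its aggregate-derived features.
print(risk_score(a), risk_band(risk_score(a)))
```

The sketch shows why disclosure matters: without knowing the factors and weights, a defendant cannot contest whether the inputs describing him are accurate, or whether the cutoffs that turn a number into a "high risk" bar are fair.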
Computer reasoning continues to reach new and greater heights. Artificial intelligence may defeat chess champions or masters of Go (a game that is popular in Asia). Physicians may turn to computers for diagnostic wisdom; even umpires now defer to the truth of the camera’s record.
If only humans were as accurate and as able to reckon with as many separate factors at once as are machines. Or, if only humans – especially those in positions of power – gave the truth its due. Especially in these turbulent political times, rife with a blatant disregard for the truth, are we more likely to believe that computers can save us from our own human shortcomings? How important are the ineffable qualities of human kindness and acceptance of others’ differences, for example, at a time when our highest public officials speak of travel bans and border walls? Does the current political chaos make it more likely that the courts will yield to the seemingly unflappable wisdom of artificial intelligence?
Especially now, with so much turmoil afoot, it seems crucial to cling to the role of human intelligence and our capacity for respect for the individual. Algorithms may be able to help us but ought not replace us. Nor should they be opaque. Transparency is crucial to justice.
Imagine how different our cultural ethos would be if John Henry had been able to work side-by-side with the steam-powered hammer rather than die trying to outpower it. Then, cooperation might have replaced competition as the driving ethic between artificial and human intelligence. Surely, Northpointe’s intellectual property rights are inapposite in this case. Northpointe should be obliged to share the details of its algorithm so that the factors that led up to the assignment of a six-year sentence in Mr. Loomis’ case can be fairly and completely judged. Hopefully, the Supreme Court will decide to hear this case and will overturn Wisconsin’s decision. Stay tuned. . .