41 – An Algorithm Said He Was Guilty #ArtificialDecisions #MCC

Eric Loomis was arrested in Wisconsin in 2013. He was accused of driving a car used in a drive-by shooting. The case moved fast. But his sentence rested on something unexpected: a software program.

It was called COMPAS, short for Correctional Offender Management Profiling for Alternative Sanctions. An algorithm that estimates someone’s likelihood of reoffending. It used Eric’s data, from his age and prior offenses to his neighborhood and his answers to a questionnaire, and gave him a score. High. Too high.
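
What does that black box look like in practice? Below is a minimal sketch, in Python, of the general shape of such a system. Everything in it is a hypothetical stand-in: the features, the weights, the 7.0 cutoff, and the risk_score function are invented for illustration, because COMPAS’s real inputs and weights are trade secrets and have never been published.

```python
# Purely illustrative: the features, weights, and threshold below are
# invented stand-ins, not COMPAS. The point is the shape of the system:
# data in, opaque arithmetic, a single label out.

def risk_score(age, prior_offenses, answers):
    """Toy linear risk score over the kinds of inputs the post describes."""
    score = 0.0
    score += max(0, 45 - age) * 0.1   # hypothetical: younger scores higher
    score += prior_offenses * 1.5     # hypothetical: each prior adds weight
    score += sum(answers) * 0.5       # hypothetical questionnaire weighting
    return score

THRESHOLD = 7.0  # hypothetical cutoff between "low" and "high" risk

score = risk_score(age=31, prior_offenses=3, answers=[1, 0, 1, 1])
label = "high risk" if score >= THRESHOLD else "low risk"
print(f"score={score:.1f} -> {label}")  # the defendant sees only the label
```

One number crosses a hidden line, and the label flips. Because the weights are secret, no one outside the company can check them.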

The judge saw the score and followed it. “You’re high risk,” the sentence read. So no alternative punishment. Just prison time.

Eric wasn’t allowed to know how the software worked. It was a trade secret. No one could verify whether it was accurate, fair, or trained on clean data. It was a black box. And yet it was used to decide his freedom.

He appealed. All the way to the Wisconsin Supreme Court. He lost. The algorithm stayed. Inside the courtroom, no one questioned it.

But outside the courtroom, people started paying attention. The case made international headlines. Civil rights groups, researchers, journalists began to ask: can a machine decide a sentence without revealing how it thinks? What if it’s biased? What if it’s wrong?

Today, similar algorithms are used to decide who gets a loan, a job, an insurance policy. Closed systems, inaccessible, with no appeal. And most people don’t even know it.

But if the machine can’t be questioned, it can’t be corrected. And then it’s not justice. It’s just automation.

Eric Loomis was judged by a system he couldn’t even see. And no one took responsibility.

This is the new injustice: no one deciding, everyone obeying.

#ArtificialDecisions #MCC #CamisaniCalzolari #MarcoCamisaniCalzolari

Marco Camisani Calzolari
marcocamisanicalzolari.com/biography
