Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Friday, July 27, 2018

Morality in the Machines

Erick Trickey
Harvard Law Bulletin
Originally posted June 26, 2018

Here is an excerpt:

In February, the Harvard and MIT researchers endorsed a revised approach in the Massachusetts House’s criminal justice bill, which calls for a bail commission to study risk-assessment tools. In late March, the House-Senate conference committee included the more cautious approach in its reconciled criminal justice bill, which passed both houses and was signed into law by Gov. Charlie Baker in April.

Meanwhile, Harvard and MIT scholars are going still deeper into the issue. Bavitz and a team of Berkman Klein researchers are developing a database of governments that use risk scores to help set bail. It will be searchable to see whether court cases have challenged a risk-score tool’s use, whether that tool is based on peer-reviewed scientific literature, and whether its formulas are public.

Many risk-score tools are created by private companies that keep their algorithms secret. That lack of transparency creates due-process concerns, says Bavitz. “Flash forward to a world where a judge says, ‘The computer tells me you’re a risk score of 5 out of 7.’ What does it mean? I don’t know. There’s no opportunity for me to lift up the hood of the algorithm.” Instead, he suggests governments could design their own risk-assessment algorithms and software, using their own staff or collaborating with foundations or researchers.
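
To make the contrast concrete, here is a minimal sketch of what a fully transparent, points-based risk score could look like. Every factor name, weight, and cut point below is invented for illustration; it does not reproduce the formula of any real tool.

# A minimal, fully transparent risk-score sketch in Python.
# The factors, weights, and cut points are hypothetical, chosen
# only to show what an auditable formula looks like.

FACTOR_WEIGHTS = {
    "prior_failures_to_appear": 2,
    "pending_charge_at_arrest": 1,
    "prior_violent_conviction": 2,
}

# Raw-point totals map onto a 1-7 scale, echoing the
# "risk score of 5 out of 7" example quoted above:
# each cut point the total exceeds moves the score up one step.
CUT_POINTS = [0, 1, 2, 3, 4, 5]


def risk_score(factors: dict[str, int]) -> tuple[int, list[str]]:
    """Return a 1-7 score plus a human-readable audit trail."""
    total = 0
    reasons = []
    for name, weight in FACTOR_WEIGHTS.items():
        count = factors.get(name, 0)
        points = weight * count
        total += points
        reasons.append(f"{name}={count} x weight {weight} -> {points} points")
    scale = 1 + sum(1 for cut in CUT_POINTS if total > cut)
    return min(scale, 7), reasons


if __name__ == "__main__":
    score, audit = risk_score({"prior_failures_to_appear": 2,
                               "pending_charge_at_arrest": 1})
    print(f"score: {score}/7")
    for line in audit:
        print(" ", line)

Because the weights and cut points sit in plain view, a judge or defendant can trace exactly how a “5 out of 7” was produced, which is precisely the hood-lifting a proprietary tool forecloses.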

Students in the ethics class agreed that risk-score programs shouldn’t be used in court if their formulas aren’t transparent, according to then-HLS 3L Arjun Adusumilli. “When people’s liberty interests are at stake, we really expect a certain amount of input, feedback and appealability,” he says. “Even if the thing is statistically great, and makes good decisions, we want reasons.”

The information is here.