Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Friday, May 5, 2023

Is the world ready for ChatGPT therapists?

Ian Graber-Stiehl
Nature.com
Originally posted 3 May 2023

Since 2015, Koko, a mobile mental-health app, has tried to provide crowdsourced support for people in need. Text the app to say that you’re feeling guilty about a work issue, and an empathetic response will come through in a few minutes — clumsy perhaps, but unmistakably human — to suggest some positive coping strategies.

The app might also invite you to respond to another person’s plight while you wait. To help with this task, an assistant called Kokobot can suggest some basic starters, such as “I’ve been there”.

But last October, some Koko app users were given the option to receive much-more-complete suggestions from Kokobot. These suggestions were preceded by a disclaimer, says Koko co-founder Rob Morris, who is based in Monterey, California: “I’m just a robot, but here’s an idea of how I might respond.” Users were able to edit or tailor the response in any way they felt was appropriate before they sent it.

What they didn’t know at the time was that the replies were written by GPT-3, the powerful artificial-intelligence (AI) tool that can process and produce natural text, thanks to a massive written-word training set. When Morris eventually tweeted about the experiment, he was surprised by the criticism he received. “I had no idea I would create such a fervour of discussion,” he says.
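Koko has not published the details of this pipeline, but the workflow Morris describes (GPT-3 drafts a reply, the draft is labelled with a disclaimer, and a human supporter edits it before anything is sent) can be sketched roughly as follows. The model choice, prompt wording and function names below are illustrative assumptions built on the OpenAI completion interface of that era, not Koko's actual code.

```python
# Hypothetical sketch of the human-in-the-loop drafting flow described above.
# Not Koko's implementation: the prompt, model and function names are assumptions.
import openai  # legacy (pre-1.0) OpenAI Python SDK

DISCLAIMER = "I'm just a robot, but here's an idea of how I might respond."

def draft_reply(post_text: str) -> str:
    """Ask a GPT-3-era completion model for a draft peer-support reply."""
    response = openai.Completion.create(
        model="text-davinci-002",  # assumed GPT-3 model
        prompt=(
            "Write a brief, empathetic peer-support reply to this post:\n\n"
            f"{post_text}\n\nReply:"
        ),
        max_tokens=120,
        temperature=0.7,
    )
    return response["choices"][0]["text"].strip()

def suggest_to_user(post_text: str) -> str:
    """Label the AI draft before the human supporter sees it."""
    return f"{DISCLAIMER}\n\n{draft_reply(post_text)}"

# The human supporter edits or discards the suggestion; only the
# human-approved text is ever sent to the person asking for support.
```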

(cut)

Automated therapist

Koko is far from the first platform to implement AI in a mental-health setting. Broadly, machine-learning-based AI has been implemented or investigated in the mental-health space in three roles.

The first has been the use of AI to analyse therapeutic interventions, to fine-tune them down the line. Two high-profile examples, ieso and Lyssn, train their natural-language-processing AI on therapy-session transcripts. Lyssn, a program developed by scientists at the University of Washington in Seattle, analyses dialogue against 55 metrics, from providers’ expressions of empathy to the employment of cognitive behavioural therapy (CBT) interventions. ieso, a provider of text-based therapy based in Cambridge, UK, has analysed more than half a million therapy sessions, tracking the outcomes to determine the most effective interventions. Both essentially give digital therapists notes on how they’ve done, but each service aims to provide a real-time tool eventually: part advising assistant, part grading supervisor.
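Neither company's models or metric definitions are public. Purely as a toy illustration of the general idea of scoring therapist turns against such metrics, the sketch below uses crude keyword heuristics; the marker lists and metric names are stand-in assumptions, not Lyssn's or ieso's methods, which rely on trained natural-language-processing models.

```python
# Toy illustration of "grading" a therapy transcript against a couple of metrics.
# The keyword heuristics and metric names are stand-ins, not either company's method.
from collections import Counter

EMPATHY_MARKERS = {"understand", "hear you", "that sounds", "makes sense"}
CBT_MARKERS = {"thought record", "reframe", "evidence for", "behavioural experiment"}

def score_turn(utterance: str) -> Counter:
    """Count crude signals of empathy and CBT technique in one therapist turn."""
    text = utterance.lower()
    scores = Counter()
    scores["empathy"] = sum(marker in text for marker in EMPATHY_MARKERS)
    scores["cbt_technique"] = sum(marker in text for marker in CBT_MARKERS)
    return scores

def score_session(therapist_turns: list[str]) -> Counter:
    """Aggregate per-turn scores into session-level feedback for the therapist."""
    total = Counter()
    for turn in therapist_turns:
        total += score_turn(turn)
    return total

if __name__ == "__main__":
    session = [
        "I hear you, and that sounds exhausting.",
        "Could we reframe that thought and look at the evidence for it?",
    ]
    print(score_session(session))  # e.g. Counter({'empathy': 2, 'cbt_technique': 2})
```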

The second role for AI has been in diagnosis. A number of platforms, such as the REACH VET program for US military veterans, scan a person’s medical records for red flags that might indicate issues such as self-harm or suicidal ideation. This diagnostic work, says Torous, is probably the most immediately promising application of AI in mental health, although he notes that most of the nascent platforms require much more evaluation. Some have struggled. Earlier this year, MindStrong, a nearly decade-old app that initially aimed to leverage AI to identify early markers of depression, collapsed despite early investor excitement and a high-profile scientist co-founder, Tom Insel, the former director of the US National Institute of Mental Health.