Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Therapy.

Thursday, December 21, 2023

Chatbot therapy is risky. It’s also not useless

A.W. Ohlheiser
vox.com
Originally posted 14 Dec 23

Here is an excerpt:

So what are the risks of chatbot therapy?

There are some obvious concerns here: Privacy is a big one. That includes the handling of the training data used to make generative AI tools better at mimicking therapy as well as the privacy of the users who end up disclosing sensitive medical information to a chatbot while seeking help. There are also the biases built into many of these systems as they stand today, which often reflect and reinforce the larger systemic inequalities that already exist in society.

But the biggest risk of chatbot therapy — whether it’s poorly conceived or provided by software that was not designed for mental health — is that it could hurt people by not providing good support and care. Therapy is more than a chat transcript and a set of suggestions. Honos-Webb, who uses generative AI tools like ChatGPT to organize her thoughts while writing articles on ADHD but not for her practice as a therapist, noted that therapists pick up on a lot of cues and nuances that AI is not prepared to catch.

Stade, in her working paper, notes that while large language models have a “promising” capacity to conduct some of the skills needed for psychotherapy, there’s a difference between “simulating therapy skills” and “implementing them effectively.” She noted specific concerns around how these systems might handle complex cases, including those involving suicidal thoughts, substance abuse, or specific life events.

Honos-Webb gave the example of an older woman who recently developed an eating disorder. One level of treatment might focus specifically on that behavior: If someone isn’t eating, what might help them eat? But a good therapist will pick up on more than that. Over time, that therapist and patient might make the connection to recent life events: Maybe the patient’s husband recently retired. She’s angry because suddenly he’s home all the time, taking up her space.

“So much of therapy is being responsive to emerging context, what you’re seeing, what you’re noticing,” Honos-Webb explained. And the effectiveness of that work is directly tied to the developing relationship between therapist and patient.


Here is my take:

The promise of AI in mental health care balances on a knife’s edge. Chatbot therapy, with its alluring accessibility and anonymity, tempts us with a quick fix for the ever-growing burden of mental illness. Yet, as with any powerful tool, it can be both balm and poison, demanding wisdom in its ethical use.

On the one hand, imagine a world where everyone, regardless of location or circumstance, can find a non-judgmental ear, a gentle guide through the labyrinth of their own minds. Chatbots, tireless and endlessly patient, could offer a first step of support, a bridge to human therapy when needed. In the hushed hours of isolation, they could remind us we're not alone, providing solace and fostering resilience.

But let us not be lulled into a false sense of ease. Technology, however sophisticated, lacks the warmth of human connection, the nuanced understanding of a shared gaze, the empathy that breathes life into words. We must remember that a chatbot can never replace the irreplaceable – the human relationship at the heart of genuine healing.

Therefore, our embrace of chatbot therapy must be tempered with prudence. We must ensure adequate safeguards that keep these tools from masquerading as a panacea while neglecting the complex needs of human beings. Transparency is key – users must be aware of the limitations, of the algorithms whispering behind the chatbot’s words. Above all, let us never sacrifice the sacred space of therapy for the cold efficiency of code.

Chatbot therapy can be a bridge, a stepping stone, but never the destination. Let us use technology with wisdom, acknowledging its potential good while holding fast to the irreplaceable value of human connection in the intricate tapestry of healing. Only then can we, as mental health professionals, navigate the ethical tightrope and make this technology safe and effective, when and where possible.

Friday, January 20, 2023

Teaching Empathy to Mental Health Practitioners and Trainees

Ngo, H., Sokolovic, N., et al. (2022).
Journal of Consulting and Clinical Psychology,
90(11), 851–860.
https://doi.org/10.1037/ccp0000773

Objective:
Empathy is a foundational therapeutic skill and a key contributor to client outcome, yet the best combination of instructional components for its training is unclear. We sought to address this by investigating the most effective instructional components (didactic, rehearsal, reflection, observation, feedback, mindfulness) and their combinations for teaching empathy to practitioners.

Method: 
Studies included were randomized controlled trials targeted to mental health practitioners and trainees, included a quantitative measure of empathic skill, and were available in English. A total of 36 studies (37 samples) were included (N = 1,616). Two reviewers independently extracted data. Data were pooled by using random-effects pairwise meta-analysis and network meta-analysis (NMA).

Results:
Overall, empathy interventions demonstrated a medium-to-large effect (d = .78, 95% CI [.58, .99]). Pairwise meta-analysis showed that one of the six instructional components was effective: didactic (d = .91 vs. d = .39, p = .02). None of the program characteristics (group vs. individual format, facilitator type, number of sessions) significantly impacted intervention effectiveness. No publication bias, risk of bias, or outliers were detected. NMA, which allows for an examination of instructional component combinations, revealed that didactic, observation, and rehearsal were among the most effective components operating in combination.

Conclusions:
We have identified instructional components, singly (didactic) and in combination (didactic, rehearsal, observation), that provide an efficient way to train empathy in mental health practitioners.

What is the public health significance of this article?

Empathy in mental health practitioners is a core skill associated with positive client outcomes, with evidence that it can be trained. This article provides an aggregation of evidence showing that didactic teaching, as well as trainees observing and practicing the skill, are the elements of training that are most important.

From the Discussion

Despite clear evidence on why empathy should be taught to mental health practitioners and how well empathy interventions work for other professionals, there has been no systematic integration of how empathy is best taught to those working in mental health. The present study sought to address this important gap by applying pairwise and network meta-analytic techniques. In effect, we were able to elucidate the efficacious “ingredients” for teaching empathy to mental health practitioners as well as the relative superiority of particular combinations of instructional components. Overall, the effect sizes of empathy interventions were in the moderate-to-large range (d = .78, 95% CI [.58, .99]), which is comparable to previous meta-analyses of randomized controlled trials (RCTs) of empathy interventions among medical students (d = .68; Fragkos & Crampton, 2020), health care practitioners (d = .80, Kiosses et al., 2016; d = .52, Winter et al., 2020), and mixed trainees (adjusted g = .51; Teding van Berkhout & Malouff, 2016). This effect size means that over 78% of those who underwent empathy training will score above the mean of the control group, a result that clearly supports empathy as a trainable skill.
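A quick way to see where that "over 78%" figure comes from: under the usual normal-theory reading of Cohen's d, the share of trained practitioners expected to score above the control-group mean (Cohen's U3) is the standard normal CDF evaluated at d. Here is a minimal sketch in Python (SciPy assumed available; the equal-variance normal model is the standard simplifying assumption behind U3, not something the paper spells out):

from scipy.stats import norm

# Cohen's U3: the share of the treated group expected to score above
# the control-group mean, assuming normally distributed outcomes with
# equal variances (a standard simplifying assumption, not stated in
# the paper itself).
def cohens_u3(d: float) -> float:
    return norm.cdf(d)

d, ci_low, ci_high = 0.78, 0.58, 0.99  # pooled d and 95% CI from the abstract

print(f"U3 at d = {d}: {cohens_u3(d):.1%}")                      # ~78.2%
print(f"U3 across the 95% CI: {cohens_u3(ci_low):.1%} "
      f"to {cohens_u3(ci_high):.1%}")                            # ~71.9% to ~83.9%

Running this gives roughly 78.2% at the point estimate, matching the authors' claim, with a plausible range of about 72% to 84% across the confidence interval.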

Thursday, September 13, 2018

Meet the Chatbots Providing Mental Health Care

Daniela Hernandez
Wall Street Journal
Originally published Aug. 9, 2018

Here is an excerpt:

Wysa Ltd., a London- and Bangalore-based startup, is testing a free chatbot to teach adolescents emotional resilience, said co-founder Ramakant Vempati.  In the app, a chubby penguin named Wysa helps users evaluate the sources of their stress and provides tips on how to stay positive, like thinking of a loved one or spending time outside.  The company said its 400,000 users, most of whom are under 35, have had more than 20 million conversations with the bot.

Wysa is a wellness app, not a medical intervention, Vempati said, but it relies on cognitive behavioral therapy, mindfulness techniques and meditations that are “known to work in a self-help context.”  If a user expresses thoughts of self-harm, Wysa reminds them that it’s just a bot and provides contact information for crisis hotlines.  Alternatively, for $30 a month, users can access unlimited chat sessions with a human “coach.”  Other therapy apps, such as Talkspace, offer similar low-cost services with licensed professionals.

Chatbots have potential, said Beth Jaworski, a mobile apps specialist at the National Center for PTSD in Menlo Park, Calif.  But definitive research on whether they can help patients with more serious conditions, like major depression, still hasn’t been done, in part because the technology is so new, she said.  Clinicians also worry about privacy.  Mental health information is sensitive data; turning it over to companies could have unforeseen consequences.

The article is here.

Friday, May 26, 2017

What is moral injury in veterans?

Holly Arrow and William Schumacher
The Conversation
Originally posted May 21, 2017

Here is an excerpt:

The moral conflict created by the violations of “what’s right” generates moral injury when the inability to reconcile wartime actions with a personal moral code creates lasting psychological consequences.

Psychiatrist Jonathan Shay, in his work with Vietnam veterans, defined moral injury as the psychological, social and physiological results of a betrayal of “what’s right” by an authority in a high-stakes situation. In “Achilles In Vietnam,” a book that examines the psychological devastation of war, a Vietnam veteran described a situation in which his commanding officers used tear gas on a village after the veteran and his unit had their gas masks rendered ineffective due to water damage. The veteran stated, “They gassed us almost to death.” This type of “friendly fire” incident is morally wounding in a way that attacks by an enemy are not.

Psychologist Brett Litz and his colleagues expanded this to include self-betrayal and identified “perpetrating, failing to prevent, bearing witness to, or learning about acts that transgress deeply held moral beliefs and expectations” as the cause of moral injury.

Guilt and moral injury

A research study published in 1991 identified combat-related guilt as the best predictor of suicide attempts among a sample of Vietnam veterans with PTSD. Details of the veterans’ experiences connected that guilt to morally injurious events.

The article is here.

Thursday, April 5, 2012

It’s Too Late to Apologize: Therapist Embarrassment and Shame

By Rebecca Klinger, Nicholas Ladany, and Lauren Kulp
The Counseling Psychologist
For reprints, contact Rebecca Klinger via the hyperlink provided


Abstract
The purpose of this study was to identify events in which therapists felt embarrassment, shame, or both in a therapy session and to investigate the relationship between these embarrassing-shameful events and therapists’ reactions. Ninety-three therapists participated in this study, and the most frequent events reported were making a scheduling mistake, forgetting or confusing client information, being visibly tired, falling asleep, and arriving late. Implications and the need for further research, particularly concerning the effects of therapist embarrassment and shame on therapy process and outcome, are discussed.

Introduction

Embarrassment and shame are common self-conscious emotions often addressed in the psychotherapy literature (Gilbert, 1997; Leith & Baumeister, 1998; Lewis, 1971; Tangney, 2002; Tracy & Robins, 2004). In fact, exploring the embarrassment and shame felt by clients is frequently an integral part of the therapeutic process (Gilbert, 1997; Pope, Sonne, & Greene, 2006; Sorotzkin, 1985). Therapist embarrassment and shame, however, have rarely been investigated even though therapist embarrassment and shame are believed to have an important effect on the therapeutic relationship (Pope et al., 2006) and client outcome (Covert, Tangney, Maddux, & Heleno, 2003; Leith & Baumeister, 1998; Pope et al., 2006). The primary purpose of our study was to identify events in which therapists felt embarrassment, shame, or both in a therapy session and the corresponding reactions of the therapist.



Thanks to Gary Schoener for this information.