Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Tuesday, July 15, 2025

Medical AI and Clinician Surveillance — The Risk of Becoming Quantified Workers

Cohen, I. G., Ajunwa, I., & Parikh, R. B. (2025).
New England Journal of Medicine.
Advance online publication.

Here is an excerpt:

There are several ways in which AI-based monitoring tools designed to benefit patients and clinicians might be used for clinician surveillance. First, ambient AI scribe tools, which transcribe and interpret patient and clinician speech to generate a structured note, have been rapidly adopted with a goal of reducing the burden associated with documentation and improving documentation accuracy. But ambient dictation systems introduce new capabilities for monitoring clinicians. By analyzing speech patterns, sentiment, and content, health care systems could use AI scribes to assess how often clinicians’ recommendations deviate from institutional guidelines.

In addition, these systems could detect “efficiency outliers” — clinicians who spend more time conversing with patients than employers consider ideal, at the expense of conducting new-patient visits or more total visits. Ambient monitoring is especially worrisome, given cases of employers terminating the contracts of physicians who didn’t meet visit-time expectations. Akin to automated quality-improvement dashboards for tracking adherence to chronic-disease–management standards, AI models may generate performance scores on the basis of adherence to scripted protocols, average time spent with each patient, or degree of shared decision making, which could be inferred with the use of linguistic analysis. Even if these metrics are established to support quality-improvement goals, hospitals and health care systems could leverage them for evaluations of clinicians or performance-based reimbursement adjustments.
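To make the mechanism concrete before turning to my thoughts: the "efficiency outlier" metric described above is technically trivial to build, which is part of what makes it worrisome. The following toy Python sketch is my own illustration, not part of the excerpt; the field names (clinician_id, visit_minutes) and the one-standard-deviation threshold are hypothetical assumptions, not anything the authors specify. It simply shows how visit-length data captured by an ambient scribe could be turned into an outlier flag.

    # Illustrative only: a toy example of turning visit-transcript metadata
    # into an "efficiency outlier" flag of the kind the excerpt warns about.
    # All field names and the threshold are hypothetical assumptions.
    from statistics import mean, stdev

    # Hypothetical per-visit records, as an ambient scribe system might log them.
    visits = [
        {"clinician_id": "A", "visit_minutes": 22},
        {"clinician_id": "A", "visit_minutes": 25},
        {"clinician_id": "B", "visit_minutes": 14},
        {"clinician_id": "B", "visit_minutes": 12},
        {"clinician_id": "C", "visit_minutes": 35},
        {"clinician_id": "C", "visit_minutes": 40},
    ]

    # Average visit length per clinician.
    per_clinician = {}
    for v in visits:
        per_clinician.setdefault(v["clinician_id"], []).append(v["visit_minutes"])
    averages = {c: mean(times) for c, times in per_clinician.items()}

    # Flag anyone more than one standard deviation above the group mean, the
    # kind of blunt metric that could be repurposed for evaluations or
    # reimbursement adjustments.
    overall_mean = mean(averages.values())
    overall_sd = stdev(averages.values())
    outliers = {c: avg for c, avg in averages.items() if avg > overall_mean + overall_sd}

    print(averages)   # {'A': 23.5, 'B': 13.0, 'C': 37.5}
    print(outliers)   # {'C': 37.5}

A few lines of aggregation are enough to label the clinician who spends the most time with patients as the "problem," with no information about case complexity, patient needs, or quality of care.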

Here are some thoughts:

This article matters to psychologists because it explores the psychological and ethical ramifications of AI-driven surveillance in health care, concerns that parallel those in mental health practice. The quantification of clinicians through tools such as ambient scribes and communication analytics threatens professional autonomy and can contribute to burnout, stress, and reduced job satisfaction, all key areas of study in occupational and health psychology. The tension between algorithmic conformity and individualized care likewise mirrors a familiar challenge in therapeutic settings, where standardized protocols may conflict with personalized treatment. Psychologists can contribute expertise in human behavior, workplace dynamics, and ethical frameworks to advocate for AI integration that balances institutional goals with clinician well-being and patient-centered care. The article also highlights equity concerns: surveillance may disproportionately affect marginalized clinicians, which aligns with psychology's focus on systemic inequities.