Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Mistakes. Show all posts

Saturday, July 23, 2022

Concrete Over Abstract: Experimental Evidence of Reflective Equilibrium in Population Ethics

Schoenegger, P., & Grodeck, B. (forthcoming). In H. Viciana, F. Aguiar, & A. Gaitan (Eds.), Issues in Experimental Moral Philosophy. Routledge.

Abstract
One central method of ethics is narrow reflective equilibrium, which concerns the conflict between intuitions about general moral principles and intuitions about concrete cases. In these conflicts, general principles are refined, or judgements about concrete cases change to accommodate the principles, until no more conflicts exist. In this paper, we present empirical data on this method in the context of population ethics. We conduct an online experiment (n=543) on Prolific in which participants endorse a number of moral principles related to population ethics. They also judge specific population-ethical cases that may conflict with their endorsed principles. When conflicts arise, they can choose to revoke the principle, revise their intuition about a case, or continue without having resolved the conflict. We find that participants are significantly more likely to revoke their endorsements of general principles than to revise their judgements about concrete cases. This evidence suggests that for a lay population, case judgements play a central revisionary role in reflective equilibrium reasoning in the context of population ethics.

Discussion

Our main result is that when participants’ choices produce a conflict between their endorsed abstract principles and their judgements on concrete cases, they prefer to revoke the previously endorsed principle rather than change or revoke their judgement about the concrete population-ethical case. Our findings are relevant to theorizing about reflective equilibrium. Specifically, we take these results to indicate that in lay moral reasoning, case judgements do play a major revisionary role. While we find that some participants want to maintain consistency with the abstract principles, the evidence shows that participants put more weight on their concrete choices.

(cut)

As a secondary interest, we also tested whether presenting participants with the abstract principles first and then the concrete cases, or the reverse, changes their endorsement rates of these principles. We found no statistically significant effects for the Control, Person Affectism, or Pareto principles, though for both versions of Utilitarianism we did find order effects. This drop in endorsement rates provides further evidence for the claim above that once participants are presented with concrete cases they can form judgements on, they are less likely to endorse the principles (and, if they had already endorsed them, more likely to revoke their endorsement). This adds both to the literature on order effects in social psychology and experimental philosophy and to our understanding of folk utilitarian morality.

Thursday, March 19, 2020

Responding to Unprofessional Behavior by Trainees — A “Just Culture” Framework

J. A. Wasserman, M. Redinger, and T. Gibb
New England Journal of Medicine
February 20, 2020
doi: 10.1056/NEJMms1912591

Professionalism lapses by trainees can be addressed productively if viewed through a lens of medical error, drawing on “just culture” principles. With this approach, educators can promote a formative learning environment while fairly addressing problematic behaviors.

Addressing lapses in professionalism is critical to professional development. Yet characterizing the ways in which the behavior of emerging professionals may fall short and responding to those behaviors remain difficult.

Catherine Lucey suggests that we “consider professionalism lapses to be either analogous to or a form of medical error,” in order to create “a ‘just environment’ in which people are encouraged to report professionalism challenges, lapses, and near misses.” Applying a framework of medical error promotes an understanding of professionalism as a set of skills whose acquisition requires a psychologically safe learning environment.

 Lucey and Souba also note that professionalism sometimes requires one to act counter to one’s other interests and motivations (e.g., to subordinate one’s own interests to those of others); the skills required to navigate such dilemmas must be acquired over time, and therefore trainees’ behavior will inevitably sometimes fall short.

We believe that lapses in professional behavior can be addressed productively if we view them through this lens of medical error, drawing on “just culture” principles and related procedural approaches.

(cut)

The Just Culture Approach

Thanks to a movement catalyzed by an Institute of Medicine report, error reduction has become a priority of health systems over the past two decades. Their efforts have involved creating a “culture of psychological safety” that allows for open dialogue, dissent, and transparent reporting. Early iterations involved “blame free” approaches, which have increasingly given way to an emphasis on balancing individual and system accountability.

Drawing on these just culture principles, a popular approach for defining and responding to medical error recognizes the qualitative differences among inadvertent human error, at-risk behavior, and reckless behavior (the Institute for Safe Medication Practices also provides an excellent elaboration of these categories).

“Inadvertent human errors” result from suboptimal individual functioning, but without intention or the knowledge that a behavior is wrong or error-prone (e.g., an anesthesiologist inadvertently grabbing a paralyzing agent instead of a reversal agent). These errors are not considered blameworthy, and proper response involves consolation and assessment of systemic changes to prevent them in the future.

Wednesday, March 4, 2020

How Common Mental Shortcuts Can Cause Major Physician Errors

Anupam B. Jena and Andrew R. Olenski
The New York Times
Originally posted February 20, 2020

Here is an excerpt:

In health care, such unconscious biases can lead to disparate treatment of patients and can affect whether similar patients live or die.

Sometimes these cognitive biases are simple overreactions to recent events, what psychologists term availability bias. One study found that when patients experienced an unlikely adverse side effect of a drug, their doctor was less likely to order that same drug for the next patient whose condition might call for it, even though the efficacy and appropriateness of the drug had not changed.

A similar study found that when mothers giving birth experienced an adverse event, their obstetrician was more likely to switch delivery modes for the next patient (C-section vs. vaginal delivery), regardless of the appropriateness for that next patient. This cognitive bias resulted in both higher spending and worse outcomes.

Doctor biases don’t affect treatment decisions alone; they can shape the profession as a whole. A recent study analyzed gender bias in surgeon referrals and found that when the patient of a female surgeon dies, the physician who made the referral to that surgeon sends fewer patients to all female surgeons in the future. The study found no such decline in referrals for male surgeons after a patient death.

This list of biases is far from exhaustive, and though they may be disconcerting, uncovering new systematic mistakes is critical for improving clinical practice.

The info is here.

Friday, July 20, 2018

The Psychology of Offering an Apology: Understanding the Barriers to Apologizing and How to Overcome Them

Karina Schumann
Current Directions in Psychological Science 
Vol. 27, Issue 2, pp. 74-78
First Published March 8, 2018

Abstract

After committing an offense, a transgressor faces an important decision regarding whether and how to apologize to the person who was harmed. The actions he or she chooses to take after committing an offense can have dramatic implications for the victim, the transgressor, and their relationship. Although high-quality apologies are extremely effective at promoting reconciliation, transgressors often choose to offer a perfunctory apology, withhold an apology, or respond defensively to the victim. Why might this be? In this article, I propose three major barriers to offering high-quality apologies: (a) low concern for the victim or relationship, (b) perceived threat to the transgressor’s self-image, and (c) perceived apology ineffectiveness. I review recent research examining how these barriers affect transgressors’ apology behavior and describe insights this emerging work provides for developing methods to move transgressors toward more reparative behavior. Finally, I discuss important directions for future research.

The article is here.

Wednesday, June 13, 2018

The Burnout Crisis in American Medicine

Rena Xu
The Atlantic
Originally published May 11, 2018

Here is an excerpt:

In medicine, burned-out doctors are more likely to make medical errors, work less efficiently, and refer their patients to other providers, increasing the overall complexity (and with it, the cost) of care. They’re also at high risk of attrition: A survey of nearly 7,000 U.S. physicians, published last year in the Mayo Clinic Proceedings, reported that one in 50 planned to leave medicine altogether in the next two years, while one in five planned to reduce clinical hours over the next year. Physicians who self-identified as burned out were more likely to follow through on their plans to quit.

What makes the burnout crisis especially serious is that it is hitting us right as the gap between the supply and demand for health care is widening: A quarter of U.S. physicians are expected to retire over the next decade, while the number of older Americans, who tend to need more health care, is expected to double by 2040. While it might be tempting to point to the historically competitive rates of medical-school admissions as proof that the talent pipeline for physicians won’t run dry, there is no guarantee. Last year, for the first time in at least a decade, the volume of medical school applications dropped—by nearly 14,000, according to data from the Association of American Medical Colleges. By the association’s projections, we may be short 100,000 physicians or more by 2030.

The article is here.

Wednesday, April 18, 2018

Why it’s a bad idea to break the rules, even if it’s for a good cause

Robert Wiblin
80000hours.org
Originally posted March 20, 2018

How honest should we be? How helpful? How friendly? If our society claims to value honesty, for instance, but in reality accepts an awful lot of lying – should we go along with those lax standards? Or, should we attempt to set a new norm for ourselves?

Dr Stefan Schubert, a researcher at the Social Behaviour and Ethics Lab at Oxford University, has been modelling this in the context of the effective altruism community. He thinks people trying to improve the world should hold themselves to very high standards of integrity, because their minor sins can impose major costs on the thousands of others who share their goals.

In addition, when a norm is uniquely important to our situation, we should be willing to question society and come up with something different and hopefully better.

But in other cases, we can be better off sticking with whatever our culture expects, to save time, avoid making mistakes, and ensure others can predict our behaviour.

The key points and podcast are here.

Friday, March 23, 2018

Mark Zuckerberg Has No Way Out of Facebook's Quagmire

Leonid Bershidsky
Bloomberg News
Originally posted March 21, 2018

Here is an excerpt:

"Making sure time spent on Facebook is time well spent," as Zuckerberg puts it, should lead to the collection of better-quality data. If nobody is setting up fake accounts to spread disinformation, users are more likely to be their normal selves. Anyone analyzing these healthier interactions will likely have more success in targeting commercial and, yes, political offerings to real people. This would inevitably be a smaller yet still profitable enterprise, and no longer a growing one, at least in the short term. But the Cambridge Analytica scandal shows people may not be okay with Facebook's data gathering, improved or not.

The scandal follows the revelation (to most Facebook users who read about it) that, until 2015, application developers on the social network's platform were able to get information about a user's Facebook friends after asking permission in the most perfunctory way. The 2012 Obama campaign used this functionality. So -- though in a more underhanded way -- did Cambridge Analytica, which may or may not have used the data to help elect President Donald Trump.

Many people are angry at Facebook for not acting more resolutely to prevent CA's abuse, but if that were the whole problem, it would have been enough for Zuckerberg to apologize and point out that the offending functionality hasn't been available for several years. The #deletefacebook campaign -- now backed by WhatsApp co-founder Brian Acton, whom Facebook made a billionaire -- is, however, powered by a bigger problem than that. People are worried about the data Facebook is accumulating about them and about how these data are used. Facebook itself works with political campaigns to help them target messages; it did so for the Trump campaign, too, perhaps helping it more than CA did.

The article is here.

First Question: Should you stop using Facebook because they violated your trust?

Second Question: Is Facebook a defective product?

Monday, June 26, 2017

Antecedents and Consequences of Medical Students’ Moral Decision Making during Professionalism Dilemmas

Lynn Monrouxe, Malissa Shaw, and Charlotte Rees
AMA Journal of Ethics. June 2017, Volume 19, Number 6: 568-577.

Abstract

Medical students often experience professionalism dilemmas (which differ from ethical dilemmas) wherein students sometimes witness and/or participate in patient safety, dignity, and consent lapses. When faced with such dilemmas, students make moral decisions. If students’ action (or inaction) runs counter to their perceived moral values—often due to organizational constraints or power hierarchies—they can suffer moral distress, burnout, or a desire to leave the profession. If moral transgressions are rationalized as being for the greater good, moral distress can decrease as dilemmas are experienced more frequently (habituation); if no learner benefit is seen, distress can increase with greater exposure to dilemmas (disturbance). We suggest how medical educators can support students’ understandings of ethical dilemmas and facilitate their habits of enacting professionalism: by modeling appropriate resistance behaviors.

The article is here.