Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Monday, September 17, 2018

Who Is Experiencing What Kind of Moral Distress?

Carina Fourie
AMA J Ethics. 2017;19(6):578-584.

Abstract

Moral distress, according to Andrew Jameton’s highly influential definition, occurs when a nurse knows the morally correct action to take but is constrained in some way from taking this action. The definition of moral distress has been broadened, first, to include morally challenging situations that give rise to distress but which are not necessarily linked to nurses feeling constrained, such as those associated with moral uncertainty. Second, moral distress has been broadened so that it is not confined to the experiences of nurses. However, such a broadening of the concept does not mean that the kind of moral distress being experienced, or the role of the person experiencing it, is morally irrelevant. I argue that differentiating between categories of distress—e.g., constraint and uncertainty—and between groups of health professionals who might experience moral distress is potentially morally relevant and should influence the analysis, measurement, and amelioration of moral distress in the clinic.

The info is here.

How our lives end must no longer be a taboo subject

Kathryn Mannix
The Guardian
Originally published August 16, 2018

Here is an excerpt:

As we age and develop long-term health conditions, our chances of becoming suddenly ill rise; prospects for successful resuscitation fall; our youthful assumptions about length of life may be challenged; and our quality of life becomes increasingly more important to us than its length. The number of people over the age of 85 will double in the next 25 years, and dementia is already the biggest cause of death in this age group. What discussions do we need to have, and to repeat at sensible intervals, to ensure that our values and preferences are understood by the people who may be asked about them?

Our families need to know our answers to such questions as: how much treatment is too much or not enough? Do we see artificial hydration and nutrition as “treatment” or as basic care? Is life at any cost or quality of life more important to us? And what gives us quality of life? A 30-year-old attorney may not understand that being able to hear birdsong, or enjoy ice-cream, or follow the racing results, is more important to a family’s 85-year-old relative than being able to walk or shop. When we are approaching death, what important things should our carers know about us?

The info is here.

Sunday, September 16, 2018

Time to abandon grand ethical theories?

Julian Baggini
The Times Literary Supplement
Originally posted May 22, 2018

Here are two excerpts:

Social psychologists, sociologists and anthropologists would not be baffled by this apparent contradiction. Many have long believed that morality is essentially a system of social regulation. As such it is in no more need of a divine foundation or a philosophical justification than folk dancing or tribal loyalty. Indeed, if ethics is just the management of the social sphere, it should not be surprising that as we live in a more globalized world, ethics becomes enlarged to encompass not only how we treat kith and kin but our distant neighbours too.

Philosophers have more to worry about. They are not generally satisfied to see morality as a purely pragmatic means of keeping the peace. To see the world muddling through morality is deeply troubling. Where’s the consistency? Where’s the theoretical framework? Where’s the argument?

(cut)

There is then a curious combination of incoherence and vagueness about just what it is to be ethical, and a bogus precision in the ways in which organizations prove themselves to be good. All this confusion helps fuel philosophical ethics, which has become a vibrant, thriving discipline, providing academic presses with a steady stream of books. Looking over a sample of their recent output, it is evident that moral philosophers are keen to show that they are not just playing intellectual games and that they have something to offer the world.

The info is here.

Saturday, September 15, 2018

Social Science One And How Top Journals View The Ethics Of Facebook Data Research

Kalev Leetaru
Forbes.com
Originally posted on August 13, 2018

Here is an excerpt:

At the same time, Social Science One’s decision to leave all ethical questions TBD and to eliminate the right to informed consent or the ability to opt out of research fundamentally redefines what it means to conduct research in the digital era, normalizing the removal of these once sacred ethical tenets. Given the refusal of one of its committee members to provide replication data for his own study and the statement by another committee member that “I have articulated the argument that ToS are not, and should not be considered, ironclad rules binding the activities of academic researchers. … I don't think researchers should reasonably be expected to adhere to such conditions, especially at a time when officially sanctioned options for collecting social media data are disappearing left and right,” the result is an ethically murky landscape in which it is unclear just where Social Science One draws the line at what it will or will not permit.

Given Facebook’s new focus on “privacy first,” I asked the company whether it would commit to offering its two billion users a new profile setting allowing them to opt out of having their data made available to academic researchers such as Social Science One. As it has repeatedly done in the past, the company declined to comment.

The info is here.

Friday, September 14, 2018

Law, Ethics, and Conversations between Physicians and Patients about Firearms in the Home

Alexander D. McCourt and Jon S. Vernick
AMA J Ethics. 2018;20(1):69-76.

Abstract

Firearms in the home pose risks to household members, including homicide, suicide, and unintentional death. Medical societies urge clinicians to counsel patients about those risks as part of sound medical practice. Depending on the circumstances, clinicians might recommend safe firearm storage, temporary removal of the firearm from the home, or other measures. Certain state firearm laws, however, might present legal and ethical challenges for physicians who counsel patients about guns in the home. Specifically, we discuss state background check laws for gun transfers, safe gun storage laws, and laws forbidding physicians from engaging in certain firearm-related conversations with their patients. Medical professionals should be aware of these and other state gun laws but should offer anticipatory guidance when clinically appropriate.

The info is here.

What Are “Ethics in Design”?

Victoria Sgarro
slate.com
Originally posted August 13, 2018

Here is an excerpt:

As a product designer, I know that no mandate exists to integrate these ethical checks and balances in our process. While I may hear a lot of these issues raised at speaking events and industry meetups, more “practical” considerations can overshadow these conversations in my day-to-day decision making. When they have to compete with the workaday pressures of budgets, roadmaps, and clients, these questions won’t emerge as priorities organically.

Most important, then, is action. Castillo worries that the conversation about “ethics in design” could become a cliché, like “empathy” or “diversity” in tech, where it’s more talk than walk. She says it’s not surprising that ethics in tech hasn’t been addressed in depth in the past, given the industry’s lack of diversity. Because most tech employees come from socially privileged backgrounds, they may not be as attuned to ethical concerns. A designer who identifies with society’s dominant culture may have less personal need to take another perspective. Indeed, identification with a society’s majority is shown to be correlated with less critical awareness of the world outside of yourself. Castillo says that, as a black woman in America, she’s a bit wary of this conversation’s effectiveness if it remains only a conversation.

“You know how someone says, ‘Why’d you become a nurse or doctor?’ And they say, ‘I want to help people’?” asks Castillo. “Wouldn’t it be cool if someone says, ‘Why’d you become an engineer or a product designer?’ And you say, ‘I want to help people.’ ”

The info is here.

Thursday, September 13, 2018

How Should Clinicians Respond to Requests from Patients to Participate in Prayer?

A. R. Christensen, T. E. Cook, and R. M. Arnold
AMA J Ethics. 2018;20(7):E621-629.

Abstract

Over the past 20 years, physicians have shifted from viewing a patient’s request for prayer as a violation of professional boundaries to a question deserving nuanced understanding of the patient’s needs and the clinician’s boundaries. In this case, Mrs. C’s request for prayer can reflect religious distress, anxiety about her clinical circumstances, or a desire to better connect with her physician. These different needs suggest that it is important to understand the request before responding. To do this well requires that Dr. Q not be emotionally overwhelmed by the request and that she has skill in discerning potential reasons for the request.

The info is here.

Meet the Chatbots Providing Mental Health Care

Daniela Hernandez
Wall Street Journal
Originally published Aug. 9, 2018

Here is an excerpt:

Wysa Ltd., a London- and Bangalore-based startup, is testing a free chatbot to teach adolescents emotional resilience, said co-founder Ramakant Vempati.  In the app, a chubby penguin named Wysa helps users evaluate the sources of their stress and provides tips on how to stay positive, like thinking of a loved one or spending time outside.  The company said its 400,000 users, most of whom are under 35, have had more than 20 million conversations with the bot.

Wysa is a wellness app, not a medical intervention, Vempati said, but it relies on cognitive behavioral therapy, mindfulness techniques and meditations that are “known to work in a self-help context.”  If a user expresses thoughts of self-harm, Wysa reminds them that it’s just a bot and provides contact information for crisis hotlines.  Alternatively, for $30 a month, users can access unlimited chat sessions with a human “coach.”  Other therapy apps, such as Talkspace, offer similar low-cost services with licensed professionals.

Chatbots have potential, said Beth Jaworski, a mobile apps specialist at the National Center for PTSD in Menlo Park, Calif.  But definitive research on whether they can help patients with more serious conditions, like major depression, still hasn’t been done, in part because the technology is so new, she said.  Clinicians also worry about privacy.  Mental health information is sensitive data; turning it over to companies could have unforeseen consequences.

The article is here.

Wednesday, September 12, 2018

How Could Commercial Terms of Use and Privacy Policies Undermine Informed Consent in the Age of Mobile Health?

Cynthia E. Schairer, Caryn Kseniya Rubanovich, and Cinnamon S. Bloss
AMA J Ethics. 2018;20(9):E864-872.

Abstract

Granular personal data generated by mobile health (mHealth) technologies coupled with the complexity of mHealth systems creates risks to privacy that are difficult to foresee, understand, and communicate, especially for purposes of informed consent. Moreover, commercial terms of use, to which users are almost always required to agree, depart significantly from standards of informed consent. As data use scandals increasingly surface in the news, the field of mHealth must advocate for user-centered privacy and informed consent practices that motivate patients’ and research participants’ trust. We review the challenges and relevance of informed consent and discuss opportunities for creating new standards for user-centered informed consent processes in the age of mHealth.

The info is here.