Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Neuroimaging.

Thursday, October 26, 2023

The Neuroscience of Trust

Paul J. Zak
Harvard Business Review
Originally posted January-February 2017

Here is an excerpt:

The Return on Trust

After identifying and measuring the managerial behaviors that sustain trust in organizations, my team and I tested the impact of trust on business performance. We did this in several ways. First, we gathered evidence from a dozen companies that have launched policy changes to raise trust (most were motivated by a slump in their profits or market share). Second, we conducted the field experiments mentioned earlier: In two businesses where trust varies by department, my team gave groups of employees specific tasks, gauged their productivity and innovation in those tasks, and gathered very detailed data—including direct measures of brain activity—showing that trust improves performance. And third, with the help of an independent survey firm, we collected data in February 2016 from a nationally representative sample of 1,095 working adults in the U.S. The findings from all three sources were similar, but I will focus on what we learned from the national data since it’s generalizable.

By surveying the employees about the extent to which firms practiced the eight behaviors, we were able to calculate the level of trust for each organization. (To avoid priming respondents, we never used the word “trust” in surveys.) The U.S. average for organizational trust was 70% (out of a possible 100%). Fully 47% of respondents worked in organizations where trust was below the average, with one firm scoring an abysmally low 15%. Overall, companies scored lowest on recognizing excellence and sharing information (67% and 68%, respectively). So the data suggests that the average U.S. company could enhance trust by improving in these two areas—even if it didn’t improve in the other six.

The effect of trust on self-reported work performance was powerful. Respondents whose companies were in the top quartile indicated they had 106% more energy and were 76% more engaged at work than respondents whose firms were in the bottom quartile. They also reported being 50% more productive—which is consistent with our objective measures of productivity from studies we have done with employees at work. Trust had a major impact on employee loyalty as well: Compared with employees at low-trust companies, 50% more of those working at high-trust organizations planned to stay with their employer over the next year, and 88% more said they would recommend their company to family and friends as a place to work.
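
To make the survey arithmetic above concrete, here is a minimal sketch (in Python) of how employee ratings of the eight behaviors might be rolled up into an organizational trust score and compared against the 70% national average. The behavior labels beyond the two named in the excerpt, the 0-10 rating scale, and the sample data are assumptions for illustration only, not Zak's actual instrument.

```python
# Minimal illustrative sketch (not the study's actual methodology):
# roll up hypothetical employee ratings of eight trust-building behaviors
# into a single organizational trust score on a 0-100% scale.

from statistics import mean

# Two behaviors are named in the excerpt; the rest are placeholder labels.
BEHAVIORS = ["recognizing_excellence", "sharing_information"] + [
    f"behavior_{i}" for i in range(3, 9)
]

US_AVERAGE = 70  # national average reported in the excerpt (%)


def org_trust_score(responses, scale_max=10):
    """Average each employee's eight ratings, then average across employees
    and rescale to a 0-100% organizational trust score."""
    per_employee = [mean(r[b] for b in BEHAVIORS) for r in responses]
    return 100 * mean(per_employee) / scale_max


# Hypothetical data: three employees rating their organization on a 0-10 scale.
sample = [
    {b: 7 for b in BEHAVIORS},
    {b: 6 for b in BEHAVIORS},
    {b: 8 for b in BEHAVIORS},
]

score = org_trust_score(sample)
print(f"Organizational trust score: {score:.0f}%")
print("below the U.S. average" if score < US_AVERAGE else "at or above the U.S. average")
```

The study itself went further, comparing organizations by quartile of scores like this; the sketch only illustrates the aggregation step.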


Here is a summary of the key points from the article:
  • Trust is crucial for social interactions and has implications for economic, political, and healthcare outcomes. There are two main types of trust: emotional trust and cognitive trust.
  • Emotional trust develops early in life through attachments and is more implicit, while cognitive trust relies on reasoning and develops later. Both rely on brain regions involved in reward, emotion regulation, understanding others' mental states, and decision making.
  • Oxytocin and vasopressin play key roles in emotional trust by facilitating social bonding and attachment. Disruptions to these systems are linked to social disorders like autism.
  • The prefrontal cortex, amygdala, and striatum are involved in cognitive trust judgments and updating trustworthiness based on new evidence. Damage to prefrontal regions impairs updating of trustworthiness.
  • Trust engages the brain's reward circuitry. Betrayals of trust activate pain and emotion regulation circuits. Trustworthiness cues engage the mentalizing network for inferring others' intentions.
  • Neuroimaging studies show that trust engages brain regions involved in reward, emotion regulation, understanding mental states, and decision making. Oxytocin administration increases trusting behavior.
  • Understanding the neuroscience of trust can inform efforts to build trust in healthcare, economic, political, and other social domains. More research is needed on how trust develops over the lifespan.

Monday, March 6, 2023

Cognitive control and dishonesty

Speer, S. P., Smidts, A., & Boksem, M. A. (2022).
Trends in Cognitive Sciences, 26(9), 796–808.
https://doi.org/10.1016/j.tics.2022.06.005

Abstract

Dishonesty is ubiquitous and imposes substantial financial and social burdens on society. Intuitively, dishonesty results from a failure of willpower to control selfish behavior. However, recent research suggests that the role of cognitive control in dishonesty is more complex. We review evidence that cognitive control is not needed to be honest or dishonest per se, but that it depends on individual differences in what we call one’s ‘moral default’: for those who are prone to dishonesty, cognitive control indeed aids in being honest, but for those who are already generally honest, cognitive control may help them cheat to occasionally profit from small acts of dishonesty. Thus, the role of cognitive control in (dis)honesty is to override the moral default.

Significance

The precise role of cognitive control in dishonesty has been debated for many years, but now important strides have been made to resolve this debate.

Recently developed paradigms that allow for investigating dishonesty on the level of the choice rather than on the level of the individual have substantially improved our understanding of the adaptive role of cognitive control in (dis)honesty.

These new paradigms revealed that the role of cognitive control differs across people: for cheaters, it helps them to sometimes be honest, while for those who are generally honest, it allows them to cheat on occasion. Thus, cognitive control is not required for (dis)honesty per se but is required to override one’s moral default to be either honest or to cheat.

Individual differences in moral default are driven by balancing motivation for reward and upholding a moral self-image.

From Concluding remarks

The Will and Grace hypotheses have been debated for quite some time, but recently important strides have been made to resolve this debate. Key elements in this proposed resolution are (i) recognizing that there is heterogeneity between individuals, some default more towards honesty, whereas others have a stronger inclination towards dishonesty; (ii) recognizing that there is heterogeneity within individuals, cheaters can be honest sometimes and honest people do cheat on occasion; and (iii) the development of experimental paradigms that allow dishonesty to be investigated on the level of the choice, rather than only on the level of the individual or the group. These developments have substantially enhanced understanding of the role of cognitive control in (dis)honesty: it is not required for being honest or dishonest per se, but it is required to override one’s moral default to either be honest or to cheat (Figure 1).

These insights open up novel research agendas and offer suggestions as to how to develop interventions to curtail dishonesty. Our review suggests three processes that may be targeted by such interventions: reward seeking, self-referential thinking, and cognitive control. Shaping contexts in ways that are conducive to honesty by targeting these processes may go a long way to increase honesty in everyday behavior.

Friday, January 22, 2021

Brain Scans Confirm There's a Part of You That Remains 'You' Throughout Your Life

Mike McRae
Science Alert
Originally published 27 Nov 20

At the very core of your identity, a kernel of self-awareness combines memories of the past with the fleeting sensations of the present, and adds a touch of anticipation for the future.

The question of whether this ongoing sense of 'you' is as robust as it feels has intrigued philosophers and psychologists throughout the ages. A new, small psychobiological study weighs in, looking at brain scans to conclude that at least some part of you is indeed consistent as you grow and age.

"In our study, we tried to answer the question of whether we are the same person throughout our lives," says Miguel Rubianes, a neuroscientist from the Complutense University of Madrid.

"In conjunction with the previous literature, our results indicate that there is a component that remains stable while another part is more susceptible to change over time."

Self-continuity forms the very basis of identity. Every time you use the word 'I', you're referring to a thread that stitches a series of experiences into a tapestry of a lifetime, representing a relationship between the self of your youth with one yet to emerge.

Yet identity is more than the sum of its parts. Consider the allegory of Theseus's ship, or the grandfather's axe paradox – a tool that's had its shaft replaced, as well as its head, but is still somehow the same axe that belonged to grandfather.

If our experiences change us, swapping out components of our identity with every heartbreak and every promotion, every illness and every windfall, can we truly still say we see ourselves as the same person today as we were when we were four years old?

You can be forgiven for thinking this sounds more like philosophical navel-gazing than something science can address. But there are perspectives which psychology – and even the wiring of our neurological programming – can flesh out.

Friday, May 25, 2018

The $3-Million Research Breakdown

Jodi Cohen
www.propublica.org
Originally published April 26, 2018

Here is an excerpt:

In December, the university quietly paid a severe penalty for Pavuluri’s misconduct and its own lax oversight, after the National Institute of Mental Health demanded weeks earlier that the public institution — which has struggled with declining state funding — repay all $3.1 million it had received for Pavuluri’s study.

In issuing the rare rebuke, federal officials concluded that Pavuluri’s “serious and continuing noncompliance” with rules to protect human subjects violated the terms of the grant. NIMH said she had “increased risk to the study subjects” and made any outcomes scientifically meaningless, according to documents obtained by ProPublica Illinois.

Pavuluri’s research is also under investigation by two offices in the U.S. Department of Health and Human Services: the inspector general’s office, which examines waste, fraud and abuse in government programs, according to subpoenas obtained by ProPublica Illinois, and the Office of Research Integrity, according to university officials.

The article is here.

Friday, October 28, 2016

How Large Is the Role of Emotion in Judgments of Moral Dilemmas?

Zachary Horne and Derek Powell
PLoS ONE
Originally published: July 6, 2016

Abstract

Moral dilemmas often pose dramatic and gut-wrenching emotional choices. It is now widely accepted that emotions are not simply experienced alongside people’s judgments about moral dilemmas, but that our affective processes play a central role in determining those judgments. However, much of the evidence purporting to demonstrate the connection between people’s emotional responses and their judgments about moral dilemmas has recently been called into question. In the present studies, we reexamined the role of emotion in people’s judgments about moral dilemmas using a validated self-report measure of emotion. We measured participants’ specific emotional responses to moral dilemmas and, although we found that moral dilemmas evoked strong emotional responses, we found that these responses were only weakly correlated with participants’ moral judgments. We argue that the purportedly strong connection between emotion and judgments of moral dilemmas may have been overestimated.

The article is here.

Tuesday, July 14, 2015

‘Ethical responsibility’ or ‘a whole can of worms’: Differences in opinion on incidental finding review and disclosure in neuroimaging research from focus group discussions with participants, parents, IRB members, investigators, physicians and community members

Caitlin Cole, Linda E Petree, John P Phillips, Jody M Shoemaker, Mark Holdsworth, Deborah L Helitzer
J Med Ethics doi:10.1136/medethics-2014-102552

Abstract
Purpose 
To identify the specific needs, preferences and expectations of the stakeholders impacted by returning neuroimaging incidental findings to research participants.

Methods
Six key stakeholder groups were identified to participate in focus group discussions at our active neuroimaging research facility: Participants, Parents of child participants, Investigators, Institutional Review Board (IRB) Members, Physicians and Community Members. A total of 151 subjects attended these discussions. Transcripts were analysed using principles of Grounded Theory and group consensus coding.

Results 
A series of similar and divergent themes were identified across our subject groups. Similarities included beliefs that it is ethical for researchers to disclose incidental findings as it grants certain health and emotional benefits to participants. All stakeholders also recognised the potential psychological and financial risks to disclosure. Divergent perspectives elucidated consistent differences between our ‘Participant’ subjects (Participants, Parents, Community Members) and our ‘Professional’ subjects (IRB Members, Investigators and Physicians). Key differences included (1) what results should be reported, (2) participants’ autonomous right to research information and (3) the perception of the risk–benefit ratio in managing results.

Conclusions 
Understanding the perceived impact on all stakeholders involved in the process of disclosing incidental findings is necessary to determine appropriate research management policy. Our data further demonstrate the challenge of this task as different stakeholders evaluate the balance between risk and benefit related to their unique positions in this process. These findings offer some of the first qualitative insight into the expectations of the diverse stakeholders affected by incidental finding disclosure.

The entire article is here.