Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Guilt.

Saturday, September 23, 2023

Moral injury in post-9/11 combat-experienced military veterans: A qualitative thematic analysis.

Kalmbach, K. C., Basinger, E. D., et al. (2023).
Psychological Services. Advance online publication.


War zone exposure is associated with enduring negative mental health effects and poorer responses to treatment, in part because this type of trauma can entail crises of conscience or moral injury. Although a great deal of attention has been paid to posttraumatic stress disorder and fear-based physiological aspects of trauma and suffering, comparatively less attention has been given to the morally injurious dimension of trauma. Robust themes of moral injury were identified in interviews with 26 post-9/11 military veterans. Thematic analysis identified 12 themes that were subsumed under four categories reflecting changes, shifts, or ruptures in worldview, meaning making, identity, and relationships. Moral injury is a unique and challenging clinical construct with impacts on the individual as well as at every level of the social ecological system. Recommendations are offered for addressing moral injury in a military population; implications for community public health are noted.

Impact Statement

Military veterans who experienced moral injury—events that violate deeply held moral convictions or beliefs—reported fundamental changes following the morally injurious event (MIE). The MIE ruptured their worldview, or sense of right and wrong, and they struggled to reconcile a prior belief system or identity with their existence post-MIE. Absent a specific evidence-based intervention, clinicians are encouraged to consider adaptations to existing treatment models but to be aware that moral injury often does not respond to treatment as usual for PTSD or adjacent comorbid conditions.

The article is paywalled; the citation is noted above.

My addition:

The thematic analysis identified 12 themes related to moral injury, which were grouped into four categories:
  • Changes in worldview: Veterans who experienced moral injury often reported changes in their worldview, such as questioning their beliefs about the world, their place in it, and their own goodness.
  • Changes in meaning making: Veterans who experienced moral injury often struggled to make meaning of their experiences, which could lead to feelings of emptiness, despair, and hopelessness.
  • Changes in identity: Veterans who experienced moral injury often reported changes in their identity, such as feeling like they were no longer the same person they were before the war.
  • Changes in relationships: Veterans who experienced moral injury often reported changes in their relationships with family, friends, and others. They may have felt isolated, misunderstood, or ashamed of their experiences.

Thursday, March 23, 2023

Are there really so many moral emotions? Carving morality at its functional joints

Fitouchi, L., André, J., & Baumard, N.
To appear in L. Al-Shawaf & T. K. Shackelford (Eds.),
The Oxford Handbook of Evolution and the Emotions.
New York: Oxford University Press.


In recent decades, a large body of work has highlighted the importance of emotional processes in moral cognition. Since then, a heterogeneous bundle of emotions as varied as anger, guilt, shame, contempt, empathy, gratitude, and disgust has been proposed to play an essential role in moral psychology. However, the inclusion of these emotions in the moral domain often lacks a clear functional rationale, generating conflations between merely social and properly moral emotions. Here, we build on (i) evolutionary theories of morality as an adaptation for attracting others’ cooperative investments, and on (ii) specifications of the distinctive form and content of moral cognitive representations. On this basis, we argue that only indignation (“moral anger”) and guilt can be rigorously characterized as moral emotions, operating on distinctively moral representations. Indignation functions to reclaim benefits to which one is morally entitled, without exceeding the limits of justice. Guilt functions to motivate individuals to compensate their violations of moral contracts. By contrast, other proposed moral emotions (e.g., empathy, shame, disgust) appear only superficially associated with moral cognitive contents and adaptive challenges. Shame, by design, tracks social valuation rather than compliance with moral obligations, and the two are not necessarily aligned. Empathy functions to motivate prosocial behavior between interdependent individuals, independently of, and sometimes even in contradiction with, the prescriptions of moral intuitions. While disgust is often hypothesized to have acquired a moral role beyond its pathogen-avoidance function, we argue that both evolutionary rationales and psychological evidence for this claim remain inconclusive for now.


In this chapter, we have suggested that a specification of the form and function of moral representations leads to a clearer picture of moral emotions. In particular, it enables a principled distinction between moral and non-moral emotions, based on the particular types of cognitive representations they process. Moral representations have a specific content: they represent a precise quantity of benefits that cooperative partners owe each other, a legitimate allocation of costs and benefits that ought to be, irrespective of whether it is achieved by people’s actual behaviors. Humans intuit that they have a duty not to betray their coalition, that innocent people do not deserve to be harmed, that their partner has a right not to be cheated on. Moral emotions can thus be defined as superordinate programs orchestrating cognition, physiology, and behavior in accordance with the specific information encoded in these moral representations.

On this basis, indignation and guilt appear as prototypical moral emotions. Indignation (“moral anger”) is activated when one receives fewer benefits than one deserves, and recruits bargaining mechanisms to enforce the violated moral contract. Guilt, symmetrically, is sensitive to one’s failure to honor one’s obligations toward others, and motivates compensation to provide them the missing benefits they deserve. By contrast, often-proposed “moral” emotions – shame, empathy, disgust – seem not to function to compute distinctively moral representations of cooperative obligations, but serve other, non-moral functions – social status management, interdependence, and pathogen avoidance (Figure 2).

Friday, December 9, 2022

Neural and Cognitive Signatures of Guilt Predict Hypocritical Blame

Yu, H., Contreras-Huerta, L. S., et al. (2022).
Psychological Science, 0(0).


A common form of moral hypocrisy occurs when people blame others for moral violations that they themselves commit. It is assumed that hypocritical blamers act in this manner to falsely signal that they hold moral standards that they do not really accept. We tested this assumption by investigating the neurocognitive processes of hypocritical blamers during moral decision-making. Participants (62 adult UK residents; 27 males) underwent functional MRI scanning while deciding whether to profit by inflicting pain on others and then judged the blameworthiness of others’ identical decisions. Observers (188 adult U.S. residents; 125 males) judged participants who blamed others for making the same harmful choice to be hypocritical, immoral, and untrustworthy. However, analyzing hypocritical blamers’ behaviors and neural responses shows that hypocritical blame was positively correlated with conflicted feelings, neural responses to moral standards, and guilt-related neural responses. These findings demonstrate that hypocritical blamers may hold the moral standards that they apply to others.

Statement of Relevance

Hypocrites blame other people for moral violations they themselves have committed. Common perceptions of hypocrites assume they are disingenuous and insincere. However, the mental states and neurocognitive processes underlying hypocritical blamers’ behaviors are not well understood. We showed that people who hypocritically blamed others reported stronger feelings of moral conflict during moral decision-making, had stronger neural responses to moral standards in lateral prefrontal cortex, and exhibited more guilt-related neurocognitive processes associated with harming others. These findings suggest that some hypocritical blamers do care about the moral standards they use to condemn other people but sometimes fail to live up to those standards themselves, contrary to the common philosophical and folk perception.


In this study, we developed a laboratory paradigm to precisely quantify hypocritical blame, in which people blame others for committing the same transgressions they committed themselves (Todd, 2019). At the core of this operationalization of hypocrisy is a discrepancy between participants’ moral judgments and their behaviors in a moral decision-making task. Therefore, we measured participants’ choices in an incentivized moral decision-making task that they believed had real impact on their own monetary payoff and painful electric shocks delivered to a receiver. We then compared those choices with moral judgments they made a week later of other people in the same choice context. By comparing participants’ judgments with their own behaviors, we were able to quantify the degree to which they judge other people more harshly for making the same choices they themselves made previously (i.e., hypocritical blame).
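The judgment–behavior discrepancy described above can be sketched concretely. The following is a toy illustration of the operationalization, not the authors' actual scoring procedure; the function name and inputs are hypothetical:

```python
from statistics import mean

def hypocritical_blame(own_harmful, blame_of_others):
    """Toy hypocritical-blame index: mean blame a participant assigns to
    others for harmful choices, restricted to exactly those contexts where
    the participant made the same harmful choice themselves.

    own_harmful[i]     : True if the participant chose the harmful option
                         in context i
    blame_of_others[i] : blame rating the participant later gave another
                         person for making the harmful choice in context i
    """
    matched = [b for own, b in zip(own_harmful, blame_of_others) if own]
    return mean(matched) if matched else 0.0
```

On this toy index, a participant who chose the harmful option in contexts 1 and 3 and later rated others' identical choices at 5 and 7 would score 6.0; nonzero scores capture blame directed at choices the blamer also made.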

Monday, August 22, 2022

Meta-Analysis of Inequality Aversion Estimates

Nunnari, S., & Pozzi, M. (2022).
SSRN Electronic Journal.


Loss aversion is one of the most widely used concepts in behavioral economics. We conduct a large-scale interdisciplinary meta-analysis, to systematically accumulate knowledge from numerous empirical estimates of the loss aversion coefficient reported during the past couple of decades. We examine 607 empirical estimates of loss aversion from 150 articles in economics, psychology, neuroscience, and several other disciplines. Our analysis indicates that the mean loss aversion coefficient is between 1.8 and 2.1. We also document how reported estimates vary depending on the observable characteristics of the study design.


In this paper, we reported the results of a meta-analysis of empirical estimates of the inequality aversion coefficients in models of outcome-based other-regarding preferences à la Fehr and Schmidt (1999). We conducted both a frequentist analysis (using a multi-level random-effects model) and a Bayesian analysis (using a Bayesian hierarchical model) to provide a “weighted average” for α and β. The results from the two approaches are nearly identical and support the hypothesis of inequality concerns. From the frequentist analysis, we learn that the mean envy coefficient is 0.425 with a 95% confidence interval of [0.244, 0.606]; the mean guilt coefficient is, instead, 0.291 with a 95% confidence interval of [0.218, 0.363]. This means that, on average, an individual is willing to spend €0.41 to increase others’ earnings by €1 when ahead, and €0.74 to decrease others’ earnings by €1 when behind. The theoretical assumptions α ≥ β and 0 ≤ β < 1 are upheld in our empirical analysis, but we cannot conclude that the disadvantageous inequality coefficient is statistically greater than the coefficient for advantageous inequality. We also observe no correlation between the two parameters.
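For readers less familiar with this framework, the α (envy) and β (guilt) parameters come from the two-player Fehr–Schmidt (1999) utility function. A brief sketch of the model, together with the arithmetic that appears to underlie the quoted euro figures (our reconstruction, not the authors' derivation):

```latex
% Two-player Fehr--Schmidt utility for player i with payoffs (x_i, x_j):
U_i(x_i, x_j) = x_i - \alpha_i \max(x_j - x_i, 0) - \beta_i \max(x_i - x_j, 0)
% \alpha_i: aversion to disadvantageous inequality ("envy")
% \beta_i:  aversion to advantageous inequality ("guilt")

% The quoted willingness-to-pay figures are numerically consistent with
\frac{\beta}{1-\beta} = \frac{0.291}{0.709} \approx 0.41, \qquad
\frac{\alpha}{1-\alpha} = \frac{0.425}{0.575} \approx 0.74
```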

Monday, July 25, 2022

Morally Exhausted: Why Russian Soldiers are Refusing to Fight in the Unprovoked War on Ukraine

Timofei Rozhanskiy
Originally posted 23 July 22

Here is an excerpt:

I Had To Refuse So I Could Stay Alive

Russia’s troops in Ukraine are largely made up of contract soldiers: volunteer personnel who sign fixed-term contracts for service. The range of experience varies. Other units include troops from private military companies like Vagner, or specialized, semiautonomous units overseen by Chechnya’s strongman leader, Ramzan Kadyrov.

The discontent in Kaminsky’s 11th Brigade is not an isolated case, and there are indications that Russian commanders are trying different tactics to keep the problem from spiraling out of control: for example, publicly shaming soldiers who are refusing to fight.

In Buryatia, where the 11th Brigade is based, dozens of personnel have sought legal assistance from local activists, seeking to break their contracts and get out of service in Ukraine, for various reasons.

In the southern Russian town of Budyonnovsk, at the home base of the 205th Cossack Motorized Rifle Brigade, commanders have erected a “wall of shame” with the names, ranks, and photographs of some 300 soldiers who have disobeyed orders in the Ukraine war.

“They forgot their military oaths, the ceremonial promise, their vows of duty to their Fatherland,” the board reads.

In conversations via the Russian social media giant VK, several soldiers from the brigade disputed the circumstances behind their inclusion on the wall of shame. All asked that their names be withheld for fear of further punishment or retaliation by commanders.

“I understand everything, of course. I signed a contract. I’m supposed to be ready for any situation; this war, this special operation,” one soldier wrote. “But I was thinking, I’m still young; at any moment, a piece of shrapnel, a bullet could fly into my head.”

The soldier said he broke his contract and resigned from the brigade before the February 24 invasion, once he realized it was in fact going forward.

“I thought a long time about it and came to the decision. I understood that I had to refuse so I could stay alive,” he said. “I don’t regret it one bit.”

Friday, July 16, 2021

“False positive” emotions, responsibility, and moral character

Anderson, R. A., et al.
Cognition, Volume 214, September 2021, 104770


People often feel guilt for accidents—negative events that they did not intend or have any control over. Why might this be the case? Are there reputational benefits to doing so? Across six studies, we find support for the hypothesis that observers expect “false positive” emotions from agents during a moral encounter – emotions that are not normatively appropriate for the situation but still trigger in response to that situation. For example, if a person accidentally spills coffee on someone, most normative accounts of blame would hold that the person is not blameworthy, as the spill was accidental. Self-blame (and the guilt that accompanies it) would thus be an inappropriate response. However, in Studies 1–2 we find that observers rate an agent who feels guilt, compared to an agent who feels no guilt, as a better person, as less blameworthy for the accident, and as less likely to commit moral offenses. These attributions of moral character extend to other moral emotions like gratitude, but not to nonmoral emotions like fear, and are not driven by perceived differences in overall emotionality (Study 3). In Study 4, we demonstrate that agents who feel extremely high levels of inappropriate (false positive) guilt (e.g., agents who experience guilt but are not at all causally linked to the accident) are not perceived as having a better moral character, suggesting that merely feeling guilty is not sufficient to receive a boost in judgments of character. In Study 5, using a trust game design, we find that observers are more willing to trust others who experience false positive guilt compared to those who do not. In Study 6, we find that false positive experiences of guilt may actually be a reliable predictor of underlying moral character: self-reported predicted guilt in response to accidents negatively correlates with higher scores on a psychopathy scale.

From the General Discussion

It seems reasonable to think that there would be some benefit to communicating these moral emotions as a signal of character, and to being able to glean information about the character of others from observations of their emotional responses. If a propensity to feel guilt makes it more likely that a person is cooperative and trustworthy, observers would need to discriminate between people who are and are not prone to guilt. Guilt could therefore serve as an effective regulator of moral behavior in others in its role as a reliable signal of good character.  This account is consistent with theoretical accounts of emotional expressions more generally, either in the face, voice, or body, as a route by which observers make inferences about a person’s underlying dispositions (Frank, 1988). Our results suggest that false positive emotional responses specifically may provide an additional, and apparently informative, source of evidence for one’s propensity toward moral emotions and moral behavior.

Wednesday, July 7, 2021

“False positive” emotions, responsibility, and moral character

Anderson, R. A., et al.
Cognition, Volume 214, September 2021, 104770



General discussion

Collectively, our results support the hypothesis that false positive moral emotions are associated with both judgments of moral character and traits associated with moral character. We consistently found that observers use an agent's false positive experience of moral emotions (e.g., guilt, gratitude) to infer their underlying moral character, their social likability, and to predict both their future emotional responses and their future moral behavior. Specifically, we found that observers judge an agent who experienced “false positive” guilt (in response to an accidental harm) as a more moral person, more likeable, less likely to commit future moral infractions, and more trustworthy than an agent who experienced no guilt. Our results help explain the second “puzzle” regarding guilt for accidental actions (Kamtekar & Nichols, 2019). Specifically, one reason that observers may find an accidental agent less blameworthy, and yet still be wary if the agent does not feel guilt, is that such false positive guilt provides an important indicator of that agent's underlying character.

Wednesday, January 13, 2021

Physical Attractiveness in the Legal System

Rod Hollier
The Law Project

When I started looking into this subject, I predicted a person’s physical attractiveness would only have minor advantages. I was wrong.

In fact, I was so wrong that, in one study, the effect of physical attractiveness on judges was so strong that they fined unattractive criminals 304.88% more than attractive criminals.

Surprising, I know.

Before we proceed, I want to address a few concerns of mine. Firstly, the information that you will read may cause some readers to feel unsettled. This is not my intention. Yes, it is disheartening. But the purpose of this article is to inform lawyers and other decision makers so that they can use the attractiveness bias to their advantage or to counter it.

A second concern of mine is that I don’t want to over-emphasise the attractiveness bias. Judges and jurors are affected by all kinds of cognitive distortions, such as emotive evidence, time of day, remorse of the defendant, socioeconomic status, race, gender, anchoring effect, and the contrast bias.

In the first section of this article, I give a ‘straight-to-the-point’ summary of the research from 27 studies. Next, I go into greater depth on the attractiveness bias and its effects on judges, jurors, and lawyers. Lastly, I review research on the attractiveness bias in everyday life. Arguably, the last section is the most interesting.

Thursday, August 13, 2020

Personality and prosocial behavior: A theoretical framework and meta-analysis

Thielmann, I., Spadaro, G., & Balliet, D. (2020).
Psychological Bulletin, 146(1), 30–90.


Decades of research document individual differences in prosocial behavior using controlled experiments that model social interactions in situations of interdependence. However, theoretical and empirical integration of the vast literature on the predictive validity of personality traits to account for these individual differences is missing. Here, we present a theoretical framework that identifies 4 broad situational affordances across interdependent situations (i.e., exploitation, reciprocity, temporal conflict, and dependence under uncertainty) and more specific subaffordances within certain types of interdependent situations (e.g., possibility to increase equality in outcomes) that can determine when, which, and how personality traits should be expressed in prosocial behavior. To test this framework, we meta-analyzed 770 studies reporting on 3,523 effects of 8 broad and 43 narrow personality traits on prosocial behavior in interdependent situations modeled in 6 commonly studied economic games (Dictator Game, Ultimatum Game, Trust Game, Prisoner’s Dilemma, Public Goods Game, and Commons Dilemma). Overall, meta-analytic correlations ranged between −.18 ≤ ρ̂ ≤ .26, and most traits yielding a significant relation to prosocial behavior had conceptual links to the affordances provided in interdependent situations, most prominently the possibility for exploitation. Moreover, for several traits, correlations within games followed the predicted pattern derived from a theoretical analysis of affordances. On the level of traits, we found that narrow and broad traits alike can account for prosocial behavior, informing the bandwidth-fidelity problem. In sum, the meta-analysis provides a theoretical foundation that can guide future research on prosocial behavior and advance our understanding of individual differences in human prosociality.


Individual differences in prosocial behavior have consistently been documented over decades of research using economic games – and personality traits have been shown to account for such individual variation. The present meta-analysis offers an affordance-based theoretical framework that can illuminate which, when, and how personality traits relate to prosocial behavior across various interdependent situations. Specifically, the framework and meta-analysis identify a few situational affordances that form the basis for the expression of certain traits in prosocial behavior. In this regard, the meta-analysis also shows that no single trait can account for individual variation in prosocial behavior across the variety of interdependent situations that individuals may encounter in everyday social interactions. Rather, individual differences in prosocial behavior are best viewed as a result of traits being expressed in response to certain situational features that influence the affordances involved in interdependent situations. In conclusion, research on individual differences in prosocial behavior – and corresponding trait conceptualizations – should consider the affordances provided in interdependent situations to allow for a complete understanding of how personality can shape the many aspects of human prosociality.
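The “meta-analytic correlations” reported above are weighted averages of per-study effects. As a minimal illustration of how such pooling works (a textbook DerSimonian–Laird sketch, not the authors' multi-level or Bayesian models):

```python
import math

def pool_correlations(rs, ns):
    """Pool per-study correlations with a DerSimonian-Laird random-effects
    model on the Fisher z scale, then back-transform to r.

    rs: per-study correlation coefficients (|r| < 1)
    ns: per-study sample sizes (each n > 3); requires at least two studies
    """
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]  # Fisher z-transform
    vs = [1.0 / (n - 3) for n in ns]                      # within-study variances
    ws = [1.0 / v for v in vs]                            # fixed-effect weights

    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)

    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w * w for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)

    # random-effects weights and pooled estimate on the z scale
    ws_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)

    return (math.exp(2 * z_re) - 1) / (math.exp(2 * z_re) + 1)  # back to r
```

With identical inputs the pool reproduces the common value (two studies of n = 50 each reporting r = .20 pool to .20), and heterogeneous inputs pool to a value between the extremes.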

Friday, June 19, 2020

My Bedside Manner Got Worse During The Pandemic. Here's How I Improved

Shahdabul Faraz
Health Shots
Originally published 16 May 20

Here is an excerpt:

These gestures can be as simple as sitting in a veteran's room for an extra five minutes to listen to World War II stories. Or listening with a young cancer patient to a song by our shared favorite band. Or clutching a sick patient's shoulder and reassuring him that he will see his three daughters again.

These gestures acknowledge a patient's humanity. They give patients some semblance of normalcy in an otherwise difficult period in their lives. Selfishly, that human connection also helps us — the doctors, nurses and other health care providers — deal with the often frustrating nature of our stressful jobs.

Since the start of the pandemic, our bedside interactions have had to be radically different. Against our instincts, and in order to protect our patients and colleagues, we tend to spend only the necessary amount of time in our patients' rooms. And once inside, we try to keep some distance. I have stopped holding my patients' hands. I now try to minimize small talk. No more whimsical conversational detours.

Our interactions now are more direct and short. I have, more than once, felt guilty for how quickly I've left a patient's room. This guilt is worsened, knowing that patients in hospitals don't have family and friends with them now either. Doctors are supposed to be there for our patients, but it's become harder than ever in recent months.

I understand why these changes are needed. As I move through several hospital floors, I could unwittingly transmit the virus if I'm infected and don't know it. I'm relatively young and healthy, so if I get the disease, I will likely recover. But what about my patients? Some have compromised immune systems. Most are elderly and have more than one high-risk medical condition. I could never forgive myself if I gave one of my patients COVID-19.

The info is here.

Friday, April 3, 2020

Treating “Moral” Injuries

Anna Harwood-Gross
Scientific American
Originally posted 24 March 20

Here is an excerpt:

Though PTSD symptoms such as avoidance of reminders of the traumatic event and intrusive thought patterns may also be present in moral injury, they appear to serve different purposes, with PTSD sufferers avoiding fear and moral injury sufferers avoiding shame triggers. Few comparison studies of PTSD and moral injury exist, yet there has been research that indirectly compares the two conditions by differentiating between fear-based and non-fear-based (i.e., moral injury) forms of PTSD, which have been demonstrated to have different neurobiological markers. In the context of the military, there are countless examples of potentially morally injurious events (PMIEs), which can include killing or wounding others, engaging in retribution or disproportionate violence, or failing to save the life of a comrade, child or civilian. The experience of PMIEs has been demonstrated to lead to a larger range of psychological distress symptoms, including higher levels of guilt, anger, shame, depression and social isolation, than those seen in traditional PTSD profiles.

Guilt is difficult to address in therapy and often lingers following standardized PTSD treatment (that is, if the sufferer is able to access therapy). It may, in fact, be a factor in the more than 49 percent of veterans who drop out of evidence-based PTSD treatment or in why, at times, up to 72 percent of sufferers, despite meaningful improvement in their symptoms, do not actually recover enough after such treatment for their PTSD diagnosis to be removed. Most often, moral injury symptoms that are present in the clinic are addressed through traditional PTSD treatments, with thoughts of guilt and shame treated similarly to other distorted cognitions. When guilt and the events it relates to are treated as “a feeling and not a fact,” as psychologist Lisa Finlay put it in a 2015 paper, there is an attempt to lessen or relieve such emotions while taking a shortcut to avoid experiencing those that are legitimate and reasonable after wartime activities. Continuing, Finlay stated that “the idea that we might get good, as a profession, at talking people out of guilt following their involvement in traumatic incidents is frighteningly short-sighted in more ways than one.”

The info is here.

Thursday, March 26, 2020

Italian nurse with coronavirus dies by suicide over fear of infecting others

Yaron Steinbuch
Originally published 25 March 20

A 34-year-old Italian nurse working on the front lines of the coronavirus pandemic took her own life after testing positive for the illness, terrified that she had infected others, according to a report.

Daniela Trezzi had been suffering “heavy stress” amid fears she was spreading the deadly bug while treating patients at the San Gerardo Hospital in Monza in the hard-hit region of Lombardy, the Daily Mail reported.

She was working in the intensive care unit while under quarantine after being diagnosed with COVID-19, according to the UK news site.

The National Federation of Nurses of Italy expressed its “pain and dismay” over Trezzi’s death, which came as the country’s mounting death toll surged with 743 additional fatalities Tuesday.

“Each of us has chosen this profession for good and, unfortunately, also for bad: we are nurses,” the federation said.

The info is here.

Tuesday, March 10, 2020

Three Unresolved Issues in Human Morality

Jerome Kagan
Perspectives on Psychological Science
First Published March 28, 2018


This article discusses three major, but related, controversies surrounding the idea of morality. Is the complete pattern of features defining human morality unique to this species? How context dependent are moral beliefs and the emotions that often follow a violation of a moral standard? What developmental sequence establishes a moral code? This essay suggests that human morality rests on a combination of cognitive and emotional processes that are missing from the repertoires of other species. Second, the moral evaluation of every behavior, whether by self or others, depends on the agent, the action, the target of the behavior, and the context. The ontogeny of morality, which begins with processes that apes possess but adds language, inference, shame, and guilt, implies that humans are capable of experiencing blends of thoughts and feelings for which no semantic term exists. As a result, conclusions about a person’s moral emotions based only on questionnaires or interviews are limited to this evidence.

From the Summary

The human moral sense appears to contain some features not found in any other animal. The judgment of a behavior as moral or immoral, by self or community, depends on the agent, the action, and the setting. The development of a moral code involves changes in both cognitive and affective processes that are the result of maturation and experience. The ideas in this essay have pragmatic implications for psychological research. If most humans want others to regard them as moral agents, and, therefore, good persons, their answers to questionnaires or to interviewers as well as behaviors in laboratories will tend to conform to their understanding of what the examiner regards as the society’s values. That is why investigators should try to gather evidence on the behaviors that their participants exhibit in their usual settings.

The article is here.

Wednesday, December 4, 2019

Veterans Must Also Heal From Moral Injury After War

Camillo Mac Bica
Originally published Nov 11, 2019

Here are two excerpts:

Humankind has identified and internalized a set of values and norms through which we define ourselves as persons, structure our world and render our relationship to it — and to other human beings — comprehensible. These values and norms provide the parameters of our being: our moral identity. Consequently, we now have the need and the means to weigh concrete situations to determine acceptable (right) and unacceptable (wrong) behavior.

Whether an individual chooses to act rightly or wrongly, according to or in violation of her moral identity, will affect whether she perceives herself as true to her personal convictions and to others in the moral community who share her values and ideals. As the moral gravity of one’s actions and experiences on the battlefield becomes apparent, a warrior may suffer profound moral confusion and distress at having transgressed her moral foundations, her moral identity.

Guilt is, simply speaking, the awareness of having transgressed one’s moral convictions and the anxiety precipitated by a perceived breakdown of one’s ethical cohesion — one’s integrity — and an alienation from the moral community. Shame is the loss of self-esteem consequent to a failure to live up to personal and communal expectations.


Having completed the necessary philosophical and psychological groundwork, veterans can now begin the very difficult task of confronting the experience. That is, of remembering, reassessing and morally reevaluating their responsibility and culpability for their perceived transgressions on the battlefield.

Reassessing their behavior in combat within the parameters of their increased philosophical and psychological awareness, veterans realize that the programming to which they were subjected and the experience of war as a survival situation are causally connected to those specific battlefield incidents and behaviors, theirs and/or others’, that weigh heavily on their consciences — their moral injury. As a consequence, they understand these influences as extenuating circumstances.

Finally, as they morally reevaluate their actions in war, they see these incidents and behaviors in combat not as justifiable, but as understandable, perhaps even excusable, and their culpability mitigated by the fact that those who determined policy, sent them to war, issued the orders, and allowed the war to occur and/or to continue unchallenged must share responsibility for the crimes and horror that inevitably characterize war.

The info is here.

Saturday, June 1, 2019

Does It Matter Whether You or Your Brain Did It?

Uri Maoz, K. R. Sita, J. J. A. van Boxtel, and L. Mudrik
Front. Psychol., 30 April 2019


Despite progress in cognitive neuroscience, we are still far from understanding the relations between the brain and the conscious self. We previously suggested that some neuroscientific texts that attempt to clarify these relations may in fact make them more difficult to understand. Such texts—ranging from popular science to high-impact scientific publications—position the brain and the conscious self as two independent, interacting subjects, capable of possessing opposite psychological states. We termed such writing ‘Double Subject Fallacy’ (DSF). We further suggested that such DSF language, besides being conceptually confusing and reflecting dualistic intuitions, might affect people’s conceptions of moral responsibility, lessening the perception of guilt over actions. Here, we empirically investigated this proposition with a series of three experiments (a pilot and two preregistered replications). Subjects were presented with moral scenarios in which the defendant was either (1) clearly guilty, (2) ambiguous, or (3) clearly innocent, while the accompanying neuroscientific evidence about the defendant was presented using DSF or non-DSF language. Subjects were instructed to rate the defendant’s guilt in all experiments. As expected, subjects rated the defendant in the clearly guilty scenario as guiltier than in the two other scenarios, and the defendant in the ambiguous scenario as guiltier than in the innocent scenario. In Experiment 1 (N = 609), an effect was further found for DSF language in the expected direction: subjects rated the defendant less guilty when the neuroscientific evidence was described using DSF language, across all levels of culpability. However, this effect did not replicate in Experiment 2 (N = 1794), which focused on a different moral scenario, nor in Experiment 3 (N = 1810), which was an exact replication of Experiment 1. Bayesian analyses yielded strong evidence against the existence of an effect of DSF language on the perception of guilt. Our results thus challenge the claim that DSF language affects subjects’ moral judgments. They further demonstrate the importance of good scientific practice, including preregistration and—most critically—replication, to avoid reaching erroneous conclusions based on false-positive results.

Monday, May 6, 2019

How do we make moral decisions?

Dartmouth College
Press Release
Originally released April 18, 2019

When it comes to making moral decisions, we often think of the golden rule: do unto others as you would have them do unto you. Yet, why we make such decisions has been widely debated. Are we motivated by feelings of guilt, where we don't want to feel bad for letting the other person down? Or by fairness, where we want to avoid unequal outcomes? Some people may rely on principles of both guilt and fairness and may switch their moral rule depending on the circumstances, according to a Radboud University - Dartmouth College study on moral decision-making and cooperation. The findings challenge prior research in economics, psychology and neuroscience, which is often based on the premise that people are motivated by one moral principle, which remains constant over time. The study was published recently in Nature Communications.

"Our study demonstrates that with moral behavior, people may not in fact always stick to the golden rule. While most people tend to exhibit some concern for others, others may demonstrate what we have called 'moral opportunism,' where they still want to look moral but want to maximize their own benefit," said lead author Jeroen van Baar, a postdoctoral research associate in the department of cognitive, linguistic and psychological sciences at Brown University, who started this research when he was a scholar at Dartmouth visiting from the Donders Institute for Brain, Cognition and Behavior at Radboud University.

"In everyday life, we may not notice that our morals are context-dependent since our contexts tend to stay the same daily. However, under new circumstances, we may find that the moral rules we thought we'd always follow are actually quite malleable," explained co-author Luke J. Chang, an assistant professor of psychological and brain sciences and director of the Computational Social Affective Neuroscience Laboratory (Cosan Lab) at Dartmouth. "This has tremendous ramifications if one considers how our moral behavior could change under new contexts, such as during war," he added.

The info is here.

The research is here.

Friday, November 16, 2018

Motivated misremembering: Selfish decisions are more generous in hindsight

Ryan Carlson, Michel Marechal, Bastiaan Oud, Ernst Fehr, & Molly Crockett
Posted July 22, 2018


People often prioritize their own interests, but also like to see themselves as moral. How do individuals resolve this tension? One way to both maximize self-interest and maintain a moral self-image is to misremember the extent of one’s selfishness. Here, we tested this possibility. Across three experiments, participants decided how to split money with anonymous partners, and were later asked to recall their decisions. Participants systematically recalled being more generous in the past than they actually were, even when they were incentivized to recall accurately. Crucially, this effect was driven by individuals who gave less than what they personally believed was fair, independent of how objectively selfish they were. Our findings suggest that when people’s actions fall short of their own personal standards, they may misremember the extent of their selfishness, thereby warding off negative emotions and threats to their moral self-image.

Significance statement

Fairness is widely endorsed in human societies, but less often practiced. Here we demonstrate how memory distortions may contribute to this discrepancy. Across three experiments (N = 1005), we find that people consistently remember being more generous in the past than they actually were. We show that this effect occurs specifically for individuals whose decisions fell below their own fairness standards, irrespective of how high or low those standards were. These findings suggest that when people perceive their own actions as selfish, they can remember having acted more equitably, thus minimizing guilt and preserving their self-image.

The research is here.

Monday, July 9, 2018

Learning from moral failure

Matthew Cashman & Fiery Cushman
In press: Becoming Someone New: Essays on Transformative Experience, Choice, and Change


Pedagogical environments are often designed to minimize the chance of people acting wrongly; surely this is a sensible approach. But could it ever be useful to design pedagogical environments to permit, or even encourage, moral failure? If so, what are the circumstances where moral failure can be beneficial?  What types of moral failure are helpful for learning, and by what mechanisms? We consider the possibility that moral failure can be an especially effective tool in fostering learning. We also consider the obvious costs and potential risks of allowing or fostering moral failure. We conclude by suggesting research directions that would help to establish whether, when and how moral pedagogy might be facilitated by letting students learn from moral failure.



Errors are an important source of learning, and educators often exploit this fact. Falling helps to tune our sense of balance; Newtonian mechanics sticks better when we witness the failure of our folk physics. We consider the possibility that moral failure may also prompt especially strong or distinctive forms of learning. First, and with greatest certainty, humans are designed to learn from moral failure through the feeling of guilt. Second, and more speculatively, humans may be designed to experience moral failures by “testing limits” in a way that ultimately fosters an adaptive moral character. Third—and highly speculatively—there may be ways to harness learning by moral failure in pedagogical contexts. Minimally, this might occur by imagination, observational learning, or the exploitation of spontaneous wrongful acts as “teachable moments”.

The book chapter is here.

Thursday, April 5, 2018

Moral Injury and Religiosity in US Veterans With Posttraumatic Stress Disorder Symptoms

Harold Koenig and others
The Journal of Nervous and Mental Disease: February 28, 2018


Moral injury (MI) involves feelings of shame, grief, meaninglessness, and remorse from having violated core moral beliefs related to traumatic experiences. This multisite cross-sectional study examined the association between religious involvement (RI) and MI symptoms, mediators of the relationship, and the modifying effects of posttraumatic stress disorder (PTSD) severity in 373 US veterans with PTSD symptoms who served in a combat theater. Demographic, military, religious, physical, social, behavioral, and psychological characteristics were assessed using standard measures of RI, MI symptoms, PTSD, depression, and anxiety. MI was widespread, with over 90% reporting high levels of at least one MI symptom and the majority reporting five or more symptoms. In the overall sample, religiosity was inversely related to MI in bivariate analyses (r = −0.25, p < 0.0001) and multivariate analyses (B = −0.40, p = 0.001); however, this relationship was present only among veterans with severe PTSD (B = −0.65, p = 0.0003). These findings have relevance for the care of veterans with PTSD.
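The key finding here is a moderation effect: religiosity predicts moral injury only when PTSD is severe, which in regression terms is a significant interaction between the two predictors. As a minimal sketch of what such an analysis looks like, the snippet below fits an interaction model to simulated data built to mimic the reported pattern; the variable names and simulated values are illustrative assumptions, not the study's data.

```python
# Illustrative only: simulated data mimicking the reported moderation,
# where religiosity (RI) relates inversely to moral injury (MI) symptoms
# only in the severe-PTSD subgroup. All names and values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 373                                  # sample size matching the study
ri = rng.normal(0, 1, n)                 # standardized religiosity score
severe = rng.integers(0, 2, n)           # 1 = severe PTSD, 0 = otherwise
# MI depends on RI only when severe == 1, with slope -0.65 (loosely
# mirroring the reported B = -0.65 among severe cases), plus noise.
mi = 5.0 - 0.65 * ri * severe + rng.normal(0, 1, n)

# Fit MI ~ intercept + RI + severe + RI:severe by ordinary least squares.
X = np.column_stack([np.ones(n), ri, severe, ri * severe])
beta, *_ = np.linalg.lstsq(X, mi, rcond=None)
print(beta)  # beta[3], the interaction coefficient, should be clearly negative
```

In a model like this, a near-zero coefficient on `ri` alongside a clearly negative coefficient on the `ri * severe` interaction is the signature of the pattern the abstract describes: no overall effect of religiosity except among the severe-PTSD subgroup.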

The paper is here.

Friday, May 26, 2017

What is moral injury in veterans?

Holly Arrow and William Schumacher
The Conversation
Originally posted May 21, 2017

Here is an excerpt:

The moral conflict created by the violations of “what’s right” generates moral injury when the inability to reconcile wartime actions with a personal moral code creates lasting psychological consequences.

Psychiatrist Jonathan Shay, in his work with Vietnam veterans, defined moral injury as the psychological, social and physiological results of a betrayal of “what’s right” by an authority in a high-stakes situation. In “Achilles In Vietnam,” a book that examines the psychological devastation of war, a Vietnam veteran described a situation in which his commanding officers used tear gas on a village after the veteran and his unit had their gas masks rendered ineffective due to water damage. The veteran stated, “They gassed us almost to death.” This type of “friendly fire” incident is morally wounding in a way that attacks by an enemy are not.

Psychologist Brett Litz and his colleagues expanded this to include self-betrayal and identified “perpetrating, failing to prevent, bearing witness to, or learning about acts that transgress deeply held moral beliefs and expectations” as the cause of moral injury.

Guilt and moral injury

A research study published in 1991 identified combat-related guilt as the best predictor of suicide attempts among a sample of Vietnam veterans with PTSD. Details of the veterans’ experiences connected that guilt to morally injurious events.

The article is here.