Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care
Showing posts with label Moral Emotions. Show all posts

Wednesday, July 12, 2017

Emotion shapes the diffusion of moralized content in social networks

William J. Brady, Julian A. Wills, John T. Jost, Joshua A. Tucker, and Jay J. Van Bavel
PNAS 2017; published ahead of print June 26, 2017

Abstract

Political debate concerning moralized issues is increasingly common in online social networks. However, moral psychology has yet to incorporate the study of social networks to investigate processes by which some moral ideas spread more rapidly or broadly than others. Here, we show that the expression of moral emotion is key for the spread of moral and political ideas in online social networks, a process we call “moral contagion.” Using a large sample of social media communications about three polarizing moral/political issues (n = 563,312), we observed that the presence of moral-emotional words in messages increased their diffusion by a factor of 20% for each additional word. Furthermore, we found that moral contagion was bounded by group membership; moral-emotional language increased diffusion more strongly within liberal and conservative networks, and less between them. Our results highlight the importance of emotion in the social transmission of moral ideas and also demonstrate the utility of social network methods for studying morality. These findings offer insights into how people are exposed to moral and political ideas through social networks, thus expanding models of social influence and group polarization as people become increasingly immersed in social media networks.

The research is here.

Monday, June 19, 2017

The behavioral and neural basis of empathic blame

Indrajeet Patil, Marta Calò, Federico Fornasier, Fiery Cushman, Giorgia Silani
Forthcoming in Scientific Reports

Abstract

Mature moral judgments rely both on a perpetrator’s intent to cause harm, and also on the actual harm caused—even when unintended. Much prior research asks how intent information is represented neurally, but little asks how even unintended harms influence judgment. We interrogate the psychological and neural basis of this process, focusing especially on the role of empathy for the victim of a harmful act. Using fMRI, we found that the ‘empathy for pain’ network was involved in encoding harmful outcomes and integrating harmfulness information for different types of moral judgments, and individual differences in the extent to which this network was active during encoding and integration of harmfulness information determined the severity of moral judgments. Additionally, activity in the network was down-regulated for acceptability, but not blame, judgments in the accidental harm condition, suggesting that these two types of moral evaluations are neurobiologically dissociable. These results support a model of “empathic blame”, whereby the perceived suffering of a victim colors moral judgment of an accidental harmdoer.

The paper is here.

Sunday, February 26, 2017

The Disunity of Morality

Walter Sinnott-Armstrong
In Moral Brains: The Neuroscience of Morality

Here is an excerpt:

What Is the Issue?

The question is basically whether morality is like memory. Once upon a time, philosophers and psychologists believed that memory is monolithic. Now memory is understood as a group of distinct phenomena that need to be studied separately (Tulving 2000). Memory includes not only semantic or declarative memory, such as remembering that a bat is a mammal, but also episodic memory, such as remembering seeing a bat yesterday. Memory can also be long-term or short-term (or working), and procedural memory includes remembering how to do things, such as how to ride a bike.

Thus, there are many kinds of memory, and they are not unified by any common and distinctive feature. They are not even all about the past, since you can also remember timeless truths, such as that pi is 3.14159 …, and you can also remember that you have a meeting tomorrow, even if you do not remember setting up the meeting or even who set it up. These kinds of memory differ not only in their psychological profiles and functions but also in their neural basis, as shown by both fMRI and by patients, such as H. M., whose brain lesions left him with severely impaired episodic memory but largely intact procedural and semantic memory. Such findings led most experts to accept that memory is not unified.

This recognition enabled progress. Neuroscientists could never find a neural basis for memory as such while they lumped together all kinds of memory. Psychologists could never formulate reliable generalizations about memory as long as they failed to distinguish kinds of memories. And philosophers could never settle how memory is justified if they conflated remembering facts and remembering how to ride a bicycle. Although these problems remain hard, progress became easier after recognizing that memory is not a single natural kind.

My thesis is that morality is like memory. Neither of them is unified, and admitting disunity makes progress possible in both areas. Moral neuroscience, psychology, and philosophy will become much more precise and productive if they give up the assumption that moral judgments all share a distinctive essence.

The book chapter is here.

Thursday, September 8, 2016

How Emotions Shape Moral Behavior: Some Answers (and Questions) for the Field of Moral Psychology

Teper R., Zhong C.-B., and Inzlicht M.
Social and Personality Psychology Compass (2015), 9, 1–14

Abstract

Within the past decade, the field of moral psychology has begun to disentangle the mechanics behind moral judgments, revealing the vital role that emotions play in driving these processes. However, given the well-documented dissociation between attitudes and behaviors, we propose that an equally important issue is how emotions inform actual moral behavior – a question that has been relatively ignored up until recently. By providing a review of recent studies that have begun to explore how emotions drive actual moral behavior, we propose that emotions are instrumental in fueling real-life moral actions. Because research examining the role of emotional processes on moral behavior is currently limited, we push for the use of behavioral measures in the field in the hopes of building a more complete theory of real-life moral behavior.

The article is here.

Monday, August 22, 2016

Rationalizing our Way into Moral Progress

Jesse S. Summers
Ethical Theory and Moral Practice, 1–12 (forthcoming)

Research suggests that the explicit reasoning we offer to ourselves and to others is often rationalization: we act instead on instincts, inclinations, stereotypes, emotions, neurobiology, habits, reactions, evolutionary pressures, unexamined principles, or justifications other than the ones we think we’re acting on, and then tell a post hoc story to justify our actions. This is troubling for views of moral progress according to which moral progress proceeds from our engagement with our own and others’ reasons. I consider an account of rationalization, based on Robert Audi’s, to make clear that rationalization, unlike simple lying, can be sincere. Because it can be sincere, and because we also have a desire to be consistent with ourselves, I argue that rationalization sets us up for becoming better people over time, and that a similar case can be made to explain how moral progress among groups of people can proceed via rationalization.

Tuesday, July 26, 2016

How Large Is the Role of Emotion in Judgments of Moral Dilemmas?

Horne Z, Powell D (2016)
PLoS ONE 11(7): e0154780.
doi: 10.1371/journal.pone.0154780

Abstract

Moral dilemmas often pose dramatic and gut-wrenching emotional choices. It is now widely accepted that emotions are not simply experienced alongside people’s judgments about moral dilemmas, but that our affective processes play a central role in determining those judgments. However, much of the evidence purporting to demonstrate the connection between people’s emotional responses and their judgments about moral dilemmas has recently been called into question. In the present studies, we reexamined the role of emotion in people’s judgments about moral dilemmas using a validated self-report measure of emotion. We measured participants’ specific emotional responses to moral dilemmas and, although we found that moral dilemmas evoked strong emotional responses, we found that these responses were only weakly correlated with participants’ moral judgments. We argue that the purportedly strong connection between emotion and judgments of moral dilemmas may have been overestimated.

The article is here.

Sunday, July 3, 2016

Disgust made us human

By Kathleen McAuliffe
Aeon
Originally posted June 6, 2016

Here are two excerpts:

If you’re skeptical that parasites have any bearing on your principles, consider this: our values actually change when there are infectious agents in our vicinity. In an experiment by Simone Schnall, a social psychologist at the University of Cambridge, students were asked to ponder morally questionable behaviour such as lying on a résumé, not returning a stolen wallet or, far more fraught, turning to cannibalism to survive a plane crash. Subjects seated at desks with food stains and chewed-up pens typically judged these transgressions as more egregious than students at spotless desks. Numerous other studies – using, unbeknown to the participants, imaginative disgust elicitors such as fart spray or the scent of vomit – have reported similar findings. Premarital sex, bribery, pornography, unethical journalism, marriage between first cousins: all become more reprehensible when subjects were disgusted.

(cut)

From this point in human social development, it took a bit more rejiggering of the same circuitry to bring our species to a momentous place: we became disgusted by people who behaved immorally. This development, Curtis argues, is central to understanding how we became an extraordinarily social and cooperative species, capable of putting our minds together to solve problems, create new inventions, exploit natural resources with unprecedented efficiency and, ultimately, lay the foundations for civilisation.

The article is here.

Editor's note: If you can make it past the dog rape example at the beginning of the article, it is a thought-provoking piece. Go to the "comments" section to see what readers have to say about that example.

Sunday, May 22, 2016

Is Deontology a Moral Confabulation?

Emilian Mihailov
Neuroethics
April 2016, Volume 9, Issue 1, pp 1-13

Abstract

Joshua Greene has put forward the bold empirical hypothesis that deontology is a confabulation of moral emotions. Deontological philosophy does not stem from "true" moral reasoning, but from emotional reactions, backed up by post hoc rationalizations which play no role in generating the initial moral beliefs. In this paper, I will argue against the confabulation hypothesis. First, I will highlight several points in Greene’s discussion of confabulation, and identify two possible models. Then, I will argue that the evidence does not illustrate the relevant model of deontological confabulation. In fact, I will make the case that deontology is unlikely to be a confabulation because alarm-like emotions, which allegedly drive deontological theorizing, are resistant to confabulation. I will end by clarifying what kinds of claims the confabulation data can support. The upshot of the final section is that confabulation data cannot be used to undermine deontological theory in itself, and ironically, if one commits to the claim that a deontological justification is a confabulation in a particular case, then the data suggest that in general deontology has a prima facie validity.

The article is here.

Friday, April 22, 2016

Review: Eric Fair’s ‘Consequence,’ a Memoir by a Former Abu Ghraib Interrogator

By Michiko Kakutani
New York Times Book Review
Originally published April 4, 2016

Here is an excerpt:

Of the Abu Ghraib torture photos broadcast by “60 Minutes” in April 2004, Mr. Fair writes: “Some of the activities in the photographs are familiar to me. Others are not. But I am not shocked. Neither is anyone else who served at Abu Ghraib. Instead, we are shocked by the performance of the men who stand behind microphones and say things like ‘bad apples’ and ‘Animal House on the night shift.’”

In 2007, Mr. Fair says, he confessed everything to a lawyer from the Department of Justice and two agents from the Army’s Criminal Investigation Command, providing pictures, letters, names, firsthand accounts, locations and techniques. He was not prosecuted. “We tortured people the right way,” he writes, “following the right procedures, and used the approved techniques.”

Mr. Fair, however, became increasingly racked by guilt. He begins having nightmares. Nightmares in which “someone I know begins to shrink,” becoming so small “they slip through my fingers and disappear onto the floor.” Nightmares in which “there’s a large pool of blood on the floor” that moves as if it’s alive, nipping at his feet.

The book review is here.

Thursday, February 25, 2016

Empathy is a moral force

Jamil Zaki
FORTHCOMING in Gray, K. & Graham, J. (Eds.), The Atlas of Moral Psychology

Here is an excerpt:

More recently, however, a growing countercurrent has questioned the utility of empathy in driving moral action. This argument builds on the broader idea that emotions provide powerful but noisy inputs to people’s moral calculus (Haidt, 2001). Affective reactions often tempt people to make judgments that are logically and morally indefensible. Such emotional static famously includes moral dumbfounding, under which people’s experience of disgust causes them to judge others’ actions as wrong when they have no rational basis for doing so (Cushman, Young, & Hauser, 2006). Emotion drives other irrational moral judgments, such as people’s tendency to privilege physical force (a “hot” factor) over more important dimensions such as harm when judging the moral status of an action (Greene, 2014; Greene et al., 2009). Even incidental, morally irrelevant feelings alter moral judgment, further damaging the credibility of emotion in guiding a sense of right and wrong (Wheatley & Haidt, 2005).

In sum, although emotions play a powerful role in moral judgment, they need not play a useful role. Instead, capricious emotion-driven intuitions often attract people towards internally inconsistent and wrong-headed judgments. From a utilitarian perspective aimed at maximizing well being, these biases render emotion a fundamentally mistaken moral engine (cf. Greene, 2014).

Does this criticism apply to empathy? In many ways, it does. Like other affective states, empathy arises in response to evocative experiences, often in noisy ways that hamper objectivity. For instance, people experience more empathy, and thus moral obligation to help, in response to the visible suffering of others, as in the case of Baby Jessica described above. This empathy leads people to donate huge sums of money to help individuals whose stories they read about or see on television, while ignoring widespread misery that they could more efficaciously relieve (Genevsky, Västfjäll, Slovic, & Knutson, 2013; Slovic, 2007; Small & Loewenstein, 2003). Empathy also collapses reliably when sufferers and would-be empathizers differ along dimensions of race, politics, age, or even meaningless de novo group assignments (Cikara, Bruneau, & Saxe, 2011; Zaki & Cikara, in press).

The chapter is here.

Wednesday, January 27, 2016

A cultural look at moral purity: wiping the face clean

Lee SWS, Tang H, Wan J, Mai X and Liu C
Front. Psychol. (2015) 6:577.
doi: 10.3389/fpsyg.2015.00577

Abstract

Morality is associated with bodily purity in the custom of many societies. Does that imply moral purity is a universal psychological phenomenon? Empirically, it has never been examined, as all prior experimental data came from Western samples. Theoretically, we suggest the answer is not so straightforward—it depends on the kind of universality under consideration. Combining perspectives from cultural psychology and embodiment, we predict a culture-specific form of moral purification. Specifically, given East Asians' emphasis on the face as a representation of public self-image, we hypothesize that facial purification should have particularly potent moral effects in a face culture. Data show that face-cleaning (but not hands-cleaning) reduces guilt and regret most effectively against a salient East Asian cultural background. It frees East Asians from guilt-driven prosocial behavior. In the wake of their immorality, they find a face-cleaning product especially appealing and spontaneously choose to wipe their face clean. These patterns highlight both culturally variable and universal aspects of moral purification. They further suggest an organizing principle that informs the vigorous debate between embodied and amodal perspectives.

The article is here.

Wednesday, November 25, 2015

The ‘blame and shame society’

Jean Knox
Psychoanalytic Psychotherapy
Volume 28, Issue 3, 2014

Abstract

In this opinion piece, I explore some of the social and cultural factors that contribute to the creation of feelings of shame in those members of society who are vulnerable or disadvantaged in various ways. I suggest that a ‘blame and shame’ attitude has become pervasive in today's political culture, reassuring the comfortable and privileged that they deserve their own success and allowing them to blame the disadvantaged for their own misfortune. Those who feel that they must become invulnerable in order to succeed therefore project their own vulnerable child onto the vulnerable in our society and attack and condemn in others what they most fear in themselves.

Introduction

One of the most intractable problems all therapists encounter is shame – the persistent negative sense of self that is evident when patients persist in describing themselves as disgusting, bad, dirty and all the other words of self-loathing which reflect a deeply painful self-hatred that the person clings to in spite of all attempts to shift it. These feelings are often accompanied by self-harm of various kinds – repeated cutting or overdosing, alcohol or drug abuse, eating disorders and by difficulty in affect regulation, mentalisation, attachment and sexuality.

An understanding of the unique personal relationships that have contributed to this kind of self-disgust and shame is vital if psychotherapists are to help their patients as effectively as possible. Herman (1992) first identified this as one key part of complex PTSD, suggesting that it arises from chronic developmental trauma.

The entire article is here.

Monday, August 3, 2015

Empathy Is Actually a Choice

By Daryl Cameron, Michael Inzlicht and William A. Cunningham
The New York Times - Gray Matter
Originally published July 10, 2015

ONE death is a tragedy. One million is a statistic.

You’ve probably heard this saying before. It is thought to capture an unfortunate truth about empathy: While a single crying child or injured puppy tugs at our heartstrings, large numbers of suffering people, as in epidemics, earthquakes and genocides, do not inspire a comparable reaction.

Studies have repeatedly confirmed this. It’s a troubling finding because, as recent research has demonstrated, many of us believe that if more lives are at stake, we will — and should — feel more empathy (i.e., vicariously share others’ experiences) and do more to help.

Not only does empathy seem to fail when it is needed most, but it also appears to play favorites. Recent studies have shown that our empathy is dampened or constrained when it comes to people of different races, nationalities or creeds. These results suggest that empathy is a limited resource, like a fossil fuel, which we cannot extend indefinitely or to everyone.

The entire article is here.

Monday, July 13, 2015

Chimpanzees can tell right from wrong

By Richard Gray
Daily Mail Online
Originally published June 26, 2015

They are our closest relatives in the animal kingdom, capable of using tools and solving problems much like their human cousins, but it appears chimpanzees also share our sense of morality.

A new study of the apes reacting to an infant chimp being killed by another group has shown the animals have a strong sense of right and wrong.

The researchers found chimpanzees reacted to videos showing the violent scenes in a similar way to humans.

The entire article is here.

Friday, April 17, 2015

We all feel disgust but why do some of us turn it on ourselves?

By Jane Simpson and Phillip Powell
The Conversation
Originally posted March 27, 2015

Here is an excerpt:

Self-disgust differs from other negative feelings that people have about themselves in a number of ways. While self-disgust is likely to happen alongside other self-directed issues such as shame, unique features include feelings of revulsion, for example when looking in the mirror, contamination and magical rather than reasoned thinking. These, taken with other characteristics, such as its particular cognitive-affective content, suggest an emotional experience that is different to shame (related to hierarchical submission and diminished social rank).

Disgust is not about just “not liking” aspects of yourself – the depth of the emotion can mean you can’t even look at yourself without being overwhelmed with revulsion. The feeling that you are disgusting also means that you are potentially toxic to others – so people can become isolated as they do not wish to “infect” and “contaminate” others with their own perceived “disgustingness”.

The entire post is here.

Editor's Note: This article pertains to psychotherapy with trauma, personality disorders, and eating disorders.

Wednesday, February 18, 2015

Moral Judgment as a Natural Kind

By Victor Kumar
Forthcoming in Philosophical Studies

Moral judgments seem to be different from other normative judgments, even apart from their characteristic subject matter. Two people might both disapprove of an action, for example, although one judges it a moral violation and the other a breach of etiquette. Philosophers have traditionally attempted to define moral judgment through reflection alone. However, psychological research on the “moral/conventional distinction” offers a promising source of empirical evidence about the distinctive nature of moral judgment.

Several authors treat the ability to draw a distinction between morality and convention as a test for the presence of moral judgments (Blair 1995; Nichols 2004a; Prinz 2007; Levy 2007). None, however, develops the implied theory of moral judgment.

The entire article is here.

Thursday, February 12, 2015

Dimensions of Moral Emotions

By Kurt Gray and Daniel M. Wegner
Emotion Review Vol. 3, No. 3 (July 2011) 258–260

Abstract

Anger, disgust, elevation, sympathy, relief. If the subjective experience of each of these emotions is the same whether elicited by moral or nonmoral events, then what makes moral emotions unique? We suggest that the configuration of moral emotions is special—a configuration given by the underlying structure of morality. Research suggests that people divide the moral world along the two dimensions of valence (help/harm) and moral type (agent/patient). The intersection of these two dimensions gives four moral exemplars—heroes, villains, victims and beneficiaries—each of which elicits unique emotions. For example, victims (harm/patient) elicit sympathy and sadness. Dividing moral emotions into these four quadrants provides predictions about which emotions reinforce, oppose and complement each other.

The entire article is here.

Monday, February 9, 2015

Emotion and Morality: A Tasting Menu

By Joshua Greene
Emotion Review Vol. 3, No. 3 (2011) 1–3

In recent years, moral psychology has undergone a renaissance characterized by two dramatic changes (Haidt, 2007). First, the scientific study of morality has become a broad, interdisciplinary enterprise, drawing on insights and methods from philosophy, neuroscience, economics, anthropology, biology, and all quarters of psychology. Second, emotion now plays a central role in moral psychology research. This special section on Emotion and Morality is a testament to the ingenuity, open-mindedness, and energy that have infused this field.

The entire article is here.

Sunday, January 11, 2015

Why I am not Charlie

By Scott Long
A Paper Bird Blog
Originally posted January 9, 2015

Here is an excerpt:

It’s true, as Salman Rushdie says, that “Nobody has the right to not be offended.” You should not get to invoke the law to censor or shut down speech just because it insults you or strikes at your pet convictions. You certainly don’t get to kill because you heard something you don’t like. Yet, manhandled by these moments of mass outrage, this truism also morphs into a different kind of claim: That nobody has the right to be offended at all.

I am offended when those already oppressed in a society are deliberately insulted. I don’t want to participate. This crime in Paris does not suspend my political or ethical judgment, or persuade me that scatologically smearing a marginal minority’s identity and beliefs is a reasonable thing to do. Yet this means rejecting the only authorized reaction to the atrocity. Oddly, this peer pressure seems to gear up exclusively where Islam’s involved. When a racist bombed a chapter of a US civil rights organization this week, the media didn’t insist I give to the NAACP in solidarity. When a rabid Islamophobic rightist killed 77 Norwegians in 2011, most of them at a political party’s youth camp, I didn’t notice many #IAmNorway hashtags, or impassioned calls to join the Norwegian Labor Party. But Islam is there for us, it unites us against Islam. Only cowards or traitors turn down membership in the Charlie club. The demand to join, endorse, agree is all about crowding us into a herd where no one is permitted to cavil or condemn: an indifferent mob, where differing from one another is Thoughtcrime, while indifference to the pain of others beyond the pale is compulsory.

The entire blog post is here.

Editor's note: This is a long and interesting piece on emotional reactions to tragic and traumatic events.

Wednesday, December 3, 2014

Moral Psychology as Accountability

By Brendan Dill and Stephen Darwall
In Justin D’Arms & Daniel Jacobson (eds.), Moral Psychology and Human Agency: Philosophical Essays on the Science of Ethics (pp. 40–83). Oxford University Press. Pre-publication draft; for citation or quotation, please refer to the published volume.

Introduction

When moral psychology exploded a decade ago with groundbreaking research, there was considerable excitement about the potential fruits of collaboration between moral philosophers and moral psychologists. However, this enthusiasm soon gave way to controversy about whether either field was, or even could be, relevant to the other (e.g., Greene 2007; Berker 2009). After all, it seems at first glance that the primary question researched by moral psychologists—how people form judgments about what is morally right and wrong—is independent from the parallel question investigated by moral philosophers—what is in fact morally right and wrong, and why.

Once we transcend the narrow bounds of quandary ethics and “trolleyology,” however, a broader look at the fields of moral psychology and moral philosophy reveals several common interests. Moral philosophers strive not only to determine what actions are morally right and wrong, but also to understand our moral concepts, practices, and  psychology. They ask what it means to be morally right, wrong, or obligatory: what distinguishes moral principles from other norms of action, such as those of instrumental rationality, prudence, excellence, or etiquette (Anscombe 1958; Williams 1985; Gibbard 1990; Annas 1995)? Moral psychologists pursue this very question in research on the distinction between moral and conventional rules (Turiel 1983; Nichols 2002; Kelly et al. 2007; Royzman, Leeman, and Baron 2009) and in attempts to define the moral domain (e.g., Haidt and Kesebir 2010).

The entire paper is here.