Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Tuesday, September 26, 2017

The Influence of War on Moral Judgments about Harm

Hanne M Watkins and Simon M Laham
Preprint

Abstract

How does war influence moral judgments about harm? While the general rule is “thou shalt not kill,” war appears to provide an unfortunately common exception to the moral prohibition on intentional harm. In three studies (N = 263, N = 557, N = 793), we quantify the difference in moral judgments across peace and war contexts, and explore two possible explanations for the difference. Taken together, the findings of the present studies have implications for moral psychology researchers who use war-based scenarios to study broader cognitive or affective processes. If the war context changes judgments of moral scenarios by triggering group-based reasoning or altering the perceived structure of the moral event, using such scenarios to make “decontextualized” claims about moral judgment may not be warranted.

Here is part of the discussion.

A number of researchers have begun to investigate how social contexts may influence moral judgment, whether those social contexts are grounded in groups (Carnes et al., 2015; Ellemers & van den Bos, 2009) or relationships (Fiske & Rai, 2014; Simpson, Laham, & Fiske, 2015). The war context is another specific context which influences moral judgments: in the present study we found that the intergroup nature of war influenced people’s moral judgments about harm in war – even if they belonged to neither of the two groups actually at war – and that the usually robust difference between switch and footbridge scenarios was attenuated in the war context. One implication of these findings is that some caution may be warranted when using war-based scenarios for studying morality in general. As mentioned in the introduction, scenarios set in war are often used in the study of broad domains or general processes of judgment (e.g. Graham et al., 2009; Phillips & Young, 2011; Piazza et al., 2013). Given the interaction of war context with intergroup considerations and with the construed structure of the moral event in the present studies, researchers are well advised to avoid making generalizations to morality writ large on the basis of war-related scenarios (see also Bauman, McGraw, Bartels, & Warren, 2014; Bloom, 2011).

The preprint is here.

Friday, July 21, 2017

Judgment Before Emotion: People Access Moral Evaluations Faster than Affective States

Corey Cusimano, Stuti Thapa Magar, & Bertram F. Malle

Abstract

Theories about the role of emotions in moral cognition make different predictions about the relative speed of moral and affective judgments: those that argue that felt emotions are causal inputs to moral judgments predict that recognition of affective states should precede moral judgments; theories that posit emotional states as the output of moral judgment predict the opposite. Across four studies, using a speeded reaction time task, we found that self-reports of felt emotion were delayed relative to reports of event-directed moral judgments (e.g. badness) and were no faster than person-directed moral judgments (e.g. blame). These results pose a challenge to prominent theories arguing that moral judgments are made on the basis of reflecting on affective states.

The article is here.

Tuesday, April 25, 2017

Can Robots Be Ethical?

Robert Newman
Philosophy Now
Apr/May 2017 Issue 119

Here is an excerpt:

Delegating ethics to robots is unethical not just because robots do binary code, not ethics, but also because no program could ever process the incalculable contingencies, shifting subtleties, and complexities entailed in even the simplest case to be put before a judge and jury. And yet the law is another candidate for outsourcing, to ‘ethical’ robot lawyers. Last year, during a BBC Radio 4 puff-piece on the wonders of robotics, a senior IBM executive explained that while robots can’t do the fiddly manual jobs of gardeners or janitors, they can easily do all that lawyers do, and will soon make human lawyers redundant. However, when IBM Vice President Bob Moffat was himself on trial in the Manhattan Federal Court, accused of the largest hedge fund insider-trading in history, he inexplicably reposed all his hopes in one of those old-time human defence attorneys. A robot lawyer may have saved him from being found guilty of two counts of conspiracy and fraud, but when push came to shove, the IBM VP knew as well as the rest of us that the phrase ‘ethical robots’ is a contradiction in terms.

The article is here.

Friday, March 31, 2017

Signaling Emotion and Reason in Cooperation

Emma Edelman Levine, Alixandra Barasch, David G. Rand, Jonathan Z. Berman, & Deborah A. Small (February 23, 2017)

Abstract

We explore the signal value of emotion and reason in human cooperation. Across four experiments utilizing dyadic prisoner dilemma games, we establish three central results. First, individuals believe that a reliance on emotion signals that one will cooperate more so than a reliance on reason. Second, these beliefs are generally accurate — those who act based on emotion are more likely to cooperate than those who act based on reason. Third, individuals’ behavioral responses towards signals of emotion and reason depend on their own decision mode: those who rely on emotion tend to conditionally cooperate (that is, cooperate only when they believe that their partner has cooperated), whereas those who rely on reason tend to defect regardless of their partner’s signal. These findings shed light on how different decision processes, and lay theories about decision processes, facilitate and impede cooperation.
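The conditional-cooperation pattern the abstract describes can be sketched as a toy one-shot prisoner's dilemma. This is an illustrative model, not the authors' experimental design; the payoff values, function names, and the deterministic decision rules are assumptions made for the sketch:

```python
# Toy model of the abstract's third result: "emotion" players conditionally
# cooperate based on their partner's signaled decision mode, while "reason"
# players defect regardless of the partner's signal.

# Standard prisoner's dilemma payoffs (assumed values), keyed by
# (player_cooperates, partner_cooperates) -> (player_payoff, partner_payoff).
PAYOFFS = {
    (True, True): (3, 3),    # mutual cooperation
    (True, False): (0, 5),   # player exploited
    (False, True): (5, 0),   # player exploits
    (False, False): (1, 1),  # mutual defection
}

def decide(own_mode, partner_signal):
    """Return True to cooperate, given one's own decision mode and the
    partner's signaled mode."""
    if own_mode == "reason":
        return False  # reason-based players defect regardless of signal
    # Emotion-based players conditionally cooperate: they cooperate only
    # when the partner's signal suggests the partner will cooperate too.
    return partner_signal == "emotion"

def play(mode_a, mode_b):
    """Play one round; each player observes the other's signaled mode."""
    a_coop = decide(mode_a, mode_b)
    b_coop = decide(mode_b, mode_a)
    return PAYOFFS[(a_coop, b_coop)]

print(play("emotion", "emotion"))  # (3, 3): mutual cooperation
print(play("emotion", "reason"))   # (1, 1): the emotion player withholds
print(play("reason", "reason"))    # (1, 1): mutual defection
```

Under these assumed rules, mutual cooperation emerges only between two emotion-signaling players, which mirrors the paper's finding that emotion signals facilitate cooperation while reason signals impede it.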

Available at SSRN: https://ssrn.com/abstract=2922765

Editor's note: This research has implications for developing the therapeutic relationship.

Sunday, March 26, 2017

Moral Enhancement Using Non-invasive Brain Stimulation

R. Ryan Darby and Alvaro Pascual-Leone
Front. Hum. Neurosci., 22 February 2017
https://doi.org/10.3389/fnhum.2017.00077

Biomedical enhancement refers to the use of biomedical interventions to improve capacities beyond normal, rather than to treat deficiencies due to diseases. Enhancement can target physical or cognitive capacities, but also complex human behaviors such as morality. However, the complexity of normal moral behavior makes it unlikely that morality is a single capacity that can be deficient or enhanced. Instead, our central hypothesis will be that moral behavior results from multiple, interacting cognitive-affective networks in the brain. First, we will test this hypothesis by reviewing evidence for modulation of moral behavior using non-invasive brain stimulation. Next, we will discuss how this evidence affects ethical issues related to the use of moral enhancement. We end with the conclusion that while brain stimulation has the potential to alter moral behavior, such alteration is unlikely to improve moral behavior in all situations, and may even lead to less morally desirable behavior in some instances.

The article is here.

Saturday, January 21, 2017

Elevation: A review of scholarship on a moral and other-praising emotion

Andrew L. Thomson and Jason T. Siegel
The Journal Of Positive Psychology 

Abstract

The term elevation (also referred to as moral elevation), described by Thomas Jefferson and later coined by Jonathan Haidt, refers to the suite of feelings people may experience when witnessing an instance of moral beauty. The construct of elevation signifies the emotion felt when a person is a witness to, but not a recipient of, the moral behavior of others. Scholarship examining elevation has burgeoned since Haidt first introduced the construct. Researchers have explored the antecedents of, and outcomes associated with, witnessing instances of moral beauty. The current review will outline the existing scholarship on elevation, highlight conflicting findings, point out critical gaps in the current state of elevation research, and delineate fertile future directions for basic and applied research. Continued investigation of the affective, motivational, and behavioral responses associated with witnessing virtuous actions of others is warranted.

The research is here.

Tuesday, December 20, 2016

The Role of Emotional Intuitions in Moral Judgments and Decisions

Gee, Catherine. 2014.
Journal of Cognition and Neuroethics 2 (1): 161–171.

Abstract

Joshua D. Greene asserts in his 2007 article “The Secret Joke of Kant’s Soul” that consequentialism is the superior moral theory compared to deontology due to its judgments arising from “cognitive” processes alone without (or very little) input from emotive processes. However, I disagree with Greene’s position and instead argue it is the combination of rational and emotive cognitive processes that is the key to forming a moral judgment. Studies on patients who suffered damage to their ventromedial prefrontal cortex will be discussed as they are real-life examples of individuals who, due to brain damage, make moral judgments based predominantly on “cognitive” processes. These examples will demonstrate that the results of isolated “cognitive” mental processing are hardly what Greene envisioned. Instead of superior processing and judgments, these individuals show significant impairment. As such, Greene’s account ought to be dismissed, for it does not stand up to philosophical scrutiny or the psychological literature on this topic.

The article is here.

Thursday, November 17, 2016

Can Machines Become Moral?

Don Howard
Big Questions Online
Originally published October 23, 2016

Here is an excerpt:

There is an important lesson here, which applies with equal force to the claim that robots cannot comprehend emotion. It is that what can or cannot be done in the domain of artificial intelligence is always an empirical question, the answer to which will have to await the results of further research and development. Confident a priori assertions about what science and engineering cannot achieve have a history of turning out to be wrong, as with Auguste Comte’s bold claim in the 1830s that science could never reveal the internal chemical constitution of the sun and other heavenly bodies, a claim he made at just the time when scientists like Fraunhofer, Foucault, Kirchhoff, and Bunsen were pioneering the use of spectrographic analysis for precisely that task.

The article is here.

Wednesday, September 28, 2016

Psychopathy increases perceived moral permissibility of accidents

Young, Liane; Koenigs, Michael; Kruepke, Michael; Newman, Joseph P.
Journal of Abnormal Psychology, Vol 121(3), Aug 2012, 659-667.

Abstract

Psychopaths are notorious for their antisocial and immoral behavior, yet experimental studies have typically failed to identify deficits in their capacities for explicit moral judgment. We tested 20 criminal psychopaths and 25 criminal nonpsychopaths on a moral judgment task featuring hypothetical scenarios that systematically varied an actor's intention and the action's outcome. Participants were instructed to evaluate four classes of actions: accidental harms, attempted harms, intentional harms, and neutral acts. Psychopaths showed a selective difference, compared with nonpsychopaths, in judging accidents, where one person harmed another unintentionally. Specifically, psychopaths judged these actions to be more morally permissible. We suggest that this pattern reflects psychopaths' failure to appreciate the emotional aspect of the victim's experience of harm. These findings provide direct evidence of abnormal moral judgment in psychopathy.

The article is here.

Saturday, September 10, 2016

Rational and Emotional Sources of Moral Decision-Making: an Evolutionary-Developmental Account

Denton, K.K. & Krebs, D.L.
Evolutionary Psychological Science (2016). pp 1-14.

Abstract

Some scholars have contended that moral decision-making is primarily rational, mediated by controlled, deliberative, and reflective forms of moral reasoning. Others have contended that moral decision-making is primarily emotional, mediated by automatic, affective, and intuitive forms of decision-making. Evidence from several lines of research suggests that people make moral decisions in both of these ways. In this paper, we review psychological and neurological evidence supporting dual-process models of moral decision-making and discuss research that has attempted to identify triggers for rational-reflective and emotional-intuitive processes. We argue that attending to the ways in which brain mechanisms evolved and develop throughout the life span supplies a basis for explaining why people possess the capacity to engage in two forms of moral decision-making, as well as accounting for the attributes that define each type and predicting when the mental mechanisms that mediate each of them will be activated and when one will override the other. We close by acknowledging that neurological research on moral decision-making mechanisms is in its infancy and suggesting that future research should be directed at distinguishing among different types of emotional, intuitive, rational, and reflective processes; refining our knowledge of the brain mechanisms implicated in different forms of moral judgment; and investigating the ways in which these mechanisms interact to produce moral decisions.

The article is here.

Sunday, August 28, 2016

What Is Happening to Our Country? How Psychology Can Respond to Political Polarization, Incivility and Intolerance



As political events in Europe and America got stranger and more violent over the last year, I found myself thinking of the phrase “things fall apart; the center cannot hold.” I didn’t know its origin so I looked it up, found the poem The Second Coming, by W. B. Yeats, and found a great deal of wisdom. Yeats wrote it in 1919, just after the First World War and at the beginning of the Irish War of Independence.

The entire web page is here.

Sunday, May 8, 2016

Neuroscience is changing the debate over what role age should play in the courts

By Tim Requarth
Newsweek
Originally posted April 18, 2016

Here is an excerpt:

The Supreme Court has increasingly called upon new findings in neuroscience and psychology in a series of rulings over the past decade (Roper v. Simmons, Graham v. Florida, Miller v. Alabama and Montgomery v. Louisiana) that prohibited harsh punishments—such as the death penalty and mandatory life without parole—for offenders under 18. Due to their immaturity, the argument goes, they are less culpable and so deserve less punishment than those 18 or older. In addition, because their wrongdoing is often the product of immaturity, younger criminals may have a greater potential for reform. Now people are questioning whether the age of 18 has any scientific meaning.

“People are not magically different on their 18th birthday,” says Elizabeth Scott, a professor of law at Columbia University whose work was cited in the seminal Roper case. “Their brains are still maturing, and the criminal justice system should find a way to take that into account.”

The article is here.

Friday, April 29, 2016

No, You Can’t Feel Sorry for Everyone

BY Adam Waytz
Nautilus
Originally posted April 14, 2015

Here is an excerpt:

Morality can’t be everywhere at once—we humans have trouble extending equal compassion to foreign earthquake victims and hurricane victims in our own country. Our capacity to feel and act prosocially toward another person is finite. And one moral principle can constrain another. Even political liberals who prize universalism recoil when it distracts from a targeted focus on socially disadvantaged groups. Empathy draws our attention toward particular targets, and whether that target represents the underprivileged, blood relatives, refugees from a distant country, or players on a sports team, those targets obscure our attention from other equally (or more) deserving ones.

That means we need to abandon an idealized cultural sensitivity that gives all moral values equal importance. We must instead focus our limited moral resources on a few values, and make tough choices about which ones are more important than others. Collectively, we must decide that these actions affect human happiness more than those actions, and therefore the first set must be deemed more moral than the second set.

The article is here.

Friday, April 8, 2016

Why Therapists Should Talk Politics

By Richard Brouillette
The New York Times
Originally published March 15, 2016

Here is an excerpt:

Typically, therapists avoid discussing social and political issues in sessions. If the patient raises them, the therapist will direct the conversation toward a discussion of symptoms, coping skills, the relevant issues in a patient’s childhood and family life. But I am growing more and more convinced that this is inadequate. Psychotherapy, as a field, is not prepared to respond to the major social issues affecting our patients’ lives.

When people can’t live up to the increasingly taxing demands of the economy, they often blame themselves and then struggle to live with the guilt. You see this same tendency, of course, in a variety of contexts, from children of divorce who feel responsible for their parents’ separation to the “survivor guilt” of those who live through disasters. In situations that may seem impossible or unacceptable, guilt becomes a shield for the anger you otherwise would feel: The child may be angry with her parents for divorcing, the survivor may be angry with those who perished.

The article is here.

Saturday, January 9, 2016

Moral judgment as information processing: an integrative review

Steve Guglielmo
Front Psychol. 2015; 6: 1637.
Published online 2015 Oct 30. doi: 10.3389/fpsyg.2015.01637

Abstract

How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

The entire article is here.

Thursday, October 29, 2015

Choosing Empathy

A Conversation with Jamil Zaki
The Edge
Originally published October 19, 2015

Here are some excerpts:

The first narrative is that empathy is automatic. This goes all the way back to Adam Smith, who, to me, generated the first modern account of empathy in his beautiful book, The Theory of Moral Sentiments. Smith described what he called the "fellow-feeling," through which people take on each other's states—very similar to what I would call experience sharing.              

(cut)

That's one narrative, that empathy is automatic, and again, it’s compelling—backed by lots of evidence. But if you believe that empathy always occurs automatically, you run into a freight train of evidence to the contrary. As many of us know, there are lots of instances in which people could feel empathy, but don't. The prototype case here is intergroup settings. People who are divided by a war, or a political issue, or even a sports rivalry, often experience a collapse of their empathy. In many cases, these folks feel apathy for others on the other side of a group boundary. They fail to share, or think about, or feel concern for those other people's emotions.              

In other cases, it gets even worse: people feel overt antipathy towards others, for instance, taking pleasure when some misfortune befalls someone on the other side of a group boundary. What's interesting to me is that this occurs not only for group boundaries that are meaningful, like ethnicity or religion, but totally arbitrary groups. If I were to divide us into a red and blue team, without that taking on any more significance, you would be more likely to experience empathy for fellow red team members than for me (apparently I'm on team blue today).  

The entire post and video is here.

Saturday, October 3, 2015

Neural Foundation of Morality

Roland Zahn, Ricardo de Oliveira-Souza, & Jorge Moll
International Encyclopedia of the Social & Behavioral Sciences (Second Edition)
2015, Pages 606–618

Moral behavior is one of the most sophisticated human abilities. Many social species behave altruistically toward their kin, but humans are unique in their ability to serve complex and changing societal needs. Cognitive neuroscience has started to elucidate specific brain mechanisms underpinning moral behavior, emotion, and motivation, emphasizing that these ingredients are also germane to human biology, rather than pure societal artifacts. The brain is where psychosocial learning and biology meet to produce the rich individual variability in moral behavior. This article discusses how cognitive neuroscience improves the understanding of this variability and associated suffering in neuropsychiatric conditions.

The entire article is here.

Tuesday, June 23, 2015

Increased Grey Matter in Those With Higher Levels of Moral Reasoning

Neuroscience News
Originally published June 3, 2015

Research from Penn scientists and business scholars aims to link moral reasoning with brain architecture.

Individuals with a higher level of moral reasoning skills showed increased gray matter in the areas of the brain implicated in complex social behavior, decision making, and conflict processing as compared to subjects at a lower level of moral reasoning, according to new research from the Perelman School of Medicine and the Wharton School of the University of Pennsylvania in collaboration with a researcher from Charité Universitätsmedizin in Berlin, Germany. The team studied students in the Master of Business Administration (MBA) program at the Wharton School. The work is published in the June 3rd edition of the journal PLOS ONE.

The article is here.

Friday, May 8, 2015

TMS affects moral judgment, showing the role of DLPFC and TPJ in cognitive and emotional processing

Jeurissen D, Sack AT, Roebroeck A, Russ BE and Pascual-Leone A (2014) TMS affects moral judgment, showing the role of DLPFC and TPJ in cognitive and emotional processing.
Front. Neurosci. 8:18. doi: 10.3389/fnins.2014.00018

Decision-making involves a complex interplay of emotional responses and reasoning processes. In this study, we use TMS to explore the neurobiological substrates of moral decisions in humans. To examine the effects of TMS on the outcome of a moral decision, we compare the decision outcome of moral-personal and moral-impersonal dilemmas to each other and examine the differential effects of applying TMS over the right DLPFC or right TPJ. In this comparison, we find that the TMS-induced disruption of the DLPFC during the decision process affects the outcome of the moral-personal judgment, while TMS-induced disruption of TPJ affects only moral-impersonal conditions. In other words, we find a double dissociation between DLPFC and TPJ in the outcome of a moral decision. Furthermore, we find that TMS-induced disruption of the DLPFC during non-moral, moral-impersonal, and moral-personal decisions leads to lower ratings of regret about the decision. Our results are in line with the dual-process theory and suggest a role for both the emotional response and cognitive reasoning process in moral judgment. Both the emotional and cognitive processes were shown to be involved in the decision outcome.

The entire article is here.

Tuesday, April 21, 2015

Being There: Heidegger on Why Our Presence Matters

By Lawrence Berger
The New York Times - Opinionator
Originally published March 30, 2015

Here is an excerpt:

It can be argued that cognitive scientists tend to ignore the importance of what many consider to be essential features of human existence, preferring to see us as information processors rather than full-blooded human beings immersed in worlds of significance. In general, their intent is to explain human activity and life as we experience it on the basis of physical and physiological processes, the implicit assumption being that this is the domain of what is ultimately real. Since virtually everything that matters to us as human beings can be traced back to life as it is experienced, such thinking is bound to be unsettling.

The entire article is here.