Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Moral Intuition.

Saturday, December 31, 2016

The Wright Show: Against Empathy

Robert Wright interviews Paul Bloom on his book "Against Empathy."
The Wright Show
Originally published December 6, 2016


Thursday, October 20, 2016

Cognitive biases can affect moral intuitions about cognitive enhancement

Lucius Caviola, Adriano Mannino, Julian Savulescu and Nadira Faulmüller
Frontiers in Systems Neuroscience. 2014; 8: 195.
Published online 2014 Oct 15.

Abstract

Research into cognitive biases that impair human judgment has mostly been applied to the area of economic decision-making. Ethical decision-making has been comparatively neglected. Since ethical decisions often involve very high individual as well as collective stakes, analyzing how cognitive biases affect them can be expected to yield important results. In this theoretical article, we consider the ethical debate about cognitive enhancement (CE) and suggest a number of cognitive biases that are likely to affect moral intuitions and judgments about CE: status quo bias, loss aversion, risk aversion, omission bias, scope insensitivity, nature bias, and optimistic bias. We find that there are more well-documented biases that are likely to cause irrational aversion to CE than biases in the opposite direction. This suggests that common attitudes about CE are predominantly negatively biased. Within this new perspective, we hope that subsequent research will be able to elaborate this hypothesis and develop effective de-biasing techniques that can help increase the rationality of the public CE debate and thus improve our ethical decision-making.

The article is here.

Tuesday, October 18, 2016

Folk Moral Relativism

Hagop Sarkissian, John Park, David Tien, Jennifer Wright & Joshua Knobe
Mind and Language 26 (4):482-505 (2011)

Abstract:

It has often been suggested that people's ordinary understanding of morality involves a belief in objective moral truths and a rejection of moral relativism. The results of six studies call this claim into question. Participants did offer apparently objectivist moral intuitions when considering individuals from their own culture, but they offered increasingly relativist intuitions when considering individuals from increasingly different cultures or ways of life. The authors hypothesize that people do not have a fixed commitment to moral objectivism but instead tend to adopt different views depending on the degree to which they consider radically different perspectives on moral questions.

The article is here.

Sunday, August 28, 2016

What Is Happening to Our Country? How Psychology Can Respond to Political Polarization, Incivility and Intolerance

Here is an excerpt:
As political events in Europe and America got stranger and more violent over the last year, I found myself thinking of the phrase “things fall apart; the center cannot hold.” I didn’t know its origin so I looked it up, found the poem The Second Coming, by W. B. Yeats, and found a great deal of wisdom. Yeats wrote it in 1919, just after the First World War and at the beginning of the Irish War of Independence.

The entire web page is here.

Wednesday, June 22, 2016

Moral intuitions: Are philosophers experts?

Kevin Tobia, Wesley Buckwalter, and Stephen Stich
Philosophical Psychology, 26(5): 629-638.

Abstract

Recently, psychologists and experimental philosophers have reported findings showing that in some cases ordinary people’s moral intuitions are affected by factors of dubious relevance to the truth of the content of the intuition. Some defend the use of intuition as evidence in ethics by arguing that philosophers are the experts in this area, and that philosophers’ moral intuitions are both different from those of ordinary people and more reliable. We conducted two experiments indicating that philosophers and non-philosophers do indeed sometimes have different moral intuitions, but challenging the notion that philosophers have better or more reliable intuitions.

The article is here.

Sunday, May 22, 2016

Is Deontology a Moral Confabulation?

Emilian Mihailov
Neuroethics
April 2016, Volume 9, Issue 1, pp 1-13

Abstract

Joshua Greene has put forward the bold empirical hypothesis that deontology is a confabulation of moral emotions. Deontological philosophy does not stem from "true" moral reasoning, but from emotional reactions, backed up by post hoc rationalizations which play no role in generating the initial moral beliefs. In this paper, I will argue against the confabulation hypothesis. First, I will highlight several points in Greene’s discussion of confabulation and identify two possible models. Then, I will argue that the evidence does not illustrate the relevant model of deontological confabulation. In fact, I will make the case that deontology is unlikely to be a confabulation because alarm-like emotions, which allegedly drive deontological theorizing, are resistant to confabulation. I will end by clarifying what kinds of claims the confabulation data can support. The upshot of the final section is that confabulation data cannot be used to undermine deontological theory in itself and, ironically, that if one commits to the claim that a deontological justification is a confabulation in a particular case, then the data suggest that deontology in general has prima facie validity.

The article is here.

Friday, December 4, 2015

Folk Moral Relativism

Hagop Sarkissian, John Park, David Tien, Jennifer Wright & Joshua Knobe
Mind and Language 26 (4):482-505 (2011)

Abstract

It has often been suggested that people's ordinary understanding of morality involves a belief in objective moral truths and a rejection of moral relativism. The results of six studies call this claim into question. Participants did offer apparently objectivist moral intuitions when considering individuals from their own culture, but they offered increasingly relativist intuitions when considering individuals from increasingly different cultures or ways of life. The authors hypothesize that people do not have a fixed commitment to moral objectivism but instead tend to adopt different views depending on the degree to which they consider radically different perspectives on moral questions.

The entire article is here.

Tuesday, November 3, 2015

The neuroscience of moral cognition: from dual processes to dynamic systems

Jay J Van Bavel, Oriel FeldmanHall, Peter Mende-Siedlecki
Current Opinion in Psychology
Volume 6, December 2015, Pages 167–172

Abstract

Prominent theories of morality have integrated philosophy with psychology and biology. Although this approach has been highly generative, we argue that it does not fully capture the rich and dynamic nature of moral cognition. We review research from the dual-process tradition, in which moral intuitions are automatically elicited and reasoning is subsequently deployed to correct these initial intuitions. We then describe how the computations underlying moral cognition are diverse and widely distributed throughout the brain. Finally, we illustrate how social context modulates these computations, recruiting different systems for real (vs. hypothetical) moral judgments, and examine the dynamic process by which moral judgments are updated. In sum, we advocate for a shift from dual-process to dynamic system models of moral cognition.

The entire article is here.

Sunday, September 27, 2015

Emotional and Utilitarian Appraisals of Moral Dilemmas Are Encoded in Separate Areas of the Brain

Cendri A. Hutcherson, Leila Montaser-Kouhsari, James Woodward, & Antonio Rangel
The Journal of Neuroscience, 9 September 2015, 35(36): 12593-12605
doi: 10.1523/JNEUROSCI.3402-14.2015

Abstract

Moral judgment often requires making difficult tradeoffs (e.g., is it appropriate to torture to save the lives of innocents at risk?). Previous research suggests that both emotional appraisals and more deliberative utilitarian appraisals influence such judgments and that these appraisals often conflict. However, it is unclear how these different types of appraisals are represented in the brain, or how they are integrated into an overall moral judgment. We addressed these questions using an fMRI paradigm in which human subjects provide separate emotional and utilitarian appraisals for different potential actions, and then make difficult moral judgments constructed from combinations of these actions. We found that anterior cingulate, insula, and superior temporal gyrus correlated with emotional appraisals, whereas temporoparietal junction and dorsomedial prefrontal cortex correlated with utilitarian appraisals. Overall moral value judgments were represented in an anterior portion of the ventromedial prefrontal cortex. Critically, the pattern of responses and functional interactions between these three sets of regions are consistent with a model in which emotional and utilitarian appraisals are computed independently and in parallel, and passed to the ventromedial prefrontal cortex where they are integrated into an overall moral value judgment.

Significance statement

Popular accounts of moral judgment often describe it as a battle for control between two systems, one intuitive and emotional, the other rational and utilitarian, engaged in winner-take-all inhibitory competition. Using a novel fMRI paradigm, we identified distinct neural signatures of emotional and utilitarian appraisals and used them to test different models of how they compete for the control of moral behavior. Importantly, we find little support for competitive inhibition accounts. Instead, moral judgments resembled the architecture of simple economic choices: distinct regions represented emotional and utilitarian appraisals independently and passed this information to the ventromedial prefrontal cortex for integration into an overall moral value signal.

The entire article is here.

Monday, August 24, 2015

Good Without Knowing it: Subtle Contextual Cues can Activate Moral Identity and Reshape Moral Intuition

Keith Leavitt, Lei Zhu, Karl Aquino
Journal of Business Ethics
Published online July 30, 2015

Abstract

The role of moral intuition (i.e., a set of implicit processes which occur automatically and at the fringe of conscious awareness) has been increasingly implicated in business decisions and (un)ethical business behavior. But troublingly, because implicit processes often operate outside of conscious awareness, decision makers are generally unaware of their influence. We tested whether subtle contextual cues for identity can alter implicit beliefs. In two studies, we found that contextual cues which nonconsciously prime moral identity weaken the implicit association between the categories of “business” and “ethical,” an implicit association which has previously been linked to unethical decision making. Further, changes in this implicit association mediated the relationship between contextually primed moral identity and concern for external stakeholder groups, regardless of self-reported moral identity. Thus, our results show that subtle contextual cues can lead individuals to render more ethical judgments, by automatically restructuring moral intuition below the level of consciousness.

The entire article is here.

Thursday, August 20, 2015

Life After Faith

Richard Marshall interviews Philip Kitcher
3:AM Magazine
Originally published on August 2, 2015

Here is an excerpt:

Thought experiments work when, and only when, they call into action cognitive capacities that might reliably deliver the conclusions drawn. When the question posed is imprecise, your thought experiment is typically useless. But even more crucial is the fact that the stripped-down scenarios many philosophers love simply don’t mesh with our intellectual skills. The story rules out by fiat the kinds of reactions we naturally have in the situation described. Think of the trolley problem in which you are asked to decide whether to push the fat man off the bridge. If you imagine yourself – seriously imagine yourself – in the situation, you’d look around for alternatives, you’d consider talking to the fat man, volunteering to jump with him, etc. etc. None of that is allowed. So you’re offered a forced choice about which most people I know are profoundly uneasy. The “data” delivered are just the poor quality evidence any reputable investigator would worry about using. (I like Joshua Greene’s fundamental idea of investigating people’s reactions; but I do wish he’d present them with better questions.)

Philosophers love to appeal to their “intuitions” about these puzzle cases. They seem to think they have access to little nuggets of wisdom. We’d all be much better off if the phrase “My intuition is …” were replaced by “Given my evolved psychological adaptations and my distinctive enculturation, when faced by this perplexing scenario, I find myself, more or less tentatively, inclined to say …” Maybe there are occasions in which the cases bring out some previously unnoticed facet of the meaning of a word. But, for a pragmatist like me, the important issues concern the words we might deploy to achieve our purposes, rather than the language we actually use.

If the intuition-mongering were abandoned, would that be the end of philosophy? It would be the end of a certain style of philosophy – a style that has cut philosophy off, not only from the humanities but from every other branch of inquiry and culture. (In my view, most of current Anglophone philosophy is quite reasonably seen as an ingrown conversation pursued by very intelligent people with very strange interests.) But it would hardly stop the kinds of investigation that the giants of the past engaged in. In my view, we ought to replace the notion of analytic philosophy by that of synthetic philosophy. Philosophers ought to aspire to know lots of different things and to forge useful synthetic perspectives.

The entire interview is here.

Friday, June 12, 2015

Confirmation Bias and the Limits of Human Knowledge

By Peter Wehner
Commentary Magazine
Originally published May 27, 2015

Here is an excerpt:

Confirmation bias is something we can easily identify in others but find very difficult to detect in ourselves. (If you finish this piece thinking only of the blindness of those who disagree with you, you are proving my point.) And while some people are far more prone to it than others, it’s something none of us is fully free of. We all hold certain philosophical assumptions, whether we’re fully aware of them or not, and they create a prism through which we interpret events. Often those assumptions are not arrived at through empiricism; they are grounded in moral intuitions. And moral intuitions, while not sub-rational, are shaped by things other than facts and figures. “The heart has its reasons which reason itself does not know,” Pascal wrote. And often the heart is right.

Without such core intuitions, we could not hope to make sense of the world. But these intuitions do not stay broad and implicit: we use them to make concrete judgments in life. The consequences of those judgments offer real-world tests of our assumptions, and if we refuse to learn from the results then we have no hope of improving our judgments in the future.

The entire article is here.

Thursday, April 23, 2015

Moral foundations and political attitudes: The moderating role of political sophistication

By Patrizia Milesi
The International Journal of Psychology
Originally published February 26, 2015

Abstract

Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates.

The entire article is here.

Tuesday, February 24, 2015

The Importance of Moral Construal

Moral versus Non-Moral Construal Elicits Faster, More Extreme, Universal Evaluations of the Same Actions

By Jay J. Van Bavel, Dominic J. Packer, Ingrid J. Haas, and William A. Cunningham
PLoS ONE 7(11): e48693. doi:10.1371/journal.pone.0048693

Abstract

Over the past decade, intuitionist models of morality have challenged the view that moral reasoning is the sole or even primary means by which moral judgments are made. Rather, intuitionist models posit that certain situations automatically elicit moral intuitions, which guide moral judgments. We present three experiments showing that evaluations are also susceptible to the influence of moral versus non-moral construal. We had participants make moral evaluations (rating whether actions were morally good or bad) or non-moral evaluations (rating whether actions were pragmatically or hedonically good or bad) of a wide variety of actions. As predicted, moral evaluations were faster, more extreme, and more strongly associated with universal prescriptions—the belief that absolutely nobody or everybody should engage in an action—than non-moral (pragmatic or hedonic) evaluations of the same actions. Further, we show that people are capable of flexibly shifting from moral to non-moral evaluations on a trial-by-trial basis. Taken together, these experiments provide evidence that moral versus non-moral construal has an important influence on evaluation and suggests that effects of construal are highly flexible. We discuss the implications of these experiments for models of moral judgment and decision-making.

The entire article is here.

Saturday, May 3, 2014

Religiosity, Political Orientation, and Consequentialist Moral Thinking

By Jared Piazza and Paulo Sousa
Social Psychological and Personality Science April 2014 vol. 5 no. 3 334-342

Abstract

Three studies demonstrated that the moral judgments of religious individuals and political conservatives are highly insensitive to consequentialist (i.e., outcome-based) considerations. In Study 1, both religiosity and political conservatism predicted a resistance toward consequentialist thinking concerning a range of transgressive acts, independent of other relevant dispositional factors (e.g., disgust sensitivity). Study 2 ruled out differences in welfare sensitivity as an explanation for these findings. In Study 3, religiosity and political conservatism predicted a commitment to judging “harmless” taboo violations morally impermissible, rather than discretionary, despite the lack of negative consequences arising from the act. Furthermore, non-consequentialist thinking style was shown to mediate the relationship that religiosity/conservatism had with impermissibility judgments, while intuitive thinking style did not. These data provide further evidence for the influence of religious and political commitments in motivating divergent moral judgments, while highlighting a new dispositional factor, non-consequentialist thinking style, as a mediator of these effects.

The entire article is here.

Sunday, March 16, 2014

The Failure of Social and Moral Intuitions

Edge Videos
HeadCon '13: Part IX
David Pizarro

Today I want to talk a little about our social and moral intuitions and I want to present a case that they're rapidly failing, more so than ever. Let me start with an example. Recently, I collaborated with economist Rob Frank, roboticist Cynthia Breazeal, and social psychologist David DeSteno. The experiment that we did was interested in looking at how we detect trustworthiness in others.

We had people interact—strangers interact in the lab—and we filmed them, and we got the cues that seemed to indicate that somebody's going to be either more cooperative or less cooperative. But the fun part of this study was that for the second part we got those cues and we programmed a robot—Nexi the robot, from the lab of Cynthia Breazeal at MIT—to emulate, in one condition, those non-verbal gestures. So what I'm talking about today is not about the results of that study, but rather what was interesting about looking at people interacting with the robot.



The entire page is here.