Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Moral Norms.

Monday, March 14, 2022

Can you be too moral?


Tim Dean
TEDx Sydney

Interesting introduction to moral certainty and dichotomous thinking.

One of the biggest challenges of our time is not people without morals, according to philosopher Tim Dean. It is often those with unwavering moral convictions who are the most dangerous. Tim challenges us to change the way we think about morality, right and wrong, to be more adaptable in order to solve the new and emerging problems of our modern lives. Tim Dean is a Sydney-based philosopher and science writer. He is the author of How We Became Human, a book about how our evolved moral minds are out of step with the modern world. He has a Doctorate in philosophy from the University of New South Wales on the evolution of morality and has expertise in ethics, philosophy of biology and critical thinking. 

Monday, November 15, 2021

On Defining Moral Enhancement: A Clarificatory Taxonomy

Carl Jago
Journal of Experimental Social Psychology
Volume 95, July 2021, 104145

Abstract

In a series of studies, we ask whether and to what extent the base rate of a behavior influences associated moral judgment. Previous research aimed at answering different but related questions is suggestive of such an effect. However, these other investigations involve injunctive norms and special reference groups, which are inappropriate for an examination of the effects of base rates per se. Across five studies, we find that, when properly isolated, base rates do indeed influence moral judgment, but they do so with only very small effect sizes. In another study, we test the possibility that the very limited influence of base rates on moral judgment could be a result of a general phenomenon such as the fundamental attribution error, which is not specific to moral judgment. The results suggest that moral judgment may be uniquely resilient to the influence of base rates. In a final pair of studies, we test secondary hypotheses that injunctive norms and special reference groups would inflate any influence on moral judgments relative to base rates alone. The results supported those hypotheses.
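Effect sizes in this kind of between-condition design are commonly expressed as standardized mean differences (Cohen's d). As a purely illustrative aid, and not a description of the paper's actual analysis, the sketch below shows how a base-rate effect on wrongness ratings could be quantified this way; the function and ratings are hypothetical.

```python
# Illustrative sketch only: quantifying a base-rate effect on moral judgment
# as Cohen's d. The ratings below are hypothetical, not data from these studies.
import statistics

def cohens_d(group_a: list[float], group_b: list[float]) -> float:
    """Cohen's d using the pooled standard deviation of two independent groups."""
    n_a, n_b = len(group_a), len(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical wrongness ratings (1-7 scale) under low vs. high base-rate conditions.
low_base_rate = [5.1, 5.4, 4.9, 5.6, 5.2, 5.0]
high_base_rate = [5.0, 5.3, 4.8, 5.5, 5.2, 4.9]
print(f"d = {cohens_d(low_base_rate, high_base_rate):.2f}")  # standardized mean difference
```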

From the General Discussion

In multiple experiments aimed at examining the influence of base rates per se, we found that base rates do indeed influence judgments, but the size of the effect we observed was very small. We considered that, in discovering moral judgments’ resilience to influence from base rates, we may have only rediscovered a general tendency, such as the fundamental attribution error, whereby people discount situational factors. If so, this tendency would then also apply broadly to non-moral scenarios. We therefore conducted another study in which our experimental materials were modified so as to remove the moral components. We found a substantial base-rate effect on participants’ judgments of performance regarding non-moral behavior. This finding suggests that the resilience to base rates observed in the preceding studies is unlikely to be the result of a more general tendency, and may instead be unique to moral judgment.

The main reasons why we concluded that the results from the most closely related extant research could not answer the present research question were the involvement in those studies of injunctive norms and special reference groups. To confirm that these factors could inflate any influence of base rates on moral judgment, in the final pair of studies we modified our experiments so as to include them. Specifically, in one study, we crossed prescriptive and proscriptive injunctive norms with high and low base rates and found that the impact of an injunctive norm outweighs any impact of the base rate. In the other study, we found that simply mentioning, for example, that there were some good people among those who engaged in a high base-rate behavior resulted in a large effect on moral judgment, not only on judgments of the target’s character but also on judgments of blame and wrongness.

Monday, February 17, 2020

Religion’s Impact on Conceptions of the Moral Domain

S. Levine and others
PsyArXiv Preprints
Last edited 2 Jan 20

Abstract

How does religious affiliation impact conceptions of the moral domain? Putting aside the question of whether people from different religions agree about how to answer moral questions, here we investigate a more fundamental question: How much disagreement is there across religions about which issues count as moral in the first place? That is, do people from different religions conceptualize the scope of morality differently? Using a new methodology to map out how individuals conceive of the moral domain, we find dramatic differences among adherents of different religions. Mormons and Muslims moralize their religious norms, while Jews do not. Hindus do not seem to make a moral/non-moral distinction at all. These results suggest that religious affiliation has a profound effect on conceptions of the scope of morality.

From the General Discussion:

The results of Studies 3 and 3a are predicted by neither Social Domain Theory nor Moral Foundations Theory: It is neither true that secular people and religious people share a common conception of the moral domain (as Social Domain Theory argues), nor that religious morality is expanded beyond secular morality in a uniform manner (as Moral Foundations Theory suggests). When participants in a group did make a moral/non-moral distinction, there was broad agreement that norms related to harm, justice, and rights count as moral norms. However, some religious individuals (such as the Mormon and Muslim participants) also moralized norms from their own religion that are not related to these themes. Meanwhile, others (such as the Jewish participants) acknowledged the special status of their own norms but did not moralize them. Yet others (such as the Hindu participants) made no distinction between the moral and the non-moral.

The research is here.

Sunday, April 22, 2018

What is the ethics of ageing?

Christopher Simon Wareham
Journal of Medical Ethics 2018;44:128-132.

Abstract

Applied ethics is home to numerous productive subfields such as procreative ethics, intergenerational ethics and environmental ethics. By contrast, there is far less ethical work on ageing, and there is no boundary work that attempts to set the scope for ‘ageing ethics’ or the ‘ethics of ageing’. Yet ageing is a fundamental aspect of life; arguably even more fundamental and ubiquitous than procreation. To remedy this situation, I examine conceptions of what the ethics of ageing might mean and argue that these conceptions fail to capture the requirements of the desired subfield. The key reasons for this are, first, that they view ageing as something that happens only when one is old, thereby ignoring the fact that ageing is a process to which we are all subject, and second that the ageing person is treated as an object in ethical discourse rather than as its subject. In response to these shortcomings I put forward a better conception, one which places the ageing person at the centre of ethical analysis, has relevance not just for the elderly and provides a rich yet workable scope. While clarifying and justifying the conceptual boundaries of the subfield, the proposed scope pleasingly broadens the ethics of ageing beyond common negative associations with ageing.

The article is here.

Sunday, October 8, 2017

Moral outrage in the digital age

Molly J. Crockett
Nature Human Behaviour (2017)
Originally posted September 18, 2017

Moral outrage is an ancient emotion that is now widespread on digital media and online social networks. How might these new technologies change the expression of moral outrage and its social consequences?

Moral outrage is a powerful emotion that motivates people to shame and punish wrongdoers. Moralistic punishment can be a force for good, increasing cooperation by holding bad actors accountable. But punishment also has a dark side — it can exacerbate social conflict by dehumanizing others and escalating into destructive feuds.

Moral outrage is at least as old as civilization itself, but civilization is rapidly changing in the face of new technologies. Worldwide, more than a billion people now spend at least an hour a day on social media, and moral outrage is all the rage online. In recent years, viral online shaming has cost companies millions, candidates elections, and individuals their careers overnight.

As digital media infiltrates our social lives, it is crucial that we understand how this technology might transform the expression of moral outrage and its social consequences. Here, I describe a simple psychological framework for tackling this question (Fig. 1). Moral outrage is triggered by stimuli that call attention to moral norm violations. These stimuli evoke a range of emotional and behavioural responses that vary in their costs and constraints. Finally, expressing outrage leads to a variety of personal and social outcomes. This framework reveals that digital media may exacerbate the expression of moral outrage by inflating its triggering stimuli, reducing some of its costs and amplifying many of its personal benefits.

The article is here.

Tuesday, January 31, 2017

Cognitive science suggests Trump makes us more accepting of the morally outrageous

Joshua Knobe
Vox.com
Updated January 10, 2017

Here is an excerpt:

At the core of this research is a very simple idea: When people are reasoning, they tend to think only about a relatively narrow range of possibilities. You are sitting there in a restaurant, trying to decide what to order. Almost immediately, you determine that you are going to get either the chocolate cake or the cheese plate. You then start to consider the merits and drawbacks of each option. "Should I get the chocolate cake? Nah, too many carbs. Better get the cheese plate." One important question about human cognition is how people end up choosing one option over the other in a case like this.

But there is another question here that is even more fundamental — so fundamental that it’s easy to overlook. How did you pick out those two options in the first place? After all, there’s an enormous range of other options that would, at least in principle, have been possible. You could have stormed into the kitchen and started eating directly out of the chef's saucepan. You could have reached under the table and started trying to eat your own shoe. Yet somehow you manage to reject all of these possibilities before the reasoning process even begins. It’s not as though you think, "Should I try to eat my shoe? No, it’s not very tasty, or even edible." Rather, possibilities like this one never even enter your reasoning at all.

This is where the notion of normality plays its most essential role. Of all the zillions of things that might be possible in principle, your mind is able to zero in on just a few specific possibilities, completely ignoring all the others. One aim of recent research has been to figure out how people do this. Though the research itself has been quite complex, the key conclusion is surprisingly straightforward: People show an impressive systematic tendency to completely ignore the possibilities they see as abnormal.

The article is here.

Tuesday, October 11, 2016

How US prisons violate three principles of criminal justice

Judith Lichtenberg
aeon.co
Originally published September 19, 2016

The United States has 5 per cent of the world’s population but 25 per cent of its prisoners. Right now, 2.2 million people are locked up across the country, and while crime has been decreasing since the 1990s, rates of imprisonment are at historic highs. Americans across the political spectrum are deeply dissatisfied with this state of affairs, and agree that mass incarceration costs too much and achieves too little. But there’s also much disagreement – about the role of systemic racism, about the causes of police violence, about the importance of personal responsibility and retribution.

Nevertheless, people can find common ground on three fundamental moral norms that should govern the use of imprisonment as punishment. First, punishments should be proportional to crimes. Second, like cases should be treated alike. Third, criminal punishment should not do more harm than good. Unfortunately, the US system violates each of these principles.

Proportionality requires that the punishment fit the crime. This is more than a mere cliché. It means punishments should be neither excessive nor insufficient. Imprisonment for a parking ticket would be wrong, but so would a slap on the wrist for rape.

Friday, September 30, 2016

Gender Differences in Responses to Moral Dilemmas: A Process Dissociation Analysis

Rebecca Friesdorf, Paul Conway, and Bertram Gawronski
Pers Soc Psychol Bull, first published on April 3, 2015
doi:10.1177/0146167215575731

Abstract

The principle of deontology states that the morality of an action depends on its consistency with moral norms; the principle of utilitarianism implies that the morality of an action depends on its consequences. Previous research suggests that deontological judgments are shaped by affective processes, whereas utilitarian judgments are guided by cognitive processes. The current research used process dissociation (PD) to independently assess deontological and utilitarian inclinations in women and men. A meta-analytic re-analysis of 40 studies with 6,100 participants indicated that men showed a stronger preference for utilitarian over deontological judgments than women when the two principles implied conflicting decisions (d = 0.52). PD further revealed that women exhibited stronger deontological inclinations than men (d = 0.57), while men exhibited only slightly stronger utilitarian inclinations than women (d = 0.10). The findings suggest that gender differences in moral dilemma judgments are due to differences in affective responses to harm rather than cognitive evaluations of outcomes.
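As background on the method: process dissociation in this literature estimates the two inclinations from responses to "congruent" dilemmas (where causing harm does not serve the greater good, so both principles reject it) and "incongruent" dilemmas (where harm does serve the greater good, so the principles conflict). The sketch below is a minimal illustration, assuming the standard equations of the Conway and Gawronski (2013) PD approach; the response proportions are hypothetical, not data from this meta-analysis.

```python
# Minimal, illustrative sketch of process dissociation (PD) parameter estimates,
# assumed here to follow Conway & Gawronski (2013). Inputs are the proportions of
# "harm is unacceptable" responses; the example values are hypothetical.

def process_dissociation(p_unacceptable_congruent: float,
                         p_unacceptable_incongruent: float) -> tuple[float, float]:
    """Return (U, D) from 'harm is unacceptable' response rates.

    Model:
        p(unacceptable | congruent)   = U + (1 - U) * D
        p(unacceptable | incongruent) = (1 - U) * D
    where U is the utilitarian inclination and D the deontological inclination.
    """
    u = p_unacceptable_congruent - p_unacceptable_incongruent
    d = p_unacceptable_incongruent / (1 - u)
    return u, d

# Hypothetical respondent: rejects harm in 90% of congruent and 55% of incongruent dilemmas.
u, d = process_dissociation(0.90, 0.55)
print(f"U = {u:.2f}, D = {d:.2f}")  # U = 0.35, D = 0.85
```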

The article is here.

Wednesday, March 25, 2015

Sacrifice One For the Good of Many? People Apply Different Moral Norms to Human and Robot Agents

By B.F. Malle, M. Scheutz, T. Arnold, J. Voiklis, and C. Cusimano
HRI '15, March 02 - 05 2015

Abstract

Moral norms play an essential role in regulating human interaction. With the growing sophistication and proliferation of robots, it is important to understand how ordinary people apply moral norms to robot agents and make moral judgments about their behavior. We report the first comparison of people’s moral judgments (of permissibility, wrongness, and blame) about human and robot agents. Two online experiments (total N = 316) found that robots, compared with human agents, were more strongly expected to take an action that sacrifices one person for the good of many (a “utilitarian” choice), and they were blamed more than their human counterparts when they did not make that choice. Though the utilitarian sacrifice was generally seen as permissible for human agents, they were blamed more for choosing this option than for doing nothing. These results provide a first step toward a new field of Moral HRI, which is well placed to help guide the design of social robots.

The entire article is here.

Saturday, August 16, 2014

Are Human Rights Redundant in the Ethical Codes of Psychologists?

Alfred Allan
Ethics & Behavior
Volume 23, Issue 4, 2013
DOI:10.1080/10508422.2013.776480

The codes of ethics and conduct of a number of psychology bodies explicitly refer to human rights, and the American Psychological Association recently expanded the use of the construct when it amended standard 1.02 of the Ethical Principles of Psychologists and Code of Conduct. What is unclear is how these references to human rights should be interpreted. In this article I examine the historical development of human rights and associated constructs and the contemporary meaning of human rights. As human rights are generally associated with law, morality, or religion, I consider to which of these forms the references most likely refer. I conclude that these references in ethical codes are redundant and that it would be preferable not to refer to human rights in codes. Instead, the profession should acknowledge human rights as a separate and complementary norm system that governs the behavior of psychologists, should ensure that psychologists have adequate knowledge of human rights, and should encourage them to promote human rights.

The entire article is here.

Monday, July 21, 2014

'Bad' video game behavior increases players' moral sensitivity

By Pat Donovan
Medical Xpress
Originally published June 27, 2014

New evidence suggests heinous behavior played out in a virtual environment can lead to players' increased sensitivity toward the moral codes they violated.

(cut)

"Rather than leading players to become less moral," Grizzard says, "this research suggests that violent video-game play may actually lead to increased moral sensitivity. This may, as it does in real life, provoke players to engage in voluntary behavior that benefits others."

The entire article is here.

Friday, January 3, 2014

Ideology Is Heritable Yet Societies Can Change Their Views Quickly

By Jonathan Haidt
Social Evolution Forum
Originally published December 16, 2013

Here is an excerpt:

From my perspective as a social psychologist, who studies morality from an evolutionary perspective, rapid attitude change is not hard to explain. I am impressed by the consistent data on heritability, showing that some very important parts of our moral and political views are innate. But innate does not mean hard-wired or unmalleable; it means “structured in advance of experience, and experience can edit and alter that first draft.” (That’s a paraphrase from Gary Marcus). So even if one is born predisposed to questioning authority and seeking out diversity, life experiences can still alter one’s habitual reactions. Becoming a parent, especially of girls, seems to make people more conservative (they perceive more threats in the world).

The entire article is here.

Tuesday, November 26, 2013

The Roots of Good and Evil: An Interview with Paul Bloom

By Sam Harris
Sam Harris Blog
Originally published November 12, 2013

Here is one excerpt:

Harris: What are the greatest misconceptions people have about the origins of morality?

Bloom: The most common misconception is that morality is a human invention. It’s like agriculture and writing, something that humans invented at some point in history. From this perspective, babies start off as entirely self-interested beings—little psychopaths—and only gradually come to appreciate, through exposure to parents and schools and church and television, moral notions such as the wrongness of harming another person.

Now, this perspective is not entirely wrong. Certainly some morality is learned; this has to be the case because moral ideals differ across societies. Nobody is born with the belief that sexism is wrong (a moral belief that you and I share) or that blasphemy should be punished by death (a moral belief that you and I reject). Such views are the product of culture and society. They aren’t in the genes.

But the argument I make in Just Babies is that there also exist hardwired moral universals—moral principles that we all possess. And even those aspects of morality—such as the evils of sexism—that vary across cultures are ultimately grounded in these moral foundations.

A very different misconception sometimes arises, often stemming from a religious or spiritual outlook. It’s that we start off as Noble Savages, as fundamentally good and moral beings. From this perspective, society and government and culture are corrupting influences, blotting out and overriding our natural and innate kindness.

This, too, is mistaken. We do have a moral core, but it is limited—Hobbes was closer to the truth than Rousseau. Relative to an adult, your typical toddler is selfish, parochial, and bigoted. I like the way Kingsley Amis once put it: “It was no wonder that people were so horrible when they started life as children.” Morality begins with the genes, but it doesn’t end there.

The entire interview is here.