Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Group Behaviors. Show all posts

Friday, December 15, 2017

The Vortex

Oliver Burkeman
The Guardian
Originally posted November 30, 2017

Here is an excerpt:

I realise you don’t need me to tell you that something has gone badly wrong with how we discuss controversial topics online. Fake news is rampant; facts don’t seem to change the minds of those in thrall to falsehood; confirmation bias drives people to seek out only the information that bolsters their views, while dismissing whatever challenges them. (In the final three months of the 2016 presidential election campaign, according to one analysis by Buzzfeed, the top 20 fake stories were shared more online than the top 20 real ones: to a terrifying extent, news is now more fake than not.) Yet, to be honest, I’d always assumed that the problem rested solely on the shoulders of other, stupider, nastier people. If you’re not the kind of person who makes death threats, or uses misogynistic slurs, or thinks Hillary Clinton’s campaign manager ran a child sex ring from a Washington pizzeria – if you’re a basically decent and undeluded sort, in other words – it’s easy to assume you’re doing nothing wrong.

But this, I am reluctantly beginning to understand, is self-flattery. One important feature of being trapped in the Vortex, it turns out, is the way it looks like everyone else is trapped in the Vortex, enslaved by their anger and delusions, obsessed with point-scoring and insult-hurling instead of with establishing the facts – whereas you’re just speaking truth to power. Yet in reality, when it comes to the divisive, depressing, energy-sapping nightmare that is modern online political debate, it’s like the old line about road congestion: you’re not “stuck in traffic”. You are the traffic.

The article is here.

Tuesday, September 26, 2017

The Influence of War on Moral Judgments about Harm

Hanne M Watkins and Simon M Laham


How does war influence moral judgments about harm? While the general rule is “thou shalt not kill,” war appears to provide an unfortunately common exception to the moral prohibition on intentional harm. In three studies (N = 263, N = 557, N = 793), we quantify the difference in moral judgments across peace and war contexts, and explore two possible explanations for the difference. Taken together, the findings of the present studies have implications for moral psychology researchers who use war-based scenarios to study broader cognitive or affective processes. If the war context changes judgments of moral scenarios by triggering group-based reasoning or altering the perceived structure of the moral event, using such scenarios to make “decontextualized” claims about moral judgment may not be warranted.

Here is part of the discussion.

A number of researchers have begun to investigate how social contexts may influence moral judgment, whether those social contexts are grounded in groups (Carnes et al., 2015; Ellemers & van den Bos, 2009) or relationships (Fiske & Rai, 2014; Simpson, Laham, & Fiske, 2015). The war context is another specific context which influences moral judgments: in the present study we found that the intergroup nature of war influenced people’s moral judgments about harm in war – even if they belonged to neither of the two groups actually at war – and that the usually robust difference between switch and footbridge scenarios was attenuated in the war context. One implication of these findings is that some caution may be warranted when using war-based scenarios for studying morality in general. As mentioned in the introduction, scenarios set in war are often used in the study of broad domains or general processes of judgment (e.g., Graham et al., 2009; Phillips & Young, 2011; Piazza et al., 2013). Given the interaction of war context with intergroup considerations and with the construed structure of the moral event in the present studies, researchers are well advised to avoid making generalizations to morality writ large on the basis of war-related scenarios (see also Bauman, McGraw, Bartels, & Warren, 2014; Bloom, 2011).

The preprint is here.

Friday, September 22, 2017

I Lie? We Lie! Why? Experimental Evidence on a Dishonesty Shift in Groups

Martin G. Kocher, Simeon Schudy, and Lisa Spantig
CESifo Working Paper Series No. 6008.


Unethical behavior such as dishonesty, cheating and corruption occurs frequently in organizations or groups. Recent experimental evidence suggests that there is a stronger inclination to behave immorally in groups than individually. We ask if this is the case, and if so, why. Using a parsimonious laboratory setup, we study how individual behavior changes when deciding as a group member. We observe a strong dishonesty shift. This shift is mainly driven by communication within groups and turns out to be independent of whether group members face payoff commonality or not (i.e., whether other group members benefit from one’s lie). Group members come up with and exchange more arguments for being dishonest than for complying with the norm of honesty. Thereby, group membership shifts the perception of the validity of the honesty norm and of its distribution in the population.

The article is here.

Wednesday, April 26, 2017

Moral judging helps people cooperate better in groups

Science Blog
Originally posted April 7, 2017

Here is an excerpt:

“Generally, people think of moral judgments negatively,” Willer said. “But they are a critical means for encouraging good behavior in society.”

Researchers also found that the groups who were allowed to make positive or negative judgments of each other were more trusting and generous toward each other.

In addition, the levels of cooperation in such groups were found to be comparable with groups where monetary punishments were used to promote collaboration within the group, according to the study, titled “The Enforcement of Moral Boundaries Promotes Cooperation and Prosocial Behavior in Groups.”

The power of social approval

The idea that moral judgments are fundamental to social order has been around since the late 19th century. But most existing research has looked at moral reasoning and judgments as an internal psychological process.

Few studies so far have examined how costless expressions of liking or disapproval can affect individual behavior in groups, and none of these studies investigated how moral judgments compare with monetary sanctions, which have been shown to lead to increased cooperation as well, Willer said.

The article is here.

Sunday, February 5, 2017

Group-focused morality is associated with limited conflict detection and resolution capacity: Neuroanatomical evidence

Nash, Kyle, Baumgartner, Thomas, & Knoch, Daria
Biological Psychology
Volume 123, February 2017, Pages 235–240


Group-focused moral foundations (GMFs) – moral values that help protect the group’s welfare – sharply divide conservatives from liberals and the religiously devout from non-believers. However, there is little evidence about what drives this divide. Moral foundations theory and the model of motivated social cognition both associate group-focused moral foundations with differences in conflict detection and resolution capacity, but in opposing directions. Individual differences in conflict detection and resolution implicate specific neuroanatomical differences. Examining neuroanatomy thus affords an objective and unbiased opportunity to contrast these influential theories. Here, we report that increased adherence to group-focused moral foundations was strongly associated (whole-brain corrected) with reduced gray matter volume in key regions of the conflict detection and resolution system (anterior cingulate cortex and lateral prefrontal cortex). Because reduced gray matter is reliably associated with reduced neural and cognitive capacity, these findings support the idea outlined in the model of motivated social cognition that belief in group-focused moral values is associated with reduced conflict detection and resolution capacity.

The article is here.

Thursday, January 5, 2017

To Make a Team More Effective, Find Their Commonalities

David DeSteno
Harvard Business Review
December 12, 2016

Here is an excerpt:

When it comes to empathy and compassion, the most powerful tool is a sense of similarity – a belief that people’s interests are joined and, thus, that they’re all on the same team and will benefit from supporting each other. Consider an example from the First World War. British and German troops were fighting a long, bloody battle in the trenches outside Ypres, Belgium. But on Christmas Eve, the British began to see their foes light candles and sing familiar carols. Soon, these men, who had previously been trying to kill each other, came out to greet one another, share stories, and celebrate the holiday together. For a brief period, they re-categorized themselves as members of the same group, in this case defined by religion, and felt a new camaraderie.

You can achieve a similar effect by emphasizing or introducing even less significant similarities. For example, Claremont McKenna’s Piercarlo Valdesolo and I conducted an experiment in which we had participants tap their hands in synch — or not in synch — with another person, who was later unfairly stuck with an onerous assignment. Half of the people who had tapped in unison with their partners offered to help with the task, compared with only 18% of those who were out of synch. The in-synch tappers reported not only feeling more similar to the strangers with whom they’d been paired, but also more compassion for them, and those two measures increased in tandem.

The article is here.

Wednesday, December 7, 2016

Do conservatives value ‘moral purity’ more than liberals?

Kate Johnson and Joe Hoover
The Conversation
Originally posted November 21, 2016

Here is an excerpt:

Our results were remarkably consistent with our first study. When people thought the person they were being partnered with did not share their purity concerns, they tended to avoid them. And, when people thought their partner did share their purity concerns, they wanted to associate with them.

As on Twitter, people were much more likely to associate with the other person when they had similar responses to the moral purity scenarios and to avoid them when they had dissimilar responses. And this pattern of responding was much stronger for purity concerns than for similarities or differences on any other moral concern, regardless of people’s religious and political affiliation and the religious and political affiliation they attributed to their partner.

There are many examples of how moral purity concerns are woven deeply into the fabric of social life. For example, have you noticed that when we derogate another person or social group we often rely on adjectives like “dirty,” and “disgusting”? Whether we are talking about “dirty hippies” or an entire class of “untouchables” or “deplorables,” we tend to signal inferiority and separation through moral terms grounded in notions of bodily and spiritual purity.

The article is here.

Wednesday, November 30, 2016

Human brain is predisposed to negative stereotypes, new study suggests

Hannah Devlin
The Guardian
Originally posted November 1, 2016

The human brain is predisposed to learn negative stereotypes, according to research that offers clues as to how prejudice emerges and spreads through society.

The study found that the brain responds more strongly to information about groups who are portrayed unfavourably, adding weight to the view that the negative depiction of ethnic or religious minorities in the media can fuel racial bias.

Hugo Spiers, a neuroscientist at University College London, who led the research, said: “The newspapers are filled with ghastly things people do ... You’re getting all these news stories and the negative ones stand out. When you look at Islam, for example, there’s so many more negative stories than positive ones and that will build up over time.”

The article is here.

Monday, February 15, 2016

If You’re Loyal to a Group, Does It Compromise Your Ethics?

By Francesca Gino
Harvard Business Review
Originally posted January 06, 2016

Here are two excerpts:

Most of us feel loyalty, whether to our clan, our comrades, an organization, or a cause. These loyalties are often important aspects of our social identity. Once a necessity for survival and propagation of the species, loyalty to one’s in-group is deeply rooted in human evolution.

But the incidents of wrongdoing that capture the headlines make it seem like loyalty is all too often a bad thing, corrupting many aspects of our personal and professional lives. My recent research, conducted in collaboration with Angus Hildreth of the University of California, Berkeley and Max Bazerman of Harvard Business School, suggests that this concern about loyalty is largely misplaced. In fact, we found loyalty to a group can increase, rather than decrease, honest behavior.


As our research shows, loyalty can be a driver of good behavior, but when competition among groups is high, it can lead us to behave unethically. When we are part of a group of loyal members, traits associated with loyalty — such as honor, honesty, and integrity — are very salient in our minds. But when loyalty seems to demand a different type of goal, such as competing with other groups and winning at any cost, behaving ethically becomes a less important goal.

The article is here.

Wednesday, October 29, 2014

Cooperation shapes abilities of the human brain

Swiss National Science Foundation
Originally published August 30, 2014

Here is an excerpt:

For several decades many characteristics originally classed as being specific to humans have been seen in a new light. This exclusive interpretation has given way to the view that our ability to plan and remember does not differentiate us from other great apes. In fact, the opposite is true. These cognitive abilities, along with our use of tools, link us to our closest biological relatives. And yet there is a substantial difference to which reference is frequently made when it comes to explaining the unique nature of humans’ cognitive and cultural skills.

The entire press release is here.

Monday, August 18, 2014

5 Reasons Ethical Culture Doesn’t Just Happen

By Linda Fisher Thornton
Leading in Context
Originally posted August 6, 2014

Don’t assume that an ethical culture will just happen in your workplace. Even if you are a good leader, ethical culture is a delicate thing, requiring intentional positive leadership and daily tending. It requires more than good leadership, more than trust building, and more than good hiring.

Why does building an ethical culture require so much more than good leadership? Ethical culture is a system of systems, and just putting in good leadership, trust-building and good hiring doesn’t make it healthy.

The entire blog post is here.

Tuesday, July 1, 2014

When good people do bad things

Being in a group makes some people lose touch with their personal moral beliefs, researchers find

By Anne Trafton
MIT News
Originally posted June 12, 2014

When people get together in groups, unusual things can happen — both good and bad. Groups create important social institutions that an individual could not achieve alone, but there can be a darker side to such alliances: Belonging to a group makes people more likely to harm others outside the group.

“Although humans exhibit strong preferences for equity and moral prohibitions against harm in many contexts, people’s priorities change when there is an ‘us’ and a ‘them,’” says Rebecca Saxe, an associate professor of cognitive neuroscience at MIT. “A group of people will often engage in actions that are contrary to the private moral standards of each individual in that group, sweeping otherwise decent individuals into ‘mobs’ that commit looting, vandalism, even physical brutality.”

Several factors play into this transformation. When people are in a group, they feel more anonymous, and less likely to be caught doing something wrong. They may also feel a diminished sense of personal responsibility for collective actions.

The entire article is here.

Sunday, January 12, 2014

Collaboration Can Breed Overconfidence

Minds for Business
Psychological Science
Originally published November 20, 2013

The researchers found that people working with a partner were more confident in their estimates and significantly less willing to take outside advice. The pairs’ guesses were marginally more accurate than those of the individuals at first.

But after revision (or lack thereof), that difference was gone. Even the combined judgments of four people yielded no better results than those of two or three. Finally, the researchers found that had the pairs yielded to outside input, their estimates would have been significantly more accurate. Their confidence was costly.

The entire article is here.

Sunday, June 9, 2013

Studying Childhood Morality via Social Groups and Cognition

Rhodes, M. (in press). Naive theories of social groups. Child Development.

Here are some excerpts from this paper regarding the importance of studying moral development.

Yet, despite preschoolers’ general commitment to fairness, the possibility that children view people as having special moral obligations to their own group members cannot be entirely ruled out. This possibility is consistent with several theoretical accounts of morality proposed by social and cultural psychologists (Cohen, Montoya, & Insko, 2006; Dovidio, 1984; Haidt & Joseph, 2007; Haidt & Kesebir, 2010; Levine, Cassidy, Brazier, & Reicher, 2002; Levine & Thompson, 2004), and there is recent developmental data that appear consistent with this possibility (Castelli, De Amicis, & Sherman, 2007; Rhodes & Brickman, 2011). Thus, this remains an important area for future work.


Whereas the majority of research in this area has examined how children appeal to individual mental states to make these predictions, there has recently been increasing emphasis on understanding how children make these predictions by reference to social causes that extend beyond the individual, including social categories, norms, and morality (Hirschfeld, 1996; Olson & Dweck, 2008; Wellman & Miller, 2008). This emphasis—on considering children’s naïve sociology along with their naïve psychology—is particularly important given that preschool-age children often weight the causal features specified by naïve sociology (e.g., categories, norms) more heavily than individual mental states (e.g., traits, desires) to predict individual action (Berndt & Heller, 1986; Biernat, 1991; Diesendruck & haLevi, 2006; Kalish, 2002; Kalish & Shiverick, 2004; Lawson & Kalish, 2006; Rhodes & Gelman, 2008; Taylor, 1996).

The entire paper is here.