Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Extremism.

Thursday, June 15, 2023

Moralization and extremism robustly amplify myside sharing

Marie, A., Altay, S., et al.
PNAS Nexus, Volume 2, Issue 4, April 2023.

Abstract

We explored whether moralization and attitude extremity may amplify a preference to share politically congruent (“myside”) partisan news and what types of targeted interventions may reduce this tendency. Across 12 online experiments (N = 6,989), we examined decisions to share news touching on the divisive issues of gun control, abortion, gender and racial equality, and immigration. Myside sharing was systematically observed and was consistently amplified when participants (i) moralized and (ii) were attitudinally extreme on the issue. The amplification of myside sharing by moralization also frequently occurred above and beyond that of attitude extremity. These effects generalized to both true and fake partisan news. We then examined a number of interventions meant to curb myside sharing by manipulating (i) the audience to which people imagined sharing partisan news (political friends vs. foes), (ii) the anonymity of the account used (anonymous vs. personal), (iii) a message warning against the myside bias, and (iv) a message warning against the reputational costs of sharing “mysided” fake news coupled with an interactive rating task. While some of those manipulations slightly decreased sharing in general and/or the size of myside sharing, the amplification of myside sharing by moral attitudes was consistently robust to these interventions. Our findings regarding the robust exaggeration of selective communication by morality and extremism offer important insights into belief polarization and the spread of partisan and false information online.

General discussion

Across 12 experiments (N = 6,989), we explored US participants’ intentions to share true and fake partisan news on 5 controversial issues—gun control, abortion, racial equality, sex equality, and immigration—in social media contexts. Our experiments consistently show that people have a strong sharing preference for politically congruent news—Democrats even more so than Republicans. They also demonstrate that this “myside” sharing is magnified when respondents see the issue as being of “absolute moral importance”, and when they have an extreme attitude on the issue. Moreover, issue moralization was found to amplify myside sharing above and beyond attitude extremity in the majority of the studies. Expanding prior research on selective communication, our work provides a clear demonstration that citizens’ myside communicational preference is powerfully amplified by their moral and political ideology (18, 19, 39–43).

By examining this phenomenon across multiple experiments varying numerous parameters, we demonstrated the robustness of myside sharing and of its amplification by participants’ issue moralization and attitude extremity. First, those effects were consistently observed on both true (Experiments 1, 2, 3, 5a, 6a, 7, and 10) and fake (Experiments 4, 5b, 6b, 8, 9, and 10) news stories and across distinct operationalizations of our outcome variable. Moreover, myside sharing and its amplification by issue moralization and attitude extremity were systematically observed despite multiple manipulations of the sharing context. Namely, those effects were observed whether sharing was done from one's personal or an anonymous social media account (Experiments 5a and 5b), whether the audience was made of political friends or foes (Experiments 6a and 6b), and whether participants first saw intervention messages warning against the myside bias (Experiments 7 and 8), or an interactive intervention warning against the reputational costs of sharing mysided falsehoods (Experiments 9 and 10).
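The amplification effects described above are interaction effects: "myside sharing" is the gap in sharing between politically congruent and incongruent headlines, and moralization or extremity widens that gap. As a purely illustrative sketch (not the authors' analysis code; the variable names, scales, and simulated numbers below are all assumptions), this is roughly how such an interaction could be tested on trial-level sharing data:

```python
# Illustrative sketch only: simulated data and a simple OLS with interaction
# terms; the real studies used their own measures and modeling choices.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "congruent": rng.integers(0, 2, n),      # 1 = headline matches participant's side
    "moralization": rng.integers(1, 8, n),   # self-rated moral importance of the issue
    "extremity": rng.integers(0, 4, n),      # folded (absolute) attitude extremity
})
# Simulated sharing intentions in which moralization amplifies myside sharing
# more strongly than extremity does (parameters are made up for illustration).
df["share"] = (
    2.0
    + 1.0 * df.congruent
    + 0.30 * df.congruent * df.moralization
    + 0.15 * df.congruent * df.extremity
    + rng.normal(0, 1, n)
)

fit = smf.ols("share ~ congruent * moralization + congruent * extremity", df).fit()
# The congruent:moralization coefficient is the "amplification by moralization";
# its size relative to congruent:extremity mirrors the above-and-beyond claim.
print(fit.params[["congruent:moralization", "congruent:extremity"]])
```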

Sunday, March 6, 2022

Investigating the role of group-based morality in extreme behavioral expressions of prejudice

Hoover, J., Atari, M., et al. 
Nat Commun 12, 4585 (2021). 
https://doi.org/10.1038/s41467-021-24786-2

Abstract

Understanding motivations underlying acts of hatred is essential for developing strategies to prevent such extreme behavioral expressions of prejudice (EBEPs) against marginalized groups. In this work, we investigate the motivations underlying EBEPs as a function of moral values. Specifically, we propose EBEPs may often be best understood as morally motivated behaviors grounded in people’s moral values and perceptions of moral violations. As evidence, we report five studies that integrate spatial modeling and experimental methods to investigate the relationship between moral values and EBEPs. Our results, from these U.S.-based studies, suggest that moral values oriented around group preservation are predictive of the county-level prevalence of hate groups and associated with the belief that extreme behavioral expressions of prejudice against marginalized groups are justified. Additional analyses suggest that the association between group-based moral values and EBEPs against outgroups can be partly explained by the belief that these groups have done something morally wrong.

From the Discussion

Notably, Study 5 provided tentative evidence that binding values may be a particularly important risk factor for the perceived justification of EBEPs. Participants who were experimentally manipulated to believe an outgroup had done something immoral were more likely to perceive acts of hate against that outgroup as justified when they felt that the outgroup’s behavior was more morally wrong. However, this association between perceived moral wrongness (PMW) and the justification of hate acts was strongly moderated by people’s binding values, but not by their individualizing values. Ultimately, comparing people high on binding values to people high on individualizing values, we found that the average causal mediation effect in the domain of binding values was more than six times the average causal mediation effect in the domain of individualizing values. In other words, our results suggest that if two people see an outgroup’s binding values violation as equally morally wrong, but one of them has higher binding values, the person with higher binding values will see EBEPs against the outgroup as more justified. However, no such difference was observed in the domain of individualizing values.

Accordingly, our results suggest that people who attribute moral violations to an outgroup may be at higher risk for justifying, or perhaps even expressing, extreme prejudice toward outgroups; however, our results also suggest that people who prioritize the binding values may be particularly susceptible to this dynamic when they perceive a violation of ingroup loyalty, respect for authority, and physical or spiritual purity. In this sense, our findings are consistent with the hypothesis that acts of hate—a class of behaviors of which many have received their own special legal designation as particularly heinous crimes4—are partly motivated by individuals’ moral beliefs. This view is well-grounded in current understandings of the relationship between morality and acts of extremism or violence.
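To make the moderated-mediation logic in the excerpt concrete, here is a minimal, hypothetical sketch (not the authors' pipeline; the variable names, effect sizes, and simulated data are all assumptions) of how a conditional indirect effect of the immorality manipulation on justification, running through PMW and moderated by binding values, could be computed:

```python
# Hypothetical sketch: manipulation -> PMW -> justification, with the
# PMW -> justification path moderated by binding values. Numbers are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
treat = rng.integers(0, 2, n)                     # immoral-outgroup manipulation
binding = rng.normal(0, 1, n)                     # standardized binding values
pmw = 0.8 * treat + rng.normal(0, 1, n)           # perceived moral wrongness
justify = 0.2 * pmw + 0.5 * pmw * binding + rng.normal(0, 1, n)
df = pd.DataFrame(dict(treat=treat, binding=binding, pmw=pmw, justify=justify))

a = smf.ols("pmw ~ treat", df).fit().params["treat"]           # path a
out = smf.ols("justify ~ pmw * binding + treat", df).fit()     # moderated path b
for label, z in [("low binding (-1 SD)", -1.0), ("high binding (+1 SD)", 1.0)]:
    b = out.params["pmw"] + out.params["pmw:binding"] * z
    print(label, "conditional indirect effect:", round(a * b, 3))
```

In the paper's terms, a much larger indirect effect among people high on binding values than among people high on individualizing values is what the "more than six times" comparison refers to.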

Sunday, December 13, 2020

Polarization and extremism emerge from rational choice

Kvam, P. D., & Baldwin, M. 
(2020, October 21).

Abstract

Polarization is often thought to be the product of biased information search, motivated reasoning, or other psychological biases. However, polarization and extremism can still occur in the absence of any bias or irrational thinking. In this paper, we show that polarization occurs among groups of decision makers who are implementing rational choice strategies that maximize decision efficiency. This occurs because extreme information enables decision makers to make up their minds and stop considering new information, whereas moderate information is unlikely to trigger a decision. Furthermore, groups of decision makers will generate extremists -- individuals who hold strong views despite being uninformed and impulsive. In re-analyses of seven previous empirical studies on both perceptual and preferential choice, we show that both polarization and extremism manifest across a wide variety of choice paradigms. We conclude by offering theoretically-motivated interventions that could reduce polarization and extremism by altering the incentives people have when gathering information.

Conclusions

In a decision scenario that incentivizes a trade-off between time and decision quality, a population of rational decision makers will become polarized. In this paper, we have shown this through simulations and a mathematical proof (supplementary materials), and demonstrated it empirically in seven studies. This leads us to an unfortunate but unavoidable conclusion that decision making is a bias-inducing process by which participants gather representative information from their environment and, through the decision rules they implement, distort it toward the extremes. Such a process also generates extremists, who hold extreme views and carry undue influence over cultural discourse (Navarro et al., 2018) despite being relatively uninformed and impulsive (low thresholds; Kim & Lee, 2011). We have suggested several avenues for interventions, foremost among them providing incentives favoring estimation or judgments as opposed to incentives for timely decision making. Our hope is that future work testing and implementing these interventions will reduce the prevalence of polarization and extremism across social domains currently occupied by decision makers.
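The mechanism described in the abstract and conclusion lends itself to a small simulation. The sketch below is an illustration under assumed parameters, not the authors' model code: an accumulator samples evidence with a modestly positive true mean and stops at the first boundary crossing, so final "views" pile up at the two extremes, and lowering the threshold yields faster, less-informed decisions that are just as extreme.

```python
# Illustrative simulation only (assumed parameters, not the paper's code).
import numpy as np

rng = np.random.default_rng(2)

def decide(true_mean=0.1, noise=1.0, threshold=3.0, max_samples=500):
    """Accumulate noisy evidence until it crosses +threshold or -threshold;
    return the stopping state (the view adopted) and the samples consumed."""
    total = 0.0
    for t in range(1, max_samples + 1):
        total += rng.normal(true_mean, noise)
        if abs(total) >= threshold:
            return total, t
    return total, max_samples

for threshold in (3.0, 1.0):                      # lower threshold = more impulsive
    views, seen = zip(*(decide(threshold=threshold) for _ in range(5000)))
    views, seen = np.array(views), np.array(seen)
    print(f"threshold={threshold}: mean samples seen={seen.mean():5.1f}, "
          f"ended at +bound={np.mean(views >= threshold):.2f}, "
          f"ended at -bound={np.mean(views <= -threshold):.2f}")
```

Even though every sample comes from the same mildly positive distribution, almost no one ends up holding a moderate view, and the low-threshold deciders reach a bound after only a handful of samples, mirroring the paper's "uninformed and impulsive" extremists.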

Sunday, October 6, 2019

Thinking Fast and Furious: Emotional Intensity and Opinion Polarization in Online Media

David Asker & Elias Dinas
Public Opinion Quarterly
Published: 09 September 2019
https://doi.org/10.1093/poq/nfz042

Abstract

How do online media increase opinion polarization? The “echo chamber” thesis points to the role of selective exposure to homogeneous views and information. Critics of this view emphasize the potential of online media to expand the ideological spectrum that news consumers encounter. Embedded in this discussion is the assumption that online media affect public opinion via the range of information they offer to users. We show that online media can induce opinion polarization even among users exposed to ideologically heterogeneous views, by heightening the emotional intensity of the content. Higher affective intensity provokes motivated reasoning, which in turn leads to opinion polarization. The results of an online experiment focusing on the comments section, a user-driven tool of communication whose effects on opinion formation remain poorly understood, show that participants randomly assigned to read an online news article with a user comments section subsequently express more extreme views on the topic of the article than a control group reading the same article without any comments. Consistent with expectations, this effect is driven by the emotional intensity of the comments, lending support to the idea that motivated reasoning is the mechanism behind this effect.

From the Discussion:

These results should not be taken as a challenge to the echo chamber argument, but rather as a complement to it. Selective exposure to desirable information and motivated rejection of undesirable information constitute separate mechanisms whereby online news audiences may develop more extreme views. Whereas there is already ample empirical evidence about the first mechanism, previous research on the second has been scant. Our contribution should thus be seen as an attempt to fill this gap.
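For readers who want the design in concrete terms, here is a minimal sketch of the comparison and the mediation-style check implied by the abstract; the variable names and simulated data are hypothetical, not the study's materials:

```python
# Hypothetical sketch: treatment effect of a comments section on opinion
# extremity, then the same model with emotional intensity added as a mediator.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
comments = rng.integers(0, 2, n)                      # 1 = article shown with comments
intensity = comments * rng.uniform(0.5, 1.5, n)       # emotional intensity of exposure
extremity = 0.6 * intensity + rng.normal(0, 1, n)     # folded post-treatment opinion
df = pd.DataFrame(dict(comments=comments, intensity=intensity, extremity=extremity))

total = smf.ols("extremity ~ comments", df).fit().params["comments"]
direct = smf.ols("extremity ~ comments + intensity", df).fit().params["comments"]
print("total effect of comments:", round(total, 3))
print("effect after adjusting for intensity:", round(direct, 3))  # should shrink
```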

Friday, August 18, 2017

Psychologists surveyed hundreds of alt-right supporters. The results are unsettling.

Brian Resnick
Vox.com
Originally posted August 15, 2017

Here is an excerpt:

The alt-right scores high on dehumanization measures

One of the starkest, darkest findings in the survey comes from a simple question: How evolved do you think other people are?

Kteily, the co-author on this paper, pioneered this new and disturbing way to measure dehumanization — the tendency to see others as being less than human. He simply shows study participants a (scientifically inaccurate) image of a human ancestor slowly learning how to stand on two legs and become fully human.

Participants are asked to rate where certain groups fall on this scale from 0 to 100. Zero is not human at all; 100 is fully human.

On average, alt-righters saw other groups as hunched-over proto-humans.

On average, they rated Muslims at a 55.4 (again, out of 100), Democrats at 60.4, black people at 64.7, Mexicans at 67.7, journalists at 58.6, Jews at 73, and feminists at 57. These groups appear as subhumans to those taking the survey. And what about white people? They were scored at a noble 91.8. (You can look through all the data here.)

The article is here.

Wednesday, January 28, 2015

Political Extremism Predicts Belief in Conspiracy Theories

By Jan-Willem van Prooijen, André P. M. Krouwel, & Thomas V. Pollet
Social Psychological and Personality Science, January 12, 2015

Abstract

Historical records suggest that the political extremes—at both the “left” and the “right”—substantially endorsed conspiracy beliefs about other-minded groups. The present contribution empirically tests whether extreme political ideologies, at either side of the political spectrum, are positively associated with an increased tendency to believe in conspiracy theories. Four studies conducted in the United States and the Netherlands revealed a quadratic relationship between strength of political ideology and conspiracy beliefs about various political issues. Moreover, participants’ belief in simple political solutions to societal problems mediated conspiracy beliefs among both left- and right-wing extremists. Finally, the effects described here were not attributable to general attitude extremity. Our conclusion is that political extremism and conspiracy beliefs are strongly associated due to a highly structured thinking style that is aimed at making sense of societal events.
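The “quadratic relationship” in the abstract simply means that conspiracy belief rises toward both ends of the ideological spectrum. Below is a minimal sketch of such a test, with made-up variable names and simulated data rather than the authors' measures:

```python
# Illustrative only: a positive squared-ideology coefficient captures the
# U-shape in which both political extremes endorse conspiracy beliefs more.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1200
ideology = rng.uniform(-3, 3, n)                      # centered left-right self-placement
conspiracy = 0.4 * ideology**2 + rng.normal(0, 1, n)  # simulated U-shaped endorsement
df = pd.DataFrame(dict(ideology=ideology, conspiracy=conspiracy))

fit = smf.ols("conspiracy ~ ideology + I(ideology ** 2)", df).fit()
print(fit.params)  # a positive I(ideology ** 2) term indicates the quadratic pattern
```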

The entire article is here.