Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label political bias.

Wednesday, November 22, 2023

The case for partisan motivated reasoning

Williams, D.
Synthese 202, 89 (2023).

Abstract

A large body of research in political science claims that the way in which democratic citizens think about politics is motivationally biased by partisanship. Numerous critics argue that the evidence for this claim is better explained by theories in which party allegiances influence political cognition without motivating citizens to embrace biased beliefs. This article has three aims. First, I clarify this criticism, explain why common responses to it are unsuccessful, and argue that to make progress on this debate we need a more developed theory of the connections between group attachments and motivated reasoning. Second, I develop such a theory. Drawing on research on coalitional psychology and the social functions of beliefs, I argue that partisanship unconsciously biases cognition by generating motivations to advocate for party interests, which transform individuals into partisan press secretaries. Finally, I argue that this theory offers a superior explanation of a wide range of relevant findings than purely non-motivational theories of political cognition.

My summary:

Partisan motivated reasoning is the tendency for people to seek out and interpret information in a way that confirms their existing political beliefs. This is a complex phenomenon, but Williams argues that it can be explained by the combination of two factors:

  1. Group attachments: People are strongly motivated to defend and promote the interests of their social groups, including their political parties.
  2. Motivated cognition: People are motivated to believe things that are true, but they are also motivated to believe things that are consistent with their values and goals.

Williams argues that partisan motivated reasoning is a natural and predictable consequence of these two factors. When people are motivated to defend and promote their political party, they will seek out and interpret information in ways that confirm their existing beliefs, and they will downplay or ignore information that contradicts those beliefs.

Williams supports his argument with several lines of evidence, including studies showing that people are more likely to believe information consistent with their political beliefs, even when that information is false. He also shows that people preferentially seek out and consume information from sources they agree with politically.

Williams concludes by arguing that partisan motivated reasoning is a serious problem for democracy. It can lead people to make decisions that are not in their own best interests, and it can make productive conversations about political issues difficult.

Thursday, June 15, 2023

Moralization and extremism robustly amplify myside sharing

Marie, A., Altay, S., et al.
PNAS Nexus, Volume 2, Issue 4, April 2023.

Abstract

We explored whether moralization and attitude extremity may amplify a preference to share politically congruent (“myside”) partisan news and what types of targeted interventions may reduce this tendency. Across 12 online experiments (N = 6,989), we examined decisions to share news touching on the divisive issues of gun control, abortion, gender and racial equality, and immigration. Myside sharing was systematically observed and was consistently amplified when participants (i) moralized and (ii) were attitudinally extreme on the issue. The amplification of myside sharing by moralization also frequently occurred above and beyond that of attitude extremity. These effects generalized to both true and fake partisan news. We then examined a number of interventions meant to curb myside sharing by manipulating (i) the audience to which people imagined sharing partisan news (political friends vs. foes), (ii) the anonymity of the account used (anonymous vs. personal), (iii) a message warning against the myside bias, and (iv) a message warning against the reputational costs of sharing “mysided” fake news coupled with an interactive rating task. While some of those manipulations slightly decreased sharing in general and/or the size of myside sharing, the amplification of myside sharing by moral attitudes was consistently robust to these interventions. Our findings regarding the robust exaggeration of selective communication by morality and extremism offer important insights into belief polarization and the spread of partisan and false information online.

General discussion

Across 12 experiments (N = 6,989), we explored US participants’ intentions to share true and fake partisan news on 5 controversial issues—gun control, abortion, racial equality, sex equality, and immigration—in social media contexts. Our experiments consistently show that people have a strong sharing preference for politically congruent news—Democrats even more so than Republicans. They also demonstrate that this “myside” sharing is magnified when respondents see the issue as being of “absolute moral importance”, and when they have an extreme attitude on the issue. Moreover, issue moralization was found to amplify myside sharing above and beyond attitude extremity in the majority of the studies. Expanding prior research on selective communication, our work provides a clear demonstration that citizens’ myside communicational preference is powerfully amplified by their moral and political ideology (18, 19, 39–43).

By examining this phenomenon across multiple experiments varying numerous parameters, we demonstrated the robustness of myside sharing and of its amplification by participants’ issue moralization and attitude extremity. First, those effects were consistently observed on both true (Experiments 1, 2, 3, 5a, 6a, 7, and 10) and fake (Experiments 4, 5b, 6b, 8, 9, and 10) news stories and across distinct operationalizations of our outcome variable. Moreover, myside sharing and its amplification by issue moralization and attitude extremity were systematically observed despite multiple manipulations of the sharing context. Namely, those effects were observed whether sharing was done from one's personal or an anonymous social media account (Experiments 5a and 5b), whether the audience was made of political friends or foes (Experiments 6a and 6b), and whether participants first saw intervention messages warning against the myside bias (Experiments 7 and 8), or an interactive intervention warning against the reputational costs of sharing mysided falsehoods (Experiments 9 and 10).