Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Fact Checking.

Sunday, December 6, 2020

The Value of Not Knowing: Partisan Cue-Taking and Belief Updating of the Uninformed, the Ambiguous, and the Misinformed

Jianing Li & Michael W. Wagner
Journal of Communication, Volume 70, Issue 5, October 2020, Pages 646–669.

Abstract

The problem of a misinformed citizenry is often used to motivate research on misinformation and its corrections. However, researchers know little about how differences in informedness affect how well corrective information helps individuals develop knowledge about current events. We introduce a Differential Informedness Model that distinguishes between three types of individuals, that is, the uninformed, the ambiguous, and the misinformed, and establish their differences with two experiments incorporating multiple partisan cues and issues. Contrary to the common impression, the U.S. public is largely uninformed rather than misinformed of a wide range of factual claims verified by journalists. Importantly, we find that the success of belief updating after exposure to corrective information (via a fact-checking article) is dependent on the presence, the certainty, and the accuracy of one’s prior belief. Uninformed individuals are more likely to update their beliefs than misinformed individuals after exposure to corrective information. Interestingly, the ambiguous individuals, regardless of whether their uncertain guesses were correct, do not differ from uninformed individuals with respect to belief updating.

From the Discussion Section

First, and contrary to the impression that many citizens are misinformed, the majority of our respondents are uninformed of a wide range of claims important enough to be verified by journalists. Only a small group of respondents hold confident, inaccurate beliefs.  This builds on the work of Pasek et al. (2015) by distinguishing between the uninformed, who admit that they “don’t know,” the ambiguous, who take a guess with varying degrees of accuracy, and the misinformed, who hold steadfast false beliefs. In the current environment where concerns over misinformation often lead to heightened attention to belief accuracy, our findings highlight the necessity to bridge between work on political ignorance and misperception and the benefit of leveraging belief accuracy, belief presence and belief certainty to better assess public informedness.

(emphasis added)

Thursday, December 5, 2019

How Misinformation Spreads--and Why We Trust It

Cailin O'Connor and James Owen Weatherall
Scientific American
Originally posted September 2019

Here is an excerpt:

Many communication theorists and social scientists have tried to understand how false beliefs persist by modeling the spread of ideas as a contagion. Employing mathematical models involves simulating a simplified representation of human social interactions using a computer algorithm and then studying these simulations to learn something about the real world. In a contagion model, ideas are like viruses that go from mind to mind.

You start with a network, which consists of nodes, representing individuals, and edges, which represent social connections. You seed an idea in one “mind” and see how it spreads under various assumptions about when transmission will occur.
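The model described above can be sketched in a few lines of code. This is a minimal illustrative simulation, not the authors' actual model: the network is a simple random graph, and the function names and parameter values (`edge_prob`, `transmit_prob`, and so on) are assumptions chosen for the example.

```python
import random

def make_network(n_nodes, edge_prob, rng):
    """Build a random undirected network: nodes are individuals, and each
    pair is connected by an edge (a social tie) with probability edge_prob."""
    neighbors = {node: set() for node in range(n_nodes)}
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if rng.random() < edge_prob:
                neighbors[i].add(j)
                neighbors[j].add(i)
    return neighbors

def simulate_contagion(neighbors, seed_node, transmit_prob, rounds, rng):
    """Seed an idea in one 'mind', then in each round let every believer
    pass it to each non-believing neighbor with probability transmit_prob.
    Returns the set of nodes holding the idea at the end."""
    believers = {seed_node}
    for _ in range(rounds):
        newly_convinced = set()
        for node in believers:
            for nb in neighbors[node]:
                if nb not in believers and rng.random() < transmit_prob:
                    newly_convinced.add(nb)
        believers |= newly_convinced
    return believers

rng = random.Random(42)  # fixed seed so the run is reproducible
net = make_network(n_nodes=200, edge_prob=0.03, rng=rng)
spread = simulate_contagion(net, seed_node=0, transmit_prob=0.2, rounds=10, rng=rng)
print(f"{len(spread)} of 200 nodes hold the idea after 10 rounds")
```

Varying `edge_prob` and `transmit_prob` shows the qualitative behavior the article relies on: below a threshold the idea dies out near its seed, while above it the idea saturates the network, regardless of whether the idea is true.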

Contagion models are extremely simple but have been used to explain surprising patterns of behavior, such as the epidemic of suicide that reportedly swept through Europe after publication of Goethe's The Sorrows of Young Werther in 1774 or when dozens of U.S. textile workers in 1962 reported suffering from nausea and numbness after being bitten by an imaginary insect. They can also explain how some false beliefs propagate on the Internet.

Before the last U.S. presidential election, an image of a young Donald Trump appeared on Facebook. It included a quote, attributed to a 1998 interview in People magazine, saying that if Trump ever ran for president, it would be as a Republican because the party is made up of “the dumbest group of voters.” Although it is unclear who “patient zero” was, we know that this meme passed rapidly from profile to profile.

The meme's veracity was quickly evaluated and debunked. The fact-checking Web site Snopes reported that the quote was fabricated as early as October 2015. But as with the tomato hornworm, these efforts to disseminate truth did not change how the rumors spread. One copy of the meme alone was shared more than half a million times. As new individuals shared it over the next several years, their false beliefs infected friends who observed the meme, and they, in turn, passed the false belief on to new areas of the network.

This is why many widely shared memes seem to be immune to fact-checking and debunking. Each person who shared the Trump meme simply trusted the friend who had shared it rather than checking for themselves.

Putting the facts out there does not help if no one bothers to look them up. It might seem like the problem here is laziness or gullibility—and thus that the solution is merely more education or better critical thinking skills. But that is not entirely right.

Sometimes false beliefs persist and spread even in communities where everyone works very hard to learn the truth by gathering and sharing evidence. In these cases, the problem is not unthinking trust. It goes far deeper than that.
