Pennycook, G., & Rand, D. G. (2020, November 18).
We synthesize a burgeoning literature investigating why people believe and share “fake news” and other misinformation online. Surprisingly, the evidence contradicts a common narrative whereby partisanship and politically motivated reasoning explain failures to discern truth from falsehood. Instead, poor truth discernment is linked to a lack of careful reasoning and relevant knowledge, and to the use of familiarity and other heuristics. Furthermore, there is a substantial disconnect between what people believe and what they will share on social media. This dissociation is largely driven by inattention, rather than purposeful sharing of misinformation. As a result, effective interventions can nudge social media users to think about accuracy, and can leverage crowdsourced veracity ratings to improve social media ranking algorithms.
From the Discussion
Indeed, recent research shows that a simple accuracy nudge intervention – specifically, having participants rate the accuracy of a single politically neutral headline (ostensibly as part of a pretest) prior to making judgments about social media sharing – improves the extent to which people discern between true and false news content when deciding what to share online in survey experiments. This approach has also been successfully deployed in a large-scale field experiment on Twitter, in which messages asking users to rate the accuracy of a random headline were sent to thousands of accounts that had recently shared links to misinformation sites. This subtle nudge significantly increased the quality of the content they subsequently shared; see Figure 3B. Furthermore, survey experiments have shown that asking participants to explain how they know whether a headline is true or false before sharing it increases sharing discernment, and having participants rate accuracy at the time of encoding protects against familiarity effects.