Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Dual Process Theory.

Sunday, October 15, 2023

Bullshit blind spots: the roles of miscalibration and information processing in bullshit detection

Littrell, S., & Fugelsang, J. A.
(2023). Thinking & Reasoning
https://doi.org/10.1080/13546783.2023.2189163

Abstract

The growing prevalence of misleading information (i.e., bullshit) in society carries with it an increased need to understand the processes underlying many people’s susceptibility to falling for it. Here we report two studies (N = 412) examining the associations between one’s ability to detect pseudo-profound bullshit, confidence in one’s bullshit detection abilities, and the metacognitive experience of evaluating potentially misleading information. We find that people with the lowest (highest) bullshit detection performance overestimate (underestimate) their detection abilities and overplace (underplace) those abilities when compared to others. Additionally, people reported using both intuitive and reflective thinking processes when evaluating misleading information. Taken together, these results show that both highly bullshit-receptive and highly bullshit-resistant people are largely unaware of the extent to which they can detect bullshit and that traditional miserly processing explanations of receptivity to misleading information may be insufficient to fully account for these effects.


Here's my summary:

The authors argue that people have two main blind spots when it comes to detecting bullshit: miscalibration and information processing. Miscalibration is a mismatch between self-assessed and actual detection ability, and it runs in both directions: the least skilled detectors overestimate their abilities, while the most skilled underestimate theirs.
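As a rough illustration of these calibration measures, the toy Python sketch below uses made-up data (hypothetical scores and noise parameters, not the authors' materials) to compute miscalibration as self-estimated score minus actual score; grouping by performance level reproduces the blind-spot pattern in which low scorers come out positive (overestimation) and high scorers negative (underestimation):

```python
import numpy as np

# Hypothetical data for illustration only; not the study's measures or results.
rng = np.random.default_rng(0)
actual = rng.integers(0, 11, 200)  # actual bullshit-detection score, 0-10
# Self-estimates track performance only weakly, pulling everyone toward the middle.
estimated = np.clip(5 + 0.4 * actual + rng.normal(0, 1.5, 200), 0, 10)

miscalibration = estimated - actual  # positive = overestimate, negative = underestimate
low = actual <= np.percentile(actual, 25)
high = actual >= np.percentile(actual, 75)
print(f"low performers' mean miscalibration:  {miscalibration[low].mean():+.2f}")   # positive
print(f"high performers' mean miscalibration: {miscalibration[high].mean():+.2f}")  # negative
```

Overplacement ("better than others") would be computed the same way, but on estimated versus actual percentile ranks rather than raw scores.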

Information processing concerns how we evaluate potentially misleading claims. The common "miserly processing" account holds that people fall for bullshit because they judge it quickly and intuitively rather than reflectively. However, participants in these studies reported using both intuitive and reflective thinking when evaluating misleading information, so fast, lazy processing alone may not fully explain why people are taken in.

The authors also discuss some strategies for overcoming these blind spots. One strategy is to be aware of our own biases and limitations. We should also be critical of the information that we consume and take the time to evaluate evidence carefully.

Overall, the article provides a helpful framework for understanding the challenges of bullshit detection. It also offers some practical advice for overcoming these challenges.

Here are some additional tips for detecting bullshit:
  • Be skeptical of claims that seem too good to be true.
  • Look for evidence to support the claims that are being made.
  • Be aware of the speaker or writer's motives.
  • Ask yourself whether the claims make sense and whether they are consistent with what you already know.
  • If you're not sure whether something is bullshit, it's better to err on the side of caution and be skeptical.

Saturday, October 14, 2023

Overconfidently conspiratorial: Conspiracy believers are dispositionally overconfident and massively overestimate how much others agree with them

Pennycook, G., Binnendyk, J., & Rand, D. G. 
(2022, December 5). PsyArXiv

Abstract

There is a pressing need to understand belief in false conspiracies. Past work has focused on the needs and motivations of conspiracy believers, as well as the role of overreliance on intuition. Here, we propose an alternative driver of belief in conspiracies: overconfidence. Across eight studies with 4,181 U.S. adults, conspiracy believers not only relied more on intuition, but also overestimated their performance on numeracy and perception tests (i.e., were overconfident in their own abilities). This relationship with overconfidence was robust to controlling for analytic thinking, need for uniqueness, and narcissism, and was strongest for the most fringe conspiracies. We also found that conspiracy believers – particularly overconfident ones – massively overestimated (>4x) how much others agree with them: Although conspiracy beliefs were in the majority in only 12% of 150 conspiracies across three studies, conspiracy believers thought themselves to be in the majority 93% of the time.

Here is my summary:

The research found that people who believe in conspiracy theories are more likely to be overconfident in their own abilities and to overestimate how much others agree with them. This was true even when controlling for other factors, such as analytic thinking, need for uniqueness, and narcissism.

The researchers conducted a series of studies to test this hypothesis. In one study, conspiracy believers overestimated their performance on numeracy and perception tests. In another, they overestimated how much others agreed with them about a variety of topics, including climate change and the 2016 US presidential election.
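As a back-of-the-envelope illustration of what a ">4x" consensus overestimate means, here is a sketch with invented numbers (the study's actual values varied across its 150 conspiracies; only the aggregate ">4x" gap comes from the paper):

```python
# Invented numbers for illustration; only the aggregate ">4x" gap is from the paper.
perceived_agreement = 60.0  # a believer's estimate: "60% of people agree with me"
actual_agreement = 14.0     # share of respondents who actually agreed

print(f"perceived/actual agreement: {perceived_agreement / actual_agreement:.1f}x")  # ~4.3x
```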

The researchers suggest that overconfidence may help create and sustain conspiracy beliefs. Overconfident people are more likely to dismiss evidence that contradicts their beliefs and to seek out information that confirms them, producing a "filter bubble" in which they encounter only material that reinforces what they already believe.

Overconfidence may also lead believers to overestimate how widely their conspiracy beliefs are shared. That perceived consensus, in turn, makes the beliefs feel more secure and less worth questioning.

These findings have implications for understanding and countering the spread of conspiracy theories. Recognizing the role of overconfidence could inform interventions that both prevent people from falling for conspiracy theories in the first place and help current believers evaluate their beliefs more critically.

Monday, October 17, 2022

The Psychological Origins of Conspiracy Theory Beliefs: Big Events with Small Causes Amplify Conspiratorial Thinking

Vonasch, A., Dore, N., & Felicite, J.
(2022, January 20). PsyArXiv.
https://doi.org/10.31234/osf.io/3j9xg

Abstract

Three studies supported a new model of conspiracy theory belief: People are most likely to believe conspiracy theories that explain big, socially important events with smaller, intuitively unappealing official explanations. Two experiments (N = 577) used vignettes about fictional conspiracy theories and measured online participants’ beliefs in the official causes of the events and the corresponding conspiracy theories. We experimentally manipulated the size of the event and its official cause. Larger events and small official causes decreased belief in the official cause and this mediated increased belief in the conspiracy theory, even after controlling for individual differences in paranoia and distrust. Study 3 established external validity and generalizability by coding the 78 most popular conspiracy theories on Reddit. Nearly all (96.7%) popular conspiracy theories explain big, socially important events with smaller, intuitively unappealing official explanations. By contrast, events not producing conspiracy theories often have bigger explanations.

General Discussion

Three studies supported the heuristic of sufficient explanation (HOSE) account of conspiracy theory belief. Nearly all popular conspiracy theories sampled concerned major events whose official causes were deemed too small to sufficiently explain them. Two experiments involving invented conspiracy theories supported the proposed causal mechanism: people were less likely to believe the official explanation was true when it was relatively small and the event was relatively big, and their belief in the conspiracy theory was mediated by their disbelief in the official explanation. Thus, one reason people believe conspiracy theories is that they offer a bigger explanation for a seemingly implausibly large effect of a small cause.
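The mediation claim can be pictured with a standard simple-mediation (indirect effect) calculation. The sketch below runs on simulated data with assumed variable names (X = event/cause size mismatch, M = belief in the official explanation, Y = conspiracy belief); it is a generic illustration of the method, not the authors' analysis code:

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with the sign pattern described above (assumed, not the authors' data):
# a bigger event with a smaller official cause lowers belief in the official explanation,
# and lower belief in the official explanation raises conspiracy belief.
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=n)                       # size mismatch (standardized)
M = -0.5 * X + rng.normal(size=n)            # belief in the official explanation
Y = -0.6 * M + 0.1 * X + rng.normal(size=n)  # belief in the conspiracy theory

a = sm.OLS(M, sm.add_constant(X)).fit().params[1]                         # path a: X -> M
b = sm.OLS(Y, sm.add_constant(np.column_stack([M, X]))).fit().params[1]   # path b: M -> Y, given X
print(f"indirect (mediated) effect a*b: {a * b:.2f}")
```

Here a negative path a and a negative path b yield a positive indirect effect, matching the claim that a too-small official cause increases conspiracy belief through disbelief in the official explanation.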

HOSE helps explain why certain conspiracy theories become popular while others do not. Just as evolutionarily fit genes are especially likely to spread to subsequent generations, ideas (memes) with certain qualities are the most likely to spread and thus become popular (Dawkins, 1976). On the HOSE account, conspiracy theories spread widely because people are strongly motivated to find explanations for important events (Douglas et al., 2017, 2019) and are usually unsatisfied with counterintuitively small explanations that seem insufficient to account for them. Conspiracy theories are typically inspired by events that people perceive to be larger than their stated causes could plausibly produce. Some conspiracy theories may be inevitable, because small causes do sometimes counterintuitively produce big events: via the exponential spread of a microscopic virus, or the interconnected, chaotic nature of systems in which the flap of a butterfly's wings changes weather across the world (Gleick, 2008). Therefore, it may be impossible to prevent all conspiracy theories from developing.