Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Bias Blind Spot.

Sunday, October 15, 2023

Bullshit blind spots: the roles of miscalibration and information processing in bullshit detection

Shane Littrell & Jonathan A. Fugelsang
(2023) Thinking & Reasoning
DOI: 10.1080/13546783.2023.2189163

Abstract

The growing prevalence of misleading information (i.e., bullshit) in society carries with it an increased need to understand the processes underlying many people’s susceptibility to falling for it. Here we report two studies (N = 412) examining the associations between one’s ability to detect pseudo-profound bullshit, confidence in one’s bullshit detection abilities, and the metacognitive experience of evaluating potentially misleading information. We find that people with the lowest (highest) bullshit detection performance overestimate (underestimate) their detection abilities and overplace (underplace) those abilities when compared to others. Additionally, people reported using both intuitive and reflective thinking processes when evaluating misleading information. Taken together, these results show that both highly bullshit-receptive and highly bullshit-resistant people are largely unaware of the extent to which they can detect bullshit and that traditional miserly processing explanations of receptivity to misleading information may be insufficient to fully account for these effects.


Here's my summary:

The authors examine two factors underlying our blind spots in detecting bullshit: miscalibration and information processing. Miscalibration is a mismatch between how well we think we can detect bullshit and how well we actually can. The worst detectors overestimate their ability and rank themselves too high relative to others, while the best detectors underestimate theirs, so people at both ends are largely unaware of how good (or bad) their detection skills really are.
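To make those two calibration measures concrete, here is a minimal sketch of how overestimation (your self-estimate versus your own score) and overplacement (where you rank yourself versus where you actually rank) are commonly computed in this literature. The data, scale ranges, and variable names below are invented for illustration and are not taken from the paper.

# Minimal sketch (not from the paper) of how overestimation and
# overplacement are typically computed in calibration research.
# All data below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
actual = rng.integers(0, 21, n)            # detection score: 0-20 items correct
# Hypothetical self-estimates that regress toward the middle of the scale,
# producing the classic pattern: low scorers guess too high, high scorers too low.
estimated = np.clip(0.4 * actual + 8 + rng.normal(0, 2, n), 0, 20)

overestimation = estimated - actual        # self-estimate vs. own performance

# Overplacement: estimated percentile vs. actual percentile among peers.
actual_pct = actual.argsort().argsort() / (n - 1) * 100
estimated_pct = np.clip(50 + 2.5 * (estimated - estimated.mean()), 0, 100)
overplacement = estimated_pct - actual_pct

# Split by performance quartile: the lowest quartile should show positive
# miscalibration (overconfidence), the highest quartile negative.
quartile = np.digitize(actual, np.percentile(actual, [25, 50, 75]))
for q in range(4):
    mask = quartile == q
    print(f"quartile {q + 1}: overestimation {overestimation[mask].mean():+.1f}, "
          f"overplacement {overplacement[mask].mean():+.1f}")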

Information processing refers to how we evaluate claims when forming judgments. On the classic "miserly processing" account, we fall for bullshit when we think quickly and intuitively instead of reflecting carefully. The authors find, however, that people report using both intuitive and reflective thinking when evaluating misleading claims, so lazy thinking alone may not fully explain why some people are so receptive to bullshit.

The authors also discuss strategies for overcoming these blind spots: becoming aware of our own biases and limitations, approaching the information we consume critically, and taking the time to evaluate evidence carefully.

Overall, the article provides a helpful framework for understanding the challenges of bullshit detection. It also offers some practical advice for overcoming these challenges.

Here are some additional tips for detecting bullshit:
  • Be skeptical of claims that seem too good to be true.
  • Look for evidence to support the claims that are being made.
  • Be aware of the speaker or writer's motives.
  • Ask yourself whether the claims make sense and are consistent with what you already know.
  • If you're not sure whether something is bullshit, it's better to err on the side of caution and be skeptical.

Thursday, April 6, 2023

People recognize and condone their own morally motivated reasoning

Cusimano, C., & Lombrozo, T. (2023).
Cognition, 234, 105379.

Abstract

People often engage in biased reasoning, favoring some beliefs over others even when the result is a departure from impartial or evidence-based reasoning. Psychologists have long assumed that people are unaware of these biases and operate under an “illusion of objectivity.” We identify an important domain of life in which people harbor little illusion about their biases – when they are biased for moral reasons. For instance, people endorse and feel justified believing morally desirable propositions even when they think they lack evidence for them (Study 1a/1b). Moreover, when people engage in morally desirable motivated reasoning, they recognize the influence of moral biases on their judgment, but nevertheless evaluate their reasoning as ideal (Studies 2–4). These findings overturn longstanding assumptions about motivated reasoning and identify a boundary condition on Naïve Realism and the Bias Blind Spot. People's tendency to be aware and proud of their biases provides both new opportunities, and new challenges, for resolving ideological conflict and improving reasoning.

Highlights

• Dominant theories assume people form beliefs only under an illusion of objectivity.

• We document a boundary condition on this illusion: morally desirable biases.

• People endorse beliefs they regard as evidentially weak but morally desirable.

• People realize when they have just engaged in morally motivated reasoning.

• Accurate self-attributions of moral bias fully attenuate the ‘bias blind spot’.

From the General discussion

Our beliefs about our beliefs – including whether they are biased or justified – play a crucial role in guiding inquiry, shaping belief revision, and navigating disagreement. One line of research suggests that these judgments are almost universally characterized by an illusion of objectivity such that people consciously reason with the goal of being objective and basing their beliefs on evidence, and because of this, people nearly always assume that their current beliefs meet those standards. Another line of work suggests that people sometimes think that values legitimately bear on whether someone is justified to hold a belief (Cusimano & Lombrozo, 2021b). These findings raise the possibility, consistent with some prior theoretical proposals (Cusimano & Lombrozo, 2021a; Tetlock, 2002), that people will knowingly violate norms of impartiality, or knowingly maintain beliefs that lack evidential support, when doing so advances what they consider to be morally laudable goals. Two predictions follow. First, people should evaluate their beliefs in part based on their perceived moral value. And second, in situations in which people engage in morally motivated reasoning, they should recognize that they have done so and should evaluate their morally motivated reasoning as appropriate. We document support for these predictions across four studies (Table 1).

Conclusion

A great deal of work has assumed that people treat objectivity and evidence-based reasoning as cardinal norms governing their belief formation. This assumption has grown increasingly tenuous in light of recent work highlighting the importance of moral concerns in almost all facets of life. Consistent with this recent work, we find evidence that people’s evaluations of the moral quality of a proposition predict their subjective confidence that it is true, their likelihood of claiming that they believe it and know it, and the extent to which they take their belief to be justified. Moreover, people exhibit metacognitive awareness of this fact and approve of morality’s influence on their reasoning. People often want to be right, but they also want to be good – and they know it.

Wednesday, July 22, 2015

Bias Blind Spot: Structure, Measurement, and Consequences

Irene Scopelliti, Carey K. Morewedge, Erin McCormick, H. Lauren Min, Sophie Lebrecht, & Karim S. Kassam (2015)
Management Science, published online in Articles in Advance, 24 Apr 2015
http://dx.doi.org/10.1287/mnsc.2014.2096

Abstract

People exhibit a bias blind spot: they are less likely to detect bias in themselves than in others. We report the development and validation of an instrument to measure individual differences in the propensity to exhibit the bias blind spot that is unidimensional, internally consistent, has high test-retest reliability, and is discriminated from measures of intelligence, decision-making ability, and personality traits related to self-esteem, self-enhancement, and self-presentation. The scale is predictive of the extent to which people judge their abilities to be better than average for easy tasks and worse than average for difficult tasks, ignore the advice of others, and are responsive to an intervention designed to mitigate a different judgmental bias. These results suggest that the bias blind spot is a distinct metabias resulting from naïve realism rather than other forms of egocentric cognition, and has unique effects on judgment and behavior.
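For readers less familiar with the psychometric vocabulary, the sketch below shows, with invented data rather than the authors' actual scale, how two of the reported properties are conventionally computed: internal consistency via Cronbach's alpha and test-retest reliability via a correlation between two testing sessions.

# Minimal sketch (not from the paper) of two psychometric checks the
# abstract mentions: internal consistency (Cronbach's alpha) and
# test-retest reliability (a Pearson correlation across two sessions).
# All data below are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scale: 14 items rated by 300 respondents at time 1.
trait = rng.normal(0, 1, (300, 1))                 # latent bias blind spot
items_t1 = trait + rng.normal(0, 0.8, (300, 14))   # items share the trait

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Same respondents retested later; scores shift a little but correlate.
items_t2 = trait + rng.normal(0, 0.8, (300, 14))
score_t1 = items_t1.mean(axis=1)
score_t2 = items_t2.mean(axis=1)
test_retest = np.corrcoef(score_t1, score_t2)[0, 1]

print(f"Cronbach's alpha: {cronbach_alpha(items_t1):.2f}")
print(f"test-retest r:    {test_retest:.2f}")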

The entire article is here.