Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care
Showing posts with label science communication. Show all posts

Sunday, October 16, 2022

A framework for understanding reasoning errors: From fake news to climate change and beyond

Pennycook, G. (2022, August 31).
https://doi.org/10.31234/osf.io/j3w7d

Abstract

Humans have the capacity, but perhaps not always the willingness, for great intelligence. From global warming to the spread of misinformation and beyond, our species is facing several major challenges that are the result of the limits of our own reasoning and decision-making. So, why are we so prone to errors during reasoning? In this chapter, I will outline a framework for understanding reasoning errors that is based on a three-stage dual-process model of analytic engagement (intuition, metacognition, and reason). The model has two key implications: 1) that a mere lack of deliberation and analytic thinking is a primary source of errors, and 2) that when deliberation is activated, it generally reduces errors (via questioning intuitions and integrating new information) rather than increasing errors (via rationalization and motivated reasoning). In support of these claims, I review research showing the extensive predictive validity of measures that index individual differences in analytic cognitive style – even beyond explicit errors per se. In particular, analytic thinking is not only predictive of skepticism about a wide range of epistemically suspect beliefs (paranormal, conspiratorial, COVID-19 misperceptions, pseudoscience, and alternative medicine) and decreased susceptibility to bullshit, fake news, and misinformation, but also of important differences in people’s moral judgments and values as well as their religious beliefs (and disbeliefs). Furthermore, in some (but not all) cases, there is evidence from experimental paradigms that supports a causal role of analytic thinking in determining judgments, beliefs, and behaviors. The findings reviewed here provide some reason for optimism for the future: It may be possible to foster analytic thinking and therefore improve the quality of our decisions.

Evaluating the evidence: Does reason matter?

Thus far, I have prioritized explaining the various alternative frameworks. I will now turn to an in-depth review of some of the key relevant evidence that helps adjudicate between these accounts. I will organize this review around two key implications that emerge from the framework that I have proposed.

First, the primary difference between the three-stage model (and related dual-process models) and the social-intuitionist model (and related intuitionist models) is that the former argues that people should be able to overcome intuitive errors using deliberation, whereas the latter argues that reason is generally weak and therefore that intuitive errors will simply dominate. Thus, the reviewed research will investigate the apparent role of deliberation in driving people’s choices, beliefs, and behaviors.

Second, the primary difference between the three-stage model (and related dual-process models) and the identity-protective cognition model is that the latter argues that deliberation facilitates biased information processing whereas the former argues that deliberation generally facilitates accuracy. Thus, the reviewed research will also focus on whether deliberation is linked with inaccuracy in politically-charged or identity-relevant contexts.

Monday, August 8, 2022

Why are people antiscience, and what can we do about it?

Philipp-Muller, A., Lee, S. W. S., & Petty, R. E.
PNAS (2022).
DOI: 10.1073/pnas.2120755119

Abstract

From vaccination refusal to climate change denial, antiscience views are threatening humanity. When different individuals are provided with the same piece of scientific evidence, why do some accept it whereas others dismiss it? Building on various emerging data and models that have explored the psychology of being antiscience, we specify four core bases of key principles that drive antiscience attitudes. These principles are grounded in decades of research on attitudes, persuasion, social influence, social identity, and information processing. They apply across diverse domains of antiscience phenomena. Specifically, antiscience attitudes are more likely to emerge when a scientific message comes from sources perceived as lacking credibility; when the recipients embrace the social membership or identity of groups with antiscience attitudes; when the scientific message itself contradicts what recipients consider true, favorable, valuable, or moral; or when there is a mismatch between the delivery of the scientific message and the epistemic style of the recipient. Politics triggers or amplifies many principles across all four bases, making it a particularly potent force in antiscience attitudes. Guided by the key principles, we describe evidence-based counteractive strategies for increasing public acceptance of science.

Concluding Remarks

By offering an inclusive framework of key principles underlying antiscience attitudes, we aim to advance theory and research on several fronts: Our framework highlights basic principles applicable to antiscience phenomena across multiple domains of science. It predicts situational and personal variables (e.g., moralization, attitude strength, and need for closure) that amplify people’s likelihood and intensity of being antiscience. It unpacks why politics is such a potent force with multiple aspects of influence on antiscience attitudes. And it suggests a range of counteractive strategies that target each of the four bases. Beyond explaining, predicting, and addressing antiscience views, our framework raises unresolved questions for future research.

Given the prevalence of antiscience attitudes, scientists and science communicators face strong headwinds in gaining and sustaining public trust and in conveying scientific information in ways that will be accepted and integrated into public understanding. It is a multifaceted problem, ranging from erosion of the credibility of scientists to conflicts with the identities, beliefs, attitudes, values, morals, and epistemic styles of different portions of the population, exacerbated by the toxic political ecosystem of our time. Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong. This inclination is wholly understandable, and scientists should be poised to empathize. After all, we are in the business of being proven wrong, but that must not stop us from helping people get things right.