Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Sunday, September 14, 2025

Cyber anti-intellectualism and science communication during the COVID-19 pandemic: a cross-sectional study

Kuang, Y. (2025).
Frontiers in Public Health, 12, 1491096.

Abstract

Background
During the COVID-19 pandemic, science communication played a crucial role in disseminating accurate information and promoting scientific literacy among the public. However, the rise of anti-intellectualism on social media platforms has posed significant challenges to science, scientists, and science communication, hindering effective public engagement with scientific affairs. This study aims to explore the mechanisms through which anti-intellectualism impacts science communication on social media platforms from the perspective of communication effect theory.

Method
This study employed a cross-sectional design: an online questionnaire survey of Chinese social media users conducted from August to September 2021. Responses were analyzed in SPSS 26.0 using descriptive statistics, t-tests, one-way ANOVA, and a chain mediation model.
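The "chain mediation model" is what the mediation literature usually calls serial mediation (e.g., Hayes's PROCESS Model 6). For readers without SPSS, here is a minimal Python sketch of the same analysis, using statsmodels OLS for the path regressions and a percentile bootstrap for the serial indirect effect. The column names (anti_intellectualism, misconception, attitude, sharing) are hypothetical stand-ins for the study's scales, and the simulated data merely mimics the reported direction of effects:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def serial_indirect(df, n_boot=1000, seed=0):
    """Serial (chain) mediation X -> M1 -> M2 -> Y: point estimate and
    percentile-bootstrap 95% CI for the a1*d21*b2 indirect effect."""
    def product_of_paths(d):
        a1 = smf.ols("misconception ~ anti_intellectualism", d).fit().params["anti_intellectualism"]
        d21 = smf.ols("attitude ~ misconception + anti_intellectualism", d).fit().params["misconception"]
        b2 = smf.ols("sharing ~ attitude + misconception + anti_intellectualism", d).fit().params["attitude"]
        return a1 * d21 * b2

    rng = np.random.default_rng(seed)
    point = product_of_paths(df)
    boots = [product_of_paths(df.iloc[rng.integers(0, len(df), len(df))])
             for _ in range(n_boot)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, (lo, hi)

# Hypothetical usage with simulated data shaped like the reported chain:
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                      # anti-intellectualism
m1 = 0.5 * x + rng.normal(size=n)           # scientific misconception
m2 = -0.4 * m1 + rng.normal(size=n)         # attitude toward communication
y = 0.5 * m2 + rng.normal(size=n)           # dissemination behavior
df = pd.DataFrame({"anti_intellectualism": x, "misconception": m1,
                   "attitude": m2, "sharing": y})
print(serial_indirect(df))  # negative serial indirect effect, CI excludes 0
```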

Results
There were significant differences in anti-intellectualism tendencies across demographic groups. Most respondents placed greater emphasis on knowledge with practical benefits in daily life. Respondents’ trust in different groups of intellectuals was markedly inconsistent, with economists and experts receiving the lowest levels of trust. Anti-intellectualism significantly and positively predicted misconceptions of scientific and technological information, and significantly and negatively predicted attitudes toward science communication. It further influenced respondents’ dissemination of scientific and technological information through the chain mediation of scientific misconceptions and attitudes toward science communication.

Conclusion
This research enriches the conceptual framework of anti-intellectualism across various cultural contexts, as well as the theoretical framework concerning the interaction between anti-intellectualism and science communication. The findings provide suggestions for developing strategies to enhance the effectiveness of science communication and risk communication during public emergencies.

Here are some thoughts:

When people distrust science and intellectuals, especially on social media, the result is misunderstanding of scientific facts, negative attitudes toward science communication, and reduced sharing of accurate information. This harms public health efforts, particularly during emergencies like the COVID-19 pandemic. To counter it, science communication must become more inclusive, transparent, and focused on real-world benefits, and experts must engage the public as equals, not just as authority figures.

Editorial finale: Social media "wellness influencers" typically have a financial incentive to sell unproven or even harmful interventions because our current healthcare system is so expensive and so broken. Wellness influencers’ power lies in the promise, the hope, and the price, not in the outcome of the intervention.

Sunday, June 2, 2024

The Honest Broker versus the Epistocrat: Attenuating Distrust in Science by Disentangling Science from Politics

Senja Post & Nils Bienzeisler (2024)
Political Communication
DOI: 10.1080/10584609.2024.2317274

Abstract

People’s trust in science is generally high. Yet in public policy disputes invoking scientific issues, people’s trust in science is typically polarized, aligned with their political preferences. Theorists of science and democracy have reasoned that a polarization of trust in scientific information could be mitigated by clearly disentangling scientific claims from political ones. We tested this proposition experimentally in three German public policy disputes: a) school closures versus openings during the COVID-19 pandemic, b) a ban on versus a continuation of domestic air traffic in view of climate change, and c) the shooting of wolves in residential areas or their protection. In each case study, we exposed participants to one of four versions of a news item citing a scientist reporting their research and giving policy advice. The scientist’s quotes differed with regard to the direction and style of their policy advice. As an epistocrat, the scientist blurs the distinction between scientific and political claims, purporting to “prove” a policy and thereby precluding a societal debate over values and policy priorities. As an honest broker, the scientist distinguishes between scientific and political claims, presenting a policy option while acknowledging the limitations of their disciplinary scientific perspective on a broader societal problem. We find that public policy advice in the style of an honest broker versus that of an epistocrat can attenuate political polarization of trust in scientists and scientific findings by enhancing trust primarily among the most politically challenged.
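Statistically, the attenuation claim boils down to an interaction: regress trust on advice style (epistocrat vs. honest broker), on whether the advice challenges the respondent's own policy preference, and on their product. Below is a minimal, hypothetical sketch with simulated data; the variable names (trust, honest_broker, challenged) are placeholders, not the authors' operationalizations:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; all variable names are hypothetical.
rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "honest_broker": rng.integers(0, 2, n),  # 0 = epistocrat framing
    "challenged": rng.integers(0, 2, n),     # 1 = advice opposes own view
})
# Encode the reported pattern: challenged respondents trust less overall,
# and the honest-broker framing recovers trust mainly for them.
df["trust"] = (4.0 - 1.2 * df["challenged"]
               + 0.8 * df["honest_broker"] * df["challenged"]
               + rng.normal(0, 1, n))

model = smf.ols("trust ~ honest_broker * challenged", data=df).fit()
print(model.params)
# A positive honest_broker:challenged coefficient is the attenuation
# pattern: the trust gap (polarization) shrinks under honest brokering.
```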


Here is a summary:

This article dives into the issue of distrust in science and proposes a solution: scientists acting as "honest brokers".

The article contrasts two approaches scientists can take when communicating scientific findings for policy purposes. An "epistocrat" scientist blurs the line between science and politics: they present a specific policy recommendation based on their research, implying that it is the only logical course of action. This approach ignores the role of values and priorities in policy decisions and can shut down public debate.

On the other hand, an "honest broker" scientist makes a clear distinction between science and politics. They present their research findings and the policy options that stem from them, but acknowledge the limitations of science in addressing broader societal issues. This approach allows for a public discussion about values and priorities, which can help build trust in science especially among those who might not agree with the scientist's political views.

The article suggests that by following the "honest broker" approach, scientists can help reduce the political polarization of trust in science. This means presenting the science clearly and openly, and allowing for a public conversation about how those findings should be applied.

Sunday, October 16, 2022

A framework for understanding reasoning errors: From fake news to climate change and beyond

Pennycook, G. (2022, August 31).
https://doi.org/10.31234/osf.io/j3w7d

Abstract

Humans have the capacity, but perhaps not always the willingness, for great intelligence. From global warming to the spread of misinformation and beyond, our species is facing several major challenges that are the result of the limits of our own reasoning and decision-making. So, why are we so prone to errors during reasoning? In this chapter, I will outline a framework for understanding reasoning errors that is based on a three-stage dual-process model of analytic engagement (intuition, metacognition, and reason). The model has two key implications: 1) That a mere lack of deliberation and analytic thinking is a primary source of errors and 2) That when deliberation is activated, it generally reduces errors (via questioning intuitions and integrating new information) rather than increasing errors (via rationalization and motivated reasoning). In support of these claims, I review research showing the extensive predictive validity of measures that index individual differences in analytic cognitive style – even beyond explicit errors per se. In particular, analytic thinking is not only predictive of skepticism about a wide range of epistemically suspect beliefs (paranormal, conspiratorial, COVID-19 misperceptions, pseudoscience and alternative medicines) as well as decreased susceptibility to bullshit, fake news, and misinformation, but also important differences in people’s moral judgments and values as well as their religious beliefs (and disbeliefs). Furthermore, in some (but not all) cases, there is evidence from experimental paradigms that support a causal role of analytic thinking in determining judgments, beliefs, and behaviors. The findings reviewed here provide some reason for optimism for the future: It may be possible to foster analytic thinking and therefore improve the quality of our decisions.
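The individual-difference measures alluded to here are typically Cognitive Reflection Test (CRT) style items, scored as the count of correct, non-intuitive answers. A minimal scoring sketch follows, with hypothetical column names and a placeholder composite belief scale:

```python
import pandas as pd
from scipy.stats import pearsonr

# Answer key for the three classic CRT items (correct, non-intuitive answers):
# the ball costs 5 cents; 100 machines take 5 minutes; the lake is half
# covered on day 47. Column names are hypothetical.
CRT_KEY = {"crt_bat_ball": 5, "crt_widgets": 5, "crt_lilypads": 47}

def crt_score(responses: pd.DataFrame) -> pd.Series:
    """Analytic-style score = number of CRT items answered correctly (0-3)."""
    return sum((responses[item] == key).astype(int)
               for item, key in CRT_KEY.items())

# Hypothetical usage, assuming a survey file with the CRT columns and a
# composite 'suspect_beliefs' scale; a negative r would mirror the chapter's
# claim that analytic thinking predicts skepticism of such beliefs.
# df = pd.read_csv("survey.csv")
# r, p = pearsonr(crt_score(df), df["suspect_beliefs"])
```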

Evaluating the evidence: Does reason matter?

Thus far, I have prioritized explaining the various alternative frameworks. I will now turn to an in-depth review of some of the key relevant evidence that helps adjudicate between these accounts. I will organize this review around two key implications that emerge from the framework that I have proposed.

First, the primary difference between the three-stage model (and related dual-process models) and the social-intuitionist models (and related intuitionist models) is that the former argues that people should be able to overcome intuitive errors using deliberation whereas the latter argues that reason is generally infirm and therefore that intuitive errors will simply dominate. Thus, the reviewed research will investigate the apparent role of deliberation in driving people’s choices, beliefs, and behaviors.

Second, the primary difference between the three-stage model (and related dual-process models) and the identity-protective cognition model is that the latter argues that deliberation facilitates biased information processing whereas the former argues that deliberation generally facilitates accuracy. Thus, the reviewed research will also focus on whether deliberation is linked with inaccuracy in politically-charged or identity-relevant contexts.
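Both contrasts reduce to the same testable question: does deliberation interact with the identity-relevance of the material? Under identity-protective cognition, analytic thinking should raise accuracy only for identity-congruent items (or even lower it for incongruent ones); under the three-stage model, it should raise accuracy across the board. A hypothetical sketch of that test on simulated long-format data (one row per participant-item pair; all names are placeholders):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: accuracy (0/1 truth-discernment outcome),
# crt (analytic-style score, 0-3), congruent (1 = item flatters the
# participant's political identity). All hypothetical.
rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "crt": rng.integers(0, 4, n),
    "congruent": rng.integers(0, 2, n),
})
# Build in the dual-process pattern: deliberation helps on both item types.
logit_p = -0.5 + 0.4 * df["crt"] + 0.3 * df["congruent"]
df["accuracy"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("accuracy ~ crt * congruent", data=df).fit(disp=False)
print(fit.params)
# Reading the fit: a positive 'crt' main effect with a near-zero
# 'crt:congruent' interaction favors the three-stage model; a crt effect
# confined to congruent items would favor identity-protective cognition.
```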

Monday, August 8, 2022

Why are people antiscience, and what can we do about it?

Philipp-Muller, A., Lee, S. W. S., & Petty, R. E. (2022).
PNAS, 119(30), e2120755119.
DOI: 10.1073/pnas.2120755119

Abstract

From vaccination refusal to climate change denial, antiscience views are threatening humanity. When different individuals are provided with the same piece of scientific evidence, why do some accept whereas others dismiss it? Building on various emerging data and models that have explored the psychology of being antiscience, we specify four core bases of key principles driving antiscience attitudes. These principles are grounded in decades of research on attitudes, persuasion, social influence, social identity, and information processing. They apply across diverse domains of antiscience phenomena. Specifically, antiscience attitudes are more likely to emerge when a scientific message comes from sources perceived as lacking credibility; when the recipients embrace the social membership or identity of groups with antiscience attitudes; when the scientific message itself contradicts what recipients consider true, favorable, valuable, or moral; or when there is a mismatch between the delivery of the scientific message and the epistemic style of the recipient. Politics triggers or amplifies many principles across all four bases, making it a particularly potent force in antiscience attitudes. Guided by the key principles, we describe evidence-based counteractive strategies for increasing public acceptance of science.

Concluding Remarks

By offering an inclusive framework of key principles underlying antiscience attitudes, we aim to advance theory and research on several fronts: Our framework highlights basic principles applicable to antiscience phenomena across multiple domains of science. It predicts situational and personal variables (e.g., moralization, attitude strength, and need for closure) that amplify people’s likelihood and intensity of being antiscience. It unpacks why politics is such a potent force with multiple aspects of influence on antiscience attitudes. And it suggests a range of counteractive strategies that target each of the four bases. Beyond explaining, predicting, and addressing antiscience views, our framework raises unresolved questions for future research.

With the prevalence of antiscience attitudes, scientists and science communicators face strong headwinds in gaining and sustaining public trust and in conveying scientific information in ways that will be accepted and integrated into public understanding. It is a multifaceted problem that ranges from erosions in the credibility of scientists to conflicts with the identities, beliefs, attitudes, values, morals, and epistemic styles of different portions of the population, exacerbated by the toxic ecosystem of the politics of our time. Scientific information can be difficult to swallow, and many individuals would sooner reject the evidence than accept information that suggests they might have been wrong. This inclination is wholly understandable, and scientists should be poised to empathize. After all, we are in the business of being proven wrong, but that must not stop us from helping people get things right.