Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Mistrust.

Tuesday, May 2, 2023

Lies and bullshit: The negative effects of misinformation grow stronger over time

Petrocelli, J. V., Seta, C. E., & Seta, J. J. (2023). 
Applied Cognitive Psychology, 37(2), 409–418. 
https://doi.org/10.1002/acp.4043

Abstract

In a world where exposure to untrustworthy communicators is common, trust has become more important than ever for effective marketing. Nevertheless, we know very little about the long-term consequences of exposure to untrustworthy sources, such as bullshitters. This research examines how untrustworthy sources—liars and bullshitters—influence consumer attitudes toward a product. Frankfurt's (1986) insidious bullshit hypothesis (i.e., bullshitting is evaluated less negatively than lying, but bullshit can be more harmful than lies) is examined within a traditional sleeper effect—a persuasive influence that increases, rather than decays, over time. We obtained a sleeper effect after participants learned that the source of the message was either a liar or a bullshitter. However, compared to the liar source condition, the same message from a bullshitter resulted in more extreme immediate and delayed attitudes that were in line with an otherwise discounted persuasive message (i.e., an advertisement). Interestingly, attitudes returned to control-condition levels when a bullshitter was the source of the message, suggesting that knowing that an initially discounted message may be accurate or inaccurate (as is true of bullshit, but not of lies) does not result in the long-term discounting of that message. We discuss implications for marketing and other contexts of persuasion.

General Discussion

There is a considerable body of knowledge about the antecedents and consequences of lying in marketing and other contexts (e.g., Ekman, 1985), but much less is known about the other untrustworthy source: the bullshitter. The current investigation suggests that the distinction between bullshitting and lying is important to marketing and to persuasion more generally. People are exposed to scores of lies and bullshit every day, and this exposure has increased dramatically as the use of the internet has shifted from a platform for socializing to a source of information (e.g., Di Domenico et al., 2021). Because things such as truth status and source status fade faster than familiarity does, illusory truth effects for consumer products can emerge as soon as 3 days after initial exposure (Skurnik et al., 2005), and within the hour for basic knowledge questions (Fazio et al., 2015). As mirrored in our conditions that received discounting cues after the initial attitude information, people are at times lied to, or bullshitted, and only learn afterward that they were deceived. It is then that these untrustworthy sources appear to have a sleeper effect, creating unwarranted and undiscounted attitudes.

It should be noted that our data do not suggest that the impact of lie and bullshit discounting cues fades differentially. However, the discounting cue in the bullshit condition had less of an immediate and long-term suppression effect than in the lie condition. In fact, after 14 days, the bullshit communication not only had more of an influence on attitudes, but its influence was not significantly different from that of the control communication. This finding suggests that bullshit can be more insidious than lies. As it relates to marketing, the insidious nature of exposure to bullshit can create false beliefs that subsequently affect behavior, even when people have been told that the information came from a person known to spread bullshit. The insidious nature of bullshit is magnified by the fact that even when it is clear that one is expressing his/her opinion via bullshit, people do not appear to hold the bullshitter to the same standard as the liar (Frankfurt, 1986). People may think that at least the bullshitter often believes his/her own bullshit, whereas the liar knows his/her statement is not true (Bernal, 2006; Preti, 2006; Reisch, 2006). Because of this difference, what may appear to be harmless communications from a bullshitter may have serious repercussions for consumers and organizations. Additionally, consistent with the research of Foos et al. (2016), the present research suggests that the harmful influence of untrustworthy sources may not be recognized initially but appears over time. It also suggests that efforts to fight the consequences of fake news (see Atkinson, 2019) are made more difficult by the sleeper effect. The negative effects of unsubstantiated or false information may not only persist but may grow stronger over time.

Monday, March 22, 2021

The Mistrust of Science

Atul Gawande
The New Yorker
Originally posted 01 June 2016

Here is an excerpt:

The scientific orientation has proved immensely powerful. It has allowed us to nearly double our lifespan during the past century, to increase our global abundance, and to deepen our understanding of the nature of the universe. Yet scientific knowledge is not necessarily trusted. Partly, that’s because it is incomplete. But even where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is).

Vaccine fears, for example, have persisted despite decades of research showing them to be unfounded. Some twenty-five years ago, a statistical analysis suggested a possible association between autism and thimerosal, a preservative used in vaccines to prevent bacterial contamination. The analysis turned out to be flawed, but fears took hold. Scientists then carried out hundreds of studies, and found no link. Still, fears persisted. Countries removed the preservative but experienced no reduction in autism—yet fears grew. A British study claimed a connection between the onset of autism in eight children and the timing of their vaccinations for measles, mumps, and rubella. That paper was retracted due to findings of fraud: the lead author had falsified and misrepresented the data on the children. Repeated efforts to confirm the findings were unsuccessful. Nonetheless, vaccine rates plunged, leading to outbreaks of measles and mumps that, last year, sickened tens of thousands of children across the U.S., Canada, and Europe, and resulted in deaths.

People are prone to resist scientific claims when they clash with intuitive beliefs. They don’t see measles or mumps around anymore. They do see children with autism. And they see a mom who says, “My child was perfectly fine until he got a vaccine and became autistic.”

Now, you can tell them that correlation is not causation. You can say that children get a vaccine every two to three months for the first couple years of their life, so the onset of any illness is bound to follow vaccination for many kids. You can say that the science shows no connection. But once an idea has got embedded and become widespread, it becomes very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities.


5 years old, and still relevant.

Tuesday, November 28, 2017

Trusting big health data

Angela Villanueva
Baylor College of Medicine Blogs
Originally posted November 10, 2017

Here is an excerpt:

Potentially exacerbating this mistrust is a sense of loss of privacy and absence of control over information describing us and our habits. Given the extent of current “everyday” data collection and sharing for marketing and other purposes, this lack of trust is not unreasonable.

Health information sharing makes many people uneasy, particularly because of the potential harms such as insurance discrimination or stigmatization. Data breaches like the recent Equifax hack may add to these concerns and affect people’s willingness to share their health data.

But it is critical to encourage members of all groups to participate in big data initiatives focused on health in order for all to benefit from the resulting discoveries. My colleagues and I recently published an article detailing eight guiding principles for successful data sharing; building trust is one of them.

Here is the article.

Friday, June 26, 2015

Have You Ever Been Wrong?

By Peter Wehner
Commentary
Originally posted June 6, 2015

Here is an excerpt:

“Thus,” Mr. Mehlman writes, “policy positions were not driving partisanship, but rather partisanship was driving policy positions. Voters took whichever position was ascribed to their party, irrespective of the specific policies that position entailed.”

So what explains this? Some of it probably has to do with deference. Many people don’t follow public policy issues very closely — but they do know whose team they’re on. And so if their team endorses a particular policy, they’re strongly inclined to as well. They assume the position merits support based on who (and who does not) supports it.

The flip side of this is mistrust. If you’re a Democrat and you are told about the details of a Republican plan, you might automatically assume it’s a bad one (the same goes for how a Republican would receive a Democratic plan). If a party you despise holds a view on a certain issue, your reflex will be to hold that opposite view.

The entire article is here.