Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Misinformation.

Wednesday, April 12, 2017

Why People Continue to Believe Objectively False Things

Amanda Taub and Brendan Nyhan
New York Times - The Upshot
Originally posted March 22, 2017

Here is an excerpt:

Even when myths are dispelled, their effects linger. The Boston College political scientist Emily Thorson conducted a series of studies showing that exposure to a news article containing a damaging allegation about a fictional political candidate caused people to rate the candidate more negatively even when the allegation was corrected and people believed it to be false.

There are ways to correct information more effectively. Adam Berinsky of M.I.T., for instance, found that a surprising co-partisan source (a Republican member of Congress) was the most effective in reducing belief in the “death panel” myth about the Affordable Care Act.

But in the wiretapping case, Republican lawmakers have neither supported Mr. Trump’s wiretap claims (which could risk their credibility) nor strenuously opposed them (which could prompt a partisan backlash). Instead, they have tried to shift attention to a different political narrative — one that suits the partisan divide by making Mr. Obama the villain of the piece. Rather than focusing on the wiretap allegation, they have sought to portray the House Intelligence Committee hearings on Russian interference in the election as an investigation into leaks of classified information.

The article is here.

Monday, April 3, 2017

Conviction, persuasion and manipulation: the ethical dimension of epistemic vigilance

Johannes Mahr
Cognition and Culture Institute Blog
Originally posted 10 March 2017

In today’s political climate, moral outrage about (alleged) propaganda and manipulation of public opinion dominates our discourse. Charges of manipulative information provision have arguably become the most widely used tool to discredit one’s political opponent. Of course, one reason why such charges have become so prominent is that the way we consume information through online media has made us more vulnerable than ever to such manipulation. Take a recent story published by The Guardian, which describes the strategy of information dissemination allegedly used by the British ‘Leave Campaign’:
“The strategy involved harvesting data from people’s Facebook and other social media profiles and then using machine learning to ‘spread’ through their networks. Wigmore admitted the technology and the level of information it gathered from people was ‘creepy’. He said the campaign used this information, combined with artificial intelligence, to decide who to target with highly individualised advertisements and had built a database of more than a million people.”
This might not just strike you as “creepy” but as simply unethical, just as it did one commentator cited in the article, who called these tactics “extremely disturbing and quite sinister”. Here, I want to investigate where this intuition comes from.

The blog post is here.

Wednesday, March 22, 2017

The Case of Dr. Oz: Ethics, Evidence, and Does Professional Self-Regulation Work?

Jon C. Tilburt, Megan Allyse, and Frederic W. Hafferty
AMA Journal of Ethics. February 2017, Volume 19, Number 2: 199-206.

Abstract

Dr. Mehmet Oz is widely known not just as a successful media personality donning the title “America’s Doctor®,” but, we suggest, also as a physician visibly out of step with his profession. A recent, unsuccessful attempt to censure Dr. Oz raises the issue of whether the medical profession can effectively self-regulate at all. It also raises concern that the medical profession’s self-regulation might be selectively activated, perhaps only when the subject of professional censure has achieved a level of public visibility. We argue here that the medical profession must look at itself with a healthy dose of self-doubt about whether it has sufficient knowledge of or handle on the less visible Dr. “Ozes” quietly operating under the profession’s presumptive endorsement.

The article is here.

Monday, March 13, 2017

Why Facts Don't Change Our Minds

Elizabeth Kolbert
The New Yorker
Originally published February 27, 2017

Here is an excerpt:

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The article is here.

Tuesday, February 2, 2016

The spreading of misinformation online

M. Del Vicario, A. Bessi, F. Zollo, F. Petroni, A. Scala, G. Caldarelli, H. E. Stanley, and W. Quattrociocchi
Proceedings of the National Academy of Sciences

Abstract

The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories that often elicit rapid, large, but naive social responses such as the recent case of Jade Helm 15, where a simple military exercise turned out to be perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., “echo chambers.” Indeed, homogeneity appears to be the primary driver for the diffusion of contents and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and we show that homogeneity and polarization are the main determinants for predicting cascades’ size.
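
To give a feel for the kind of percolation-style model the abstract describes, here is a minimal toy sketch, not the authors' actual model: a rumor spreads across a random network, and a node adopts it only when its "opinion" is close to the sender's. All parameter names (n_nodes, avg_degree, tolerance, homogeneity) are assumptions made for this illustration; the point is simply that more homogeneous populations, i.e., echo chambers, produce larger cascades.

```python
# Toy rumor-spreading sketch (illustrative only; not the model from the paper).
import random

def make_graph(n_nodes, avg_degree, rng):
    """Build an undirected Erdos-Renyi-style graph as an adjacency list."""
    p = avg_degree / (n_nodes - 1)
    adj = {i: set() for i in range(n_nodes)}
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def cascade_size(adj, opinions, seed, tolerance):
    """Spread a rumor from `seed`; a neighbour adopts it only when its
    opinion differs from the sender's by less than `tolerance`."""
    infected = {seed}
    frontier = [seed]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in infected and abs(opinions[u] - opinions[v]) < tolerance:
                    infected.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(infected)

def average_cascade(homogeneity, n_nodes=300, avg_degree=6, tolerance=0.2,
                    trials=20, seed=0):
    """Higher `homogeneity` -> opinions drawn from a narrower range
    (a stand-in for an echo chamber); return the mean cascade size."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(trials):
        adj = make_graph(n_nodes, avg_degree, rng)
        spread = 1.0 - homogeneity  # opinion spread shrinks as homogeneity rises
        opinions = [rng.uniform(0, spread) for _ in range(n_nodes)]
        sizes.append(cascade_size(adj, opinions, rng.randrange(n_nodes), tolerance))
    return sum(sizes) / len(sizes)

if __name__ == "__main__":
    for h in (0.1, 0.5, 0.9):
        print(f"homogeneity={h:.1f}  mean cascade size={average_cascade(h):.1f}")
```

Run as a script, the sketch prints noticeably larger average cascades as the homogeneity parameter increases, which is the qualitative pattern the abstract attributes to echo chambers.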

The article is here.

Sunday, February 22, 2015

A fault in our design

We tend to think that technological progress is making us more resilient, but it might be making us more vulnerable

By Colin Dickey
Aeon Magazine
Originally published January 23, 2015

Here is an excerpt:

Freed from the constant worry of danger, we tend to forget that there ever was a danger in the first place. We’ve immunised ourselves from the fear of diseases that once plagued us, to the point where they’re now killing us once more. Fuelled by the viral spread of misinformation and paranoia, vaccine use has plummeted in parts of the Western world, leading to a resurgence in viruses. In the US, mortality rates for pertussis (whooping cough) dropped from 1,100 in 1950 to six in 1995, yet in the past decade outbreaks have once again spiked – more than 48,000 cases were reported in 2013, significantly outnumbering the 5,137 cases that were reported back in 1995.

The entire article is here.