Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Irrational Beliefs.

Thursday, July 14, 2022

What nudge theory got wrong

Tim Harford
The Financial Times
Originally posted 

Here is an excerpt:

Chater and Loewenstein argue that behavioural scientists naturally fall into the habit of seeing problems in the same way. Why don’t people have enough retirement savings? Because they are impatient and find it hard to save rather than spend. Why are so many greenhouse gases being emitted? Because it’s complex and tedious to switch to a green electricity tariff. If your problem is basically that fallible individuals are making bad choices, behavioural science is an excellent solution.

If, however, the real problem is not individual but systemic, then nudges are at best limited, and at worst, a harmful diversion. Historians such as Finis Dunaway now argue that the Crying Indian campaign was a deliberate attempt by corporate interests to change the subject. Is behavioural public policy, accidentally or deliberately, a similar distraction?

A look at climate change policy suggests it might be. Behavioural scientists themselves are clear enough that nudging is no real substitute for a carbon price — Thaler and Sunstein say as much in Nudge. Politicians, by contrast, have preferred to bypass the carbon price and move straight to the pain-free nudging.

Nudge enthusiast David Cameron, in a speech given shortly before he became prime minister, declared that “the best way to get someone to cut their electricity bill” was to cleverly reformat the bill itself. This is politics as the art of avoiding difficult decisions. No behavioural scientist would suggest that it was close to sufficient. Yet they must be careful not to become enablers of the One Weird Trick approach to making policy.

-------

Behavioural science has a laudable focus on rigorous evidence, yet even this can backfire. It is much easier to produce a quick randomised trial of bill reformatting than it is to evaluate anything systemic. These small quick wins are only worth having if they lead us towards, rather than away from, more difficult victories.

Another problem is that empirically tested, behaviourally rigorous bad policy can be bad policy nonetheless. For example, it has become fashionable to argue that people should be placed on an organ donor registry by default, because this dramatically expands the number of people registered as donors. But, as Thaler and Sunstein themselves keep having to explain, this is a bad idea. Most organ donation happens only after consultation with a grieving family — and default-bloated donor registries do not help families work out what their loved one might have wanted.


Monday, March 22, 2021

The Mistrust of Science

Atul Gawande
The New Yorker
Originally posted 01 June 2016

Here is an excerpt:

The scientific orientation has proved immensely powerful. It has allowed us to nearly double our lifespan during the past century, to increase our global abundance, and to deepen our understanding of the nature of the universe. Yet scientific knowledge is not necessarily trusted. Partly, that’s because it is incomplete. But even where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is).

Vaccine fears, for example, have persisted despite decades of research showing them to be unfounded. Some twenty-five years ago, a statistical analysis suggested a possible association between autism and thimerosal, a preservative used in vaccines to prevent bacterial contamination. The analysis turned out to be flawed, but fears took hold. Scientists then carried out hundreds of studies, and found no link. Still, fears persisted. Countries removed the preservative but experienced no reduction in autism—yet fears grew. A British study claimed a connection between the onset of autism in eight children and the timing of their vaccinations for measles, mumps, and rubella. That paper was retracted due to findings of fraud: the lead author had falsified and misrepresented the data on the children. Repeated efforts to confirm the findings were unsuccessful. Nonetheless, vaccine rates plunged, leading to outbreaks of measles and mumps that, last year, sickened tens of thousands of children across the U.S., Canada, and Europe, and resulted in deaths.

People are prone to resist scientific claims when they clash with intuitive beliefs. They don’t see measles or mumps around anymore. They do see children with autism. And they see a mom who says, “My child was perfectly fine until he got a vaccine and became autistic.”

Now, you can tell them that correlation is not causation. You can say that children get a vaccine every two to three months for the first couple years of their life, so the onset of any illness is bound to follow vaccination for many kids. You can say that the science shows no connection. But once an idea has got embedded and become widespread, it becomes very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities.


5 years old, and still relevant.

Sunday, January 13, 2019

The bad news on human nature, in 10 findings from psychology

Christian Jarrett
aeon.co
Originally published 

Here is an excerpt:

We are vain and overconfident. Our irrationality and dogmatism might not be so bad were they married to some humility and self-insight, but most of us walk about with inflated views of our abilities and qualities, such as our driving skills, intelligence and attractiveness – a phenomenon that’s been dubbed the Lake Wobegon Effect after the fictional town where ‘all the women are strong, all the men are good-looking, and all the children are above average’. Ironically, the least skilled among us are the most prone to overconfidence (the so-called Dunning-Kruger effect). This vain self-enhancement seems to be most extreme and irrational in the case of our morality, such as in how principled and fair we think we are. In fact, even jailed criminals think they are kinder, more trustworthy and honest than the average member of the public.

We are moral hypocrites. It pays to be wary of those who are the quickest and loudest in condemning the moral failings of others – the chances are that moral preachers are as guilty themselves, but take a far lighter view of their own transgressions. In one study, researchers found that people rated the exact same selfish behaviour (giving themselves the quicker and easier of two experimental tasks on offer) as being far less fair when perpetrated by others. Similarly, there is a long-studied phenomenon known as actor-observer asymmetry, which in part describes our tendency to attribute other people’s bad deeds, such as our partner’s infidelities, to their character, while attributing the same deeds performed by ourselves to the situation at hand. These self-serving double standards could even explain the common feeling that incivility is on the increase – recent research shows that we view the same acts of rudeness far more harshly when they are committed by strangers than by our friends or ourselves.


Tuesday, March 20, 2018

Why Partisanship Is Such a Worthy Foe of Objective Truth

Charlotte Hu
Discover Magazine
Originally published February 20, 2018

Here is an excerpt:

Take, for example, an experiment that demonstrated party affiliation affected people’s perception of a protest video. When participants felt the video depicted liberally minded protesters, Republicans were more in favor of police intervention than Democrats. The opposite emerged when participants thought the video showed a conservative protest. The visual information was identical, but people drew vastly different conclusions that were shaded by their political group affiliation.

“People are more likely to behave in and experience emotions in ways that are congruent with the activated social identity,” says Bavel. In other words, people will go along with the group, even if the ideas oppose their own ideologies—belonging may have more value than facts.

The situation is exacerbated by social media, which creates echo chambers on both the left and the right. In these concentric social networks, the same news articles are circulated, validating the beliefs of the group and strengthening members’ identification with it.

The article is here.

Friday, November 17, 2017

The Illusion of Moral Superiority

Ben M. Tappin and Ryan T. McKay
Social Psychological and Personality Science
Volume 8, Issue 6, pp. 623–631
Issue published: August 1, 2017

Abstract

Most people strongly believe they are just, virtuous, and moral; yet regard the average person as distinctly less so. This invites accusations of irrationality in moral judgment and perception—but direct evidence of irrationality is absent. Here, we quantify this irrationality and compare it against the irrationality in other domains of positive self-evaluation. Participants (N = 270) judged themselves and the average person on traits reflecting the core dimensions of social perception: morality, agency, and sociability. Adapting new methods, we reveal that virtually all individuals irrationally inflated their moral qualities, and the absolute and relative magnitude of this irrationality was greater than that in the other domains of positive self-evaluation. Inconsistent with prevailing theories of overly positive self-belief, irrational moral superiority was not associated with self-esteem. Taken together, these findings suggest that moral superiority is a uniquely strong and prevalent form of “positive illusion,” but the underlying function remains unknown.

The article is here.

Thursday, November 9, 2017

Morality and Machines

Robert Fry
Prospect
Originally published October 23, 2017

Here is an excerpt:

It is axiomatic that robots are more mechanically efficient than humans; equally they are not burdened with a sense of self-preservation, nor is their judgment clouded by fear or hysteria. But it is that very human fallibility that requires the intervention of the defining human characteristic—a moral sense that separates right from wrong—and explains why the ethical implications of the autonomous battlefield are so much more contentious than the physical consequences. Indeed, an open letter in 2015 seeking to separate AI from military application included the signatures of such luminaries as Elon Musk, Steve Wozniak, Stephen Hawking and Noam Chomsky. For the first time, therefore, human agency may be necessary on the battlefield not to take the vital tactical decisions but to weigh the vital moral ones.

So, who will accept these new responsibilities and how will they be prepared for the task? The first point to make is that none of this is an immediate prospect and it may be that AI becomes such a ubiquitous and beneficial feature of other fields of human endeavour that we will no longer fear its application in warfare. It may also be that morality will co-evolve with technology. Either way, the traditional military skills of physical stamina and resilience will be of little use when machines will have an infinite capacity for physical endurance. Nor will the quintessential commander’s skill of judging tactical advantage have much value when cognitive computing will instantaneously integrate sensor information. The key human input will be to make the judgments that link moral responsibility to legal consequence.

The article is here.

Friday, April 28, 2017

How rational is our rationality?

Interview by Richard Marshall
3 AM Magazine
Originally posted March 18, 2017

Here is an excerpt:

As I mentioned earlier, I think that the point of the study of rationality, and of normative epistemology more generally, is to help us figure out how to inquire, and the aim of inquiry, I believe, is to get at the truth. This means that there had better be a close connection between what we conclude about what’s rational to believe, and what we expect to be true. But it turns out to be very tricky to say what the nature of this connection is! For example, we know that sometimes evidence can mislead us, and so rational beliefs can be false. This means that there’s no guarantee that rational beliefs will be true. The goal of the paper is to get clear about why, and to what extent, it nonetheless makes sense to expect that rational beliefs will be more accurate than irrational ones. One reason this should be of interest to non-philosophers is that if it turns out that there isn’t some close connection between rationality and truth, then we should be much less critical of people with irrational beliefs. They may reasonably say: “Sure, my belief is irrational – but I care about the truth, and since my irrational belief is true, I won’t abandon it!” It seems like there’s something wrong with this stance, but to justify why it’s wrong, we need to get clear on the connection between a judgment about a belief’s rationality and a judgment about its truth. The account I give is difficult to summarize in just a few sentences, but I can say this much: what we say about the connection between what’s rational and what’s true will depend on whether we think it’s rational to doubt our own rationality. If it can be rational to doubt our own rationality (which I think is plausible), then the connection between rationality and truth is, in a sense, surprisingly tenuous.

The interview is here.

Sunday, February 19, 2017

Most People Consider Themselves to Be Morally Superior

By Cindi May
Scientific American
Originally published on January 31, 2017

Here are two excerpts:

This self-enhancement effect is most profound for moral characteristics. While we generally cast ourselves in a positive light relative to our peers, above all else we believe that we are more just, more trustworthy, more moral than others. This self-righteousness can be destructive because it reduces our willingness to cooperate or compromise, creates distance between ourselves and others, and can lead to intolerance or even violence. Feelings of moral superiority may play a role in political discord, social conflict, and even terrorism.

(cut)

So we believe ourselves to be more moral than others, and we make these judgments irrationally. What are the consequences? On the plus side, feelings of moral superiority could, in theory, protect our well-being. For example, there is danger in mistakenly believing that people are more trustworthy or loyal than they really are, and approaching others with moral skepticism may reduce the likelihood that we fall prey to a liar or a cheat. On the other hand, self-enhanced moral superiority could erode our own ethical behavior. Evidence from related studies suggests that self-perceptions of morality may “license” future immoral actions.

The article is here.

Saturday, January 7, 2017

The Irrationality Within Us

By Elly Vintiadis
Scientific American blog
Originally published on December 12, 2016

We like to think of ourselves as special because we can reason and we like to think that this ability expresses the essence of what it is to be human. In many ways this belief has formed our civilization; throughout history, we have used supposed differences in rationality to justify moral and political distinctions between different races, genders, and species, as well as between “healthy” and “diseased” individuals. Even to this day, people often associate mental disorder with irrationality and this has very real effects on people living with mental disorders.

But are we really that rational? And is rationality really what distinguishes people who live with mental illness from those who do not? It seems not. After decades of research, there is compelling evidence that we are not as rational as we think we are and that, rather than irrationality being the exception, it is part of who we normally are.

So what does it mean to be rational? We usually distinguish between two kinds of rationality: epistemic rationality, which is involved in acquiring true beliefs about the world and which sets the standard for what we ought to believe, and instrumental rationality, which is involved in decision-making and behavior and is the standard for how we ought to act.

The article is here.

Tuesday, November 29, 2016

Everyone Thinks They’re More Moral Than Everyone Else

By Cari Romm
New York Magazine - The Science of Us
Originally posted November 15, 2016

There’s been a lot of talk over the past week about the “filter bubble” — the ideological cocoon that each of us inhabits, blinding us to opposing views. As my colleague Drake wrote the day after the election, the filter bubble is why so many people were so blindsided by Donald Trump’s win: They only saw, and only read, stories assuming that it wouldn’t happen.

Our filter bubbles are defined by the people and ideas we choose to surround ourselves with, but each of us also lives in a one-person bubble of sorts, viewing the world through our own distorted sense of self. The way we view ourselves in relation to others is a constant tug-of-war between two opposing forces: On one end of the spectrum is something called illusory superiority, a psychological quirk in which we tend to assume that we’re better than average — past research has found it to be true in people estimating their own driving skills, parents’ perceived ability to catch their kid in a lie, even cancer patients’ estimates of their own prognoses. And on the other end of the spectrum, there’s “social projection,” or the assumption that other people share your abilities or beliefs.

Sunday, November 27, 2016

Approach-Induced Biases in Human Information Sampling

Laurence T. Hunt and others
PLOS Biology
Published: November 10, 2016

Abstract

Information sampling is often biased towards seeking evidence that confirms one’s prior beliefs. Despite such biases being a pervasive feature of human behavior, their underlying causes remain unclear. Many accounts of these biases appeal to limitations of human hypothesis testing and cognition, de facto evoking notions of bounded rationality, but neglect more basic aspects of behavioral control. Here, we investigated a potential role for Pavlovian approach in biasing which information humans will choose to sample. We collected a large novel dataset from 32,445 human subjects, making over 3 million decisions, who played a gambling task designed to measure the latent causes and extent of information-sampling biases. We identified three novel approach-related biases, formalized by comparing subject behavior to a dynamic programming model of optimal information gathering. These biases reflected the amount of information sampled (“positive evidence approach”), the selection of which information to sample (“sampling the favorite”), and the interaction between information sampling and subsequent choices (“rejecting unsampled options”). The prevalence of all three biases was related to a Pavlovian approach-avoid parameter quantified within an entirely independent economic decision task. Our large dataset also revealed that individual differences in the amount of information gathered are a stable trait across multiple gameplays and can be related to demographic measures, including age and educational attainment. As well as revealing limitations in cognitive processing, our findings suggest information sampling biases reflect the expression of primitive, yet potentially ecologically adaptive, behavioral repertoires. One such behavior is sampling from options that will eventually be chosen, even when other sources of information are more pertinent for guiding future action.

The article is here.

Tuesday, November 8, 2016

The Illusion of Moral Superiority

Ben M. Tappin and Ryan T. McKay
Social Psychological and Personality Science
2016, 1-9

Abstract

Most people strongly believe they are just, virtuous, and moral; yet regard the average person as distinctly less so. This invites accusations of irrationality in moral judgment and perception—but direct evidence of irrationality is absent. Here, we quantify this irrationality and compare it against the irrationality in other domains of positive self-evaluation. Participants (N = 270) judged themselves and the average person on traits reflecting the core dimensions of social perception: morality, agency, and sociability. Adapting new methods, we reveal that virtually all individuals irrationally inflated their moral qualities, and the absolute and relative magnitude of this irrationality was greater than that in the other domains of positive self-evaluation. Inconsistent with prevailing theories of overly positive self-belief, irrational moral superiority was not associated with self-esteem. Taken together, these findings suggest that moral superiority is a uniquely strong and prevalent form of “positive illusion,” but the underlying function remains unknown.

The article is here.

Saturday, October 8, 2016

The Irrational Idea That Humans Are Mostly Irrational

Paul Bloom
The Atlantic
Originally posted September 16, 2016

Last summer I was at a moral psychology conference in Chile, listening to speaker after speaker discuss research into how people think about sexuality, crime, taxation, and other politically and socially fraught issues. The consensus was that human moral reasoning is a mess—irrational, contradictory, and incoherent.

And how could it be otherwise? The evolutionary psychologists in the room argued that our propensity to reason about right and wrong arises through social adaptations calibrated to enhance our survival and reproduction, not to arrive at consistent or objective truth. And according to the social psychologists, we are continually swayed by irrelevant factors, by gut feelings and unconscious motivations. As the primatologist Frans de Waal once put it, summing up the psychological consensus: “We celebrate rationality, but when push comes to shove we assign it little weight.”

I think that this is mistaken. Yes, our moral capacities are far from perfect. But—as I’ve argued elsewhere, including in my forthcoming book on empathy—we are often capable of objective moral reasoning. And so we can arrive at novel, sometimes uncomfortable, moral positions, as when men appreciate the wrongness of sexism or when people who really like the taste of meat decide that it’s better to go without.

The article is here.

Monday, September 19, 2016

The Neuroscience Behind Bad Decisions

By Emily Singer
Quanta Magazine
Originally posted August 23, 2016

Here is an excerpt:

Glimcher is using both the brain and behavior to try to explain our irrationality. He has combined results from studies like the candy bar experiment with neuroscience data — measurements of electrical activity in the brains of animals as they make decisions — to develop a theory of how we make decisions and why that can lead to mistakes.

Glimcher has been one of the driving forces in the still young field of neuroeconomics. His theory merges far-reaching research in brain activity, neuronal networks, fMRI and human behavior. “He’s famous for arguing that neuroscience and economics should be brought together,” said Nathaniel Daw, a neuroscientist at Princeton University. One of Glimcher’s most important contributions, Daw said, has been figuring out how to quantify abstract notions such as value and study them in the lab.

In a new working paper, Glimcher and his co-authors — Kenway Louie, also of NYU, and Ryan Webb of the University of Toronto — argue that their neuroscience-based model outperforms standard economic theory at explaining how people behave when faced with lots of choices. “The neural model, described in biology and tested in neurons, works well to describe something economists couldn’t explain,” Glimcher said.

The article is here.

Saturday, May 14, 2016

On the Source of Human Irrationality

Oaksford, Mike et al.
Trends in Cognitive Sciences, Volume 20, Issue 5, pp. 336–344

Summary

Reasoning and decision making are error-prone. This is often attributed to a fast, phylogenetically old System 1. It is striking, however, that perceptuo-motor decision making in humans and animals is rational. These results are consistent with perceptuo-motor strategies emerging in Bayesian brain theory that also appear in human data selection. People seem to have access, although limited, to unconscious generative models that can generalise to explain other verbal reasoning results. Error does not emerge predominantly from System 1, but rather seems to emerge from the later-evolved System 2 that involves working memory and language. However, language also sows the seeds of error correction by moving reasoning into the social domain. This reversal of roles suggests key areas of theoretical integration and new empirical directions.

Trends

System 1 is supposedly the main cause of human irrationality. However, recent work on animal decision making, human perceptuo-motor decision making, and logical intuitions shows that this phylogenetically older system is rational.

Bayesian brain theory has recently proposed perceptuo-motor strategies identical to strategies proposed in Bayesian approaches to conscious verbal reasoning, suggesting that similar generative models are available at both levels.

Recent approaches to conditional inference using causal Bayes nets confirm this account, which can also generalise to logical intuitions.

People have only imperfect access to System 1. Errors arise from inadequate interrogation of System 1, working memory limitations, and mis-description of our records of these interrogations. However, there is evidence that such errors may be corrected by moving reasoning to the social domain facilitated by language.

The article is here.

Saturday, July 19, 2014

SciCafe: The Evolution of Irrationality

SciCafe
Originally posted April 2014

Laurie Santos presented her research on the evolution of irrationality and insights from primates. Don't worry if you missed it: we have a video of her presentation, including clips of monkeys "shopping" for treats! Santos explores the roots of human irrationality by watching our primate relatives make decisions in "monkeynomics."


Tuesday, June 10, 2014

I Don't Want to Be Right

By Maria Konnikova
The New Yorker
Originally published May 19, 2014

Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.

The result was dramatic: a whole lot of nothing. None of the interventions worked.

The entire article is here.

Friday, January 24, 2014

The Problem of Evil

Sally Haslanger
Professor of Philosophy
Massachusetts Institute of Technology

Sally discusses a classic argument that God does not exist, called 'The Problem of Evil'. Along the way, she distinguishes different ways in which people believe that God exists, and discusses what's bad about having contradictory beliefs.


Wednesday, December 25, 2013

Judge gives probation to teen who killed four in DWI crash citing 'affluenza'

By Jim Douglas
KHOU - Houston Texas
Originally posted December 10, 2013

Here are two excerpts:

Prior to sentencing, a psychologist called by the defense, Dr. G. Dick Miller, testified that Couch's life could be salvaged with one to two years' treatment and no contact with his parents.

(cut)

Miller said Couch's parents gave him "freedoms no young person should have." He called Couch a product of "affluenza," where his family felt that wealth bought privilege and there was no rational link between behavior and consequences.

He said Couch got whatever he wanted. As an example, Miller said Couch's parents gave no punishment after police ticketed the then-15-year-old when he was found in a parked pickup with a passed-out, undressed 14-year-old girl.

The entire story is here.

Wednesday, December 18, 2013

Texas pair released after serving 21 years for 'satanic abuse'

Dan and Fran Keller, sentenced in 1991 for child sexual assault during US 'Satanic panic' era, released after district attorney conceded trial jury was probably swayed by faulty testimony

By Tom Dart
The Guardian
Originally posted December 5, 2013

Here are two excerpts:

The only physical evidence against the Kellers was the testimony of Dr. Michael Mouw, who examined the girl in the emergency room of a local hospital after the therapy session and said he found tears in her hymen that potentially indicated that she was molested.

Mouw signed an affidavit last January in which he affirms that he now realises his inexperience led him to a conclusion that "is not scientifically or medically valid, and that I was mistaken."

In an appeal filed on behalf of Fran Keller earlier this year, her lawyer, Keith Hampton, also argued that the state presented misleading evidence about the cemetery, relied on a false witness confession and the testimony of a "quack" satanic abuse "expert", and that suggestive interview techniques had encouraged the children to make "fantastical false statements".

(cut)

DeYoung said that suggestive and insistent interviewing strategies prompted children to make up stories and start to believe what they were telling the adults, and that the received wisdom was that children would not lie about such serious crimes. Media and parental pressure obliged the police to give credence even to risible allegations.

The entire story is here.

There is an interesting 1988 Geraldo Rivera television special, Exposing Satan's Underground, associated with this story; it can be found on YouTube. The entire episode is worth watching if you are interested in the hysteria and panic of that time. At the 1 hour and 15 minute mark, psychologists and psychiatrists report threats to their lives when treating survivors of ritualistic abuse.