Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Self-Deception.

Saturday, May 22, 2021

A normative account of self-deception, overconfidence, and paranoia

Rossi-Goldthorpe, R., Leong, et al.
(2021, April 12).
https://doi.org/10.31234/osf.io/9fkb5

Abstract

Self-deception, paranoia, and overconfidence involve misbeliefs about self, others, and world. They are often considered mistaken. Here we explore whether they might be adaptive, and further, whether they might be explicable in normative Bayesian terms. We administered a difficult perceptual judgment task with and without social influence (suggestions from a cooperating or competing partner). Crucially, the social influence was uninformative. We found that participants heeded the suggestions most under the most uncertain conditions and that they did so with high confidence, particularly if they were more paranoid. Model fitting to participant behavior revealed that their prior beliefs changed depending on whether the partner was a collaborator or competitor; however, those beliefs did not differ as a function of paranoia. Instead, paranoia, self-deception, and overconfidence were associated with participants’ perceived instability of their own performance. These data are consistent with the idea that self-deception, paranoia, and overconfidence flourish under uncertainty, and have their roots in low self-esteem, rather than excessive social concern. The normative model suggests that spurious beliefs can have value – self-deception is irrational yet can facilitate optimal behavior. This occurs even at the expense of monetary rewards, perhaps explaining why self-deception and paranoia contribute to costly decisions which can spark financial crashes and costly wars.

Saturday, March 13, 2021

The Dynamics of Motivated Beliefs

Zimmermann, Florian. 2020.
American Economic Review, 110 (2): 337-61.

Abstract
A key question in the literature on motivated reasoning and self-deception is how motivated beliefs are sustained in the presence of feedback. In this paper, we explore dynamic motivated belief patterns after feedback. We establish that positive feedback has a persistent effect on beliefs. Negative feedback, instead, influences beliefs in the short run, but this effect fades over time. We investigate the mechanisms of this dynamic pattern, and provide evidence for an asymmetry in the recall of feedback. Finally, we establish that, in line with theoretical accounts, incentives for belief accuracy mitigate the role of motivated reasoning.

From the Discussion

In light of the finding that negative feedback has only limited effects on beliefs in the long run, the question arises as to whether people should become entirely delusional about themselves over time. Note that results from the incentive treatments highlight that incentives for recall accuracy bound the degree of self-deception and thereby possibly prevent motivated agents from becoming entirely delusional. Further note that there exists another rather mechanical counterforce, which is that the perception of feedback likely changes as people become more confident. In terms of the experiment, if a subject believes that the chances of ranking in the upper half are mediocre, then that subject will likely perceive two comparisons out of three as positive feedback. If, instead, the same subject is almost certain they rank in the upper half, then that subject will likely perceive the same feedback as rather negative. Note that this “perception effect” is reflected in the Bayesian definition of feedback that we report as a robustness check in the Appendix of the paper. An immediate consequence of this change in perception is that the more confident an agent becomes, the more likely it is that they will obtain negative feedback. Unless an agent does not incorporate negative feedback at all, this should act as a force that bounds people’s delusions.
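The "perception effect" described above can be illustrated with a small numeric sketch. This is not the paper's model – it is a toy illustration, with the function name and the threshold rule (comparing observed wins to the wins the agent's confidence would predict) chosen here for exposition:

```python
def perceived_valence(p_upper_half: float, wins: int, comparisons: int = 3) -> str:
    """Toy illustration: feedback feels positive when observed wins exceed
    what the agent's confidence led them to expect, negative when they fall
    short, and neutral when they match."""
    expected_wins = comparisons * p_upper_half
    if wins > expected_wins:
        return "positive"
    if wins < expected_wins:
        return "negative"
    return "neutral"

# A subject with mediocre confidence perceives 2 wins out of 3 as good news...
print(perceived_valence(0.5, 2))   # expected 1.5, observed 2 -> "positive"
# ...while a near-certain subject perceives the same feedback as bad news.
print(perceived_valence(0.95, 2))  # expected 2.85, observed 2 -> "negative"
```

The sketch captures the counterforce the authors describe: the more confident the agent becomes, the larger the expected-wins benchmark grows, so the same objective feedback is increasingly likely to register as negative.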

Friday, August 21, 2020

Religious Overclaiming and Support for Religious Aggression

Jones, D. N., Neria, A. L., et al.
Social Psychological and Personality Science.
https://doi.org/10.1177/1948550620912880

Abstract

Agentic self-enhancement consists of self-protective and self-advancing tendencies that can lead to aggression, especially when challenged. Because self-enhancers often endorse aggression to defend or enhance the self-concept, religious self-enhancement should lead to endorsing aggression to defend or enhance one’s religion. We recruited three samples (N = 969) from Mechanical Turk (n = 409), Iran (n = 351), and the U.S.–Mexico border region (n = 209). We found that religious (but not secular) self-enhancement in the form of religious overclaiming predicted support for, and willingness to engage in, religious aggression. In contrast, accuracy in religious knowledge had mostly negative associations with aggression-relevant outcomes. These results emerged across two separate religions (Christianity and Islam) and across three different cultures (the United States, Iran, and the U.S.–Mexico border region). Thus, religious overclaiming is a promising new direction for studying support for religious aggression and identifying those who may become aggressive in the name of God.

Conclusion

In sum, individuals who overclaimed religious knowledge (i.e., claim to know fictional religious concepts) supported religious aggression and were more willing to engage in religious aggression. This finding did not emerge for secular overclaiming, nor was it explained through other measures of group aggression. Further, accurate religious and secular knowledge mostly correlated with peaceful tendencies. These results emerged across three studies within different cultures (the United States, Iran, and the U.S.–Mexico border region) and religions (Islam and Christianity). In sum, the present findings have promise for future research on identifying and mitigating factors related to supporting religious aggression.

From a PsyPost interview:

“Overconfidence in what you think God supports or what scripture says is toxic. Thus, humility is a critical feature that is needed to bring out the best and most benevolent aspects of religion,” Jones told PsyPost.

Thursday, February 27, 2020

Liar, Liar, Liar

S. Vedantam, M. Penmann, & T. Boyle
Hidden Brain - NPR.org
Originally posted 17 Feb 20

When we think about dishonesty, we mostly think about the big stuff.

We see big scandals, big lies, and we think to ourselves, I could never do that. We think we're fundamentally different from Bernie Madoff or Tiger Woods.

But behind big lies are a series of small deceptions. Dan Ariely, a professor of psychology and behavioral economics at Duke University, writes about this in his book The Honest Truth about Dishonesty.

"One of the frightening conclusions we have is that what separates honest people from not-honest people is not necessarily character, it's opportunity," he said.

These small lies are quite common. When we lie, it's not always a conscious or rational choice. We want to lie and we want to benefit from our lying, but we want to be able to look in the mirror and see ourselves as good, honest people. We might go a little too fast on the highway, or pocket extra change at a gas station, but we're still mostly honest ... right?

That's why Ariely describes honesty as something of a state of mind. He thinks the IRS should have people sign a pledge committing to be honest when they start working on their taxes, not when they're done. Setting the stage for honesty is more effective than asking someone after the fact whether or not they lied.

The info is here.

There is a 30-minute audio file worth listening to.

Wednesday, February 26, 2020

Zombie Ethics: Don’t Keep These Wrong Ideas About Ethical Leadership Alive

Bruce Weinstein
Forbes.com
Originally posted 18 Feb 20

Here is an excerpt:

Zombie Myth #1: There are no right and wrong answers in ethics

A simple thought experiment should permanently dispel this myth. Think about a time when you were disciplined or punished for something you firmly believed was unfair. Perhaps you were accused at work of doing something you didn’t do. Your supervisor Mike warned you not to do it again, even though you had plenty of evidence that you were innocent. Even though Mike didn’t fire you, your good reputation has been sullied for no good reason.

Suppose you tell your colleague Janice this story, and she responds, “Well, to you Mike’s response was unfair, but from Mike’s point of view, it was absolutely fair.” What would you say to Janice?

A. “You’re right. There are no right or wrong answers in ethics.”

B. “No, Janice. Mike didn’t have a different point of view. He had a mistaken point of view. There are facts at hand, and Mike refused to consider them.”

Perhaps you believed myth #1 before this incident occurred. Now that you’ve been on the receiving end of a true injustice, you see this myth for what it really is: a zombie idea that needs to go to its grave permanently.

Zombie myth #2: Ethics varies from culture to culture and place to place 

It’s tempting to treat this myth as true. For example, bribery is a widely accepted way to do business in many countries. At a speech I gave to commercial pilots, an audience member said that the high-level executives on a recent flight weren’t allowed to disembark until someone “took care of” a customs official. Either they could give him some money under the table and gain entry into the country, or they could leave.

But just because a practice is widely accepted doesn’t mean it is acceptable. That’s why smart businesses prohibit engaging in unfair international business practices, even if it means losing clients.

The info is here.

Monday, December 17, 2018

Am I a Hypocrite? A Philosophical Self-Assessment

John Danaher
Philosophical Disquisitions
Originally published November 9, 2018

Here are two excerpts:

The common view among philosophers is that hypocrisy is a moral failing. Indeed, it is often viewed as one of the worst moral failings. Why is this? Christine McKinnon’s article ‘Hypocrisy, with a Note on Integrity’ provides a good, clear defence of this view. The article itself is a classic exercise in analytical philosophical psychology. It tries to clarify the structure of hypocrisy and explain why we should take it so seriously. It does so by arguing that there are certain behaviours, desires and dispositions that are the hallmark of the hypocrite and that these behaviours, desires and dispositions undermine our system of social norms.

McKinnon makes this case by considering some paradigmatic instances of hypocrisy, and identifying the necessary and sufficient conditions that allow us to label these as instances of hypocrisy. My opening example of my email behaviour probably fits this paradigmatic mode — despite my protestations to the contrary. A better example, however, might be religious hypocrisy. There have been many well-documented historical cases of this, but let’s not focus on these. Let’s instead imagine a case that closely parallels these historical examples. Suppose there is a devout fundamentalist Christian preacher. He regularly preaches about the evils of homosexuality and secularism and professes to be heterosexual and devout. He calls upon parents to disown their homosexual children or to subject them to ‘conversion therapy’. Then, one day, this preacher is discovered to be homosexual himself. Not just that, it turns out he has a long-term male partner that he has kept hidden from the public for over 20 years, and that they were recently married in a non-religious humanist ceremony.

(cut)

In other words, what I refer to as my own hypocrisy seems to involve a good deal of self-deception and self-manipulation, not (just) the manipulation of others. That’s why I was relieved to read Michael Statman’s article on ‘Hypocrisy and Self-Deception’. Statman wants to get away from the idea of the hypocrite as a moral cartoon character. Real people are way more interesting than that. As he sees it, the morally vicious form of hypocrisy that is the focus of McKinnon’s ire tends to overlap with and blur into self-deception much more frequently than she allows. The two things are not strongly dichotomous. Indeed, people can slide back and forth between them with relative ease: the self-deceived can slide into hypocrisy and the hypocrite can slide into self-deception.

Although I am attracted to this view, Statman points out that it is a tough sell. 

Wednesday, December 13, 2017

Authenticity and Modernity

Andrew Bowie
iainews.iai.tv
Originally published November 6, 2017

Here are two excerpts:

As soon as there is a division in the self, of the kind generated by seeking self-knowledge, attributes like authenticity become a problem. The idea of anyone claiming ‘I am an authentic person’ involves a kind of self-observation that destroys what it seeks to affirm. This situation poses important questions about knowledge. If authenticity is destroyed by the subject thinking it knows that it is authentic, there seem to be ways of being which may be valuable because they transcend our ability to know them. As we shall see in a moment, this idea may help explain why art takes on new significance in modernity.

Despite these difficulties, the notion of authenticity has not disappeared from social discourse, which suggests it answers to a need to articulate something, even as that articulation seems to negate it. The problem with the notion as applied to individuals lies, then, in modern conflicts about the nature of the subject, where Marx, Nietzsche, Freud, and many others, put in question, in the manner already suggested by Schelling, the extent to which people can be transparent to themselves. Is what I am doing a true expression of myself, or is it the result of social conditioning, self-deception, the unconscious?

(cut)

The early uses of ‘sincere’ and ‘authentic’ had applied both to objects and people, but the moralising of the terms in the wake of the new senses of the self/subject that emerge in the modern era meant the terms came to apply predominantly to assessments of people. The more recent application of ‘authentic’ to watches, iPhones, trainers, etc., thus to objects which rely not least on their status as ‘brands’, can therefore be read as part of what Georg Lukács termed ‘reification’. Relations to objects can start to distort relations between people, giving the value of the ‘brand’ object primacy over that of other subjects. The figures here may be open to question, but the phenomenon seems to be real. The point is that this particular kind of violent theft is linked to the way objects are promoted as ‘authentic’ in the market, rather than just to either their monetary- or use-value.

The article is here.

Wednesday, July 26, 2017

Everybody lies: how Google search reveals our darkest secrets

Seth Stephens-Davidowitz
The Guardian
Originally published July 9, 2017

Everybody lies. People lie about how many drinks they had on the way home. They lie about how often they go to the gym, how much those new shoes cost, whether they read that book. They call in sick when they’re not. They say they’ll be in touch when they won’t. They say it’s not about you when it is. They say they love you when they don’t. They say they’re happy while in the dumps. They say they like women when they really like men. People lie to friends. They lie to bosses. They lie to kids. They lie to parents. They lie to doctors. They lie to husbands. They lie to wives. They lie to themselves. And they damn sure lie to surveys. Here’s my brief survey for you:

Have you ever cheated in an exam?

Have you ever fantasised about killing someone?

Were you tempted to lie?

Many people underreport embarrassing behaviours and thoughts on surveys. They want to look good, even though most surveys are anonymous. This is called social desirability bias. An important paper in 1950 provided powerful evidence of how surveys can fall victim to such bias. Researchers collected data, from official sources, on the residents of Denver: what percentage of them voted, gave to charity, and owned a library card. They then surveyed the residents to see if the percentages would match. The results were, at the time, shocking. What the residents reported to the surveys was very different from the data the researchers had gathered. Even though nobody gave their names, people, in large numbers, exaggerated their voter registration status, voting behaviour, and charitable giving.

The article is here.

Wednesday, April 26, 2017

Living a lie: We deceive ourselves to better deceive others

Matthew Hutson
Scientific American
Originally posted April 8, 2017

People mislead themselves all day long. We tell ourselves we’re smarter and better looking than our friends, that our political party can do no wrong, that we’re too busy to help a colleague. In 1976, in the foreword to Richard Dawkins’s “The Selfish Gene,” the biologist Robert Trivers floated a novel explanation for such self-serving biases: We dupe ourselves in order to deceive others, creating social advantage. Now after four decades Trivers and his colleagues have published the first research supporting his idea.

Psychologists have identified several ways of fooling ourselves: biased information-gathering, biased reasoning and biased recollections. The new work, forthcoming in the Journal of Economic Psychology, focuses on the first — the way we seek information that supports what we want to believe and avoid that which does not.

The article is here.

Tuesday, July 5, 2016

How scientists fool themselves – and how they can stop

Regina Nuzzo
Nature 526, 182–185 (08 October 2015)
doi:10.1038/526182a

Here is an excerpt:

This is the big problem in science that no one is talking about: even an honest person is a master of self-deception. Our brains evolved long ago on the African savannah, where jumping to plausible conclusions about the location of ripe fruit or the presence of a predator was a matter of survival. But a smart strategy for evading lions does not necessarily translate well to a modern laboratory, where tenure may be riding on the analysis of terabytes of multidimensional data. In today's environment, our talent for jumping to conclusions makes it all too easy to find false patterns in randomness, to ignore alternative explanations for a result or to accept 'reasonable' outcomes without question — that is, to ceaselessly lead ourselves astray without realizing it.

Failure to understand our own biases has helped to create a crisis of confidence about the reproducibility of published results, says statistician John Ioannidis, co-director of the Meta-Research Innovation Center at Stanford University in Palo Alto, California. The issue goes well beyond cases of fraud. Earlier this year, a large project that attempted to replicate 100 psychology studies managed to reproduce only slightly more than one-third. In 2012, researchers at biotechnology firm Amgen in Thousand Oaks, California, reported that they could replicate only 6 out of 53 landmark studies in oncology and haematology. And in 2009, Ioannidis and his colleagues described how they had been able to fully reproduce only 2 out of 18 microarray-based gene-expression studies.

The article is here.

Editor's note: These biases also apply to clinicians who use research or their own theories about how and why psychotherapy works.

Tuesday, February 2, 2016

What Makes Us Cheat? Experiment 2

by Simon Oxenham
BigThink
Originally published January 13, 2016

Dan Ariely, the psychologist who popularised behavioral economics, has made a fascinating documentary exploring what makes us dishonest. I’ve just finished watching it and it’s something of a masterpiece of psychological storytelling, delving deep into contemporary tales of dishonesty, and supporting its narrative with cunningly designed experiments that have been neatly reconstructed for the film camera.

Self-Deception

The article is here.

Monday, August 31, 2015

The What and Why of Self-Deception

Zoë Chance and Michael I. Norton
Current Opinion in Psychology
Available online 3 August 2015

Scholars from many disciplines have investigated self-deception, but both defining self-deception and establishing its possible benefits have been a matter of heated debate – a debate impoverished by a relative lack of empirical research. Drawing on recent research, we first classify three distinct definitions of self-deception, ranging from a view that self-deception is synonymous with positive illusions to a more stringent view that self-deception requires the presence of simultaneous conflicting beliefs. We then review recent research on the possible benefits of self-deception, identifying three adaptive functions: deceiving others, social status, and psychological benefits. We suggest potential directions for future research.

The nature and definition of self-deception remain open to debate. Philosophers have questioned whether – and how – self-deception is possible; evolutionary theorists have conjectured that self-deception may – or must – be adaptive. Until recently, there was little evidence for either the existence or processes of self-deception; indeed, Robert Trivers wrote that research on self-deception is still in its infancy. In recent years, however, empirical research on self-deception has been gaining traction in social psychology and economics, providing much-needed evidence and shedding light on the psychology of self-deception. We first classify competing definitions of self-deception, then review recent research supporting three distinct advantages of self-deception: improved success in deceiving others, social status, and psychological benefits.

The entire article is here.

Note to Psychologists: Psychologists engage in self-deception in psychotherapy. Psychologists typically judge psychotherapy sessions as having been more beneficial than patients do. Self-deception may lead to missteps and errors in judgment, both clinical and ethical.

Friday, April 3, 2015

Ethical Breakdowns

Max H. Bazerman and Ann E. Tenbrunsel
Harvard Business Review
Originally published in April 2011

Here is an excerpt:

Motivated Blindness

It’s well documented that people see what they want to see and easily miss contradictory information when it’s in their interest to remain ignorant—a psychological phenomenon known as motivated blindness. This bias applies dramatically with respect to unethical behavior. At Ford the senior-most executives involved in the decision to rush the flawed Pinto into production not only seemed unable to clearly see the ethical dimensions of their own decision but failed to recognize the unethical behavior of the subordinates who implemented it.

Let’s return to the 2008 financial collapse, in which motivated blindness contributed to some bad decision making. The “independent” credit rating agencies that famously gave AAA ratings to collateralized mortgage securities of demonstrably low quality helped build a house of cards that ultimately came crashing down, driving a wave of foreclosures that pushed thousands of people out of their homes. Why did the agencies vouch for those risky securities?

Part of the answer lies in powerful conflicts of interest that helped blind them to their own unethical behavior and that of the companies they rated. The agencies’ purpose is to provide stakeholders with an objective determination of the creditworthiness of financial institutions and the debt instruments they sell.

Monday, March 10, 2014

The Lies That Doctors and Patients Tell

By Sandeep Jauhar
The New York Times
Originally published February 20, 2014

Here is an excerpt:


Physicians sometimes deceive, too. We don’t always reveal when we make mistakes. Too often we order unnecessary tests, to bolster revenue or to protect against lawsuits. We sometimes mislead patients that our therapies have more value, more evidence behind them, than they actually do — whether it was placebo injections from my grandfather’s era, for example, or much of the spinal surgery or angioplasty that’s done today. 

Perhaps the most powerful deceptions in medicine are the ones we direct at ourselves — at our patients’ expense. Many physicians still espouse the patriotic (but deeply misconceived) notion that the American medical system is the best in the world. We deny the sickness in our system, and the role we as a profession have played in creating that sickness. We obsessively push ourselves to do more and more tests, scans and treatments for reasons that we sometimes hide from ourselves. 

The entire article is here.