Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Lying.

Tuesday, March 5, 2024

You could lie to a health chatbot – but it might change how you perceive yourself

Dominic Wilkinson
The Conversation
Originally posted 8 February 2024

Here is an excerpt:

The ethics of lying

There are different ways that we can think about the ethics of lying.

Lying can be bad because it causes harm to other people. Lies can be deeply hurtful to another person. They can cause someone to act on false information, or to be falsely reassured.

Sometimes, lies can harm because they undermine someone else’s trust in people more generally. But those reasons will often not apply to the chatbot.

Lies can wrong another person, even if they do not cause harm. If we willingly deceive another person, we potentially fail to respect their rational agency, or use them as a means to an end. But it is not clear that we can deceive or wrong a chatbot, since they don’t have a mind or ability to reason.

Lying can be bad for us because it undermines our credibility. Communication with other people is important. But when we knowingly make false utterances, we diminish the value, in other people’s eyes, of our testimony.

For the person who repeatedly expresses falsehoods, everything that they say then falls into question. This is part of the reason we care about lying and our social image. But unless our interactions with the chatbot are recorded and communicated (for example, to humans), our chatbot lies aren’t going to have that effect.

Lying is also bad for us because it can lead to others being untruthful to us in turn. (Why should people be honest with us if we won’t be honest with them?)

But again, that is unlikely to be a consequence of lying to a chatbot. On the contrary, this type of effect could be partly an incentive to lie to a chatbot, since people may be conscious of the reported tendency of ChatGPT and similar agents to confabulate.


Here is my summary:

The article discusses the potential consequences of lying to a health chatbot, even though it might seem tempting. It highlights a situation where someone frustrated with a wait for surgery considers exaggerating their symptoms to a chatbot screening them.

While lying might offer short-term benefits like quicker attention, the author argues it could have unintended consequences:

Impact on healthcare:
  • Inaccurate information can hinder proper diagnosis and treatment.
  • It contributes to an already strained healthcare system.
Self-perception:
  • Repeatedly lying, even to a machine, can erode honesty and integrity.
  • It reinforces unhealthy avoidance of seeking professional help.
The article encourages readers to be truthful with chatbots for better healthcare outcomes and self-awareness. It acknowledges the frustration with healthcare systems but emphasizes the importance of transparency for both individual and collective well-being.

Sunday, April 9, 2023

Clarence Thomas Has Reportedly Been Accepting Gifts From Republican Megadonor Harlan Crow For Decades—And Never Disclosed It

Alison Durkee
Forbes.com
Originally posted 6 April 2023

Supreme Court Justice Clarence Thomas has been accepting trips from Republican megadonor Harlan Crow for more than 20 years without disclosing them as required, ProPublica reports—including trips on private jets and yachts that could run afoul of the law—the latest in a series of ethical scandals the conservative justice has faced amid calls for justices to follow an ethics code.

Key Facts
  • Thomas has repeatedly used Crow’s private jet for travel and vacationed with him, including on his superyacht and at Crow’s private resort in the Adirondacks, where guests stay for free, ProPublica reports, citing flight records, internal documents and interviews with Crow’s employees.
  • The justice has stayed at Crow’s resort “every summer for more than two decades,” according to ProPublica, and reportedly makes “regular use” of Crow’s private jet, including as recently as last year and for as short as a three-hour trip from Washington, D.C., to Connecticut in 2016.
  • While Supreme Court justices are not bound to the same code of ethics as lower federal court judges are, they do submit financial disclosures and are subject to laws that require disclosing gifts that are more than $415 in value, including any transportation that substitutes for commercial transport.
  • Experts cited by ProPublica believe Thomas may have violated federal disclosure laws by not disclosing his yacht and jet travel, and that the stays at Crow’s resort may also have required disclosure because the resort is owned by Crow’s company rather than him personally.
  • Thomas’ stays at Crow’s resort also raise ethics concerns given the other guests Crow—a real estate magnate and Republican megadonor—has invited to the resort and on his yacht at the same time, which ProPublica reports include GOP donors, executives at Verizon and PricewaterhouseCoopers, leaders from right-wing think tank American Enterprise Institute, Federalist Society leader Leonard Leo and Mark Paoletta, the general counsel for the Trump Administration’s Office of Management and Budget who now serves as Thomas’ wife’s attorney.

Saturday, September 10, 2022

Social norms and dishonesty across societies

Aycinena, D., et al.
PNAS, 119 (31), 2022.

Abstract

Social norms have long been recognized as an important factor in curtailing antisocial behavior, and stricter prosocial norms are commonly associated with increased prosocial behavior. In this study, we provide evidence that very strict prosocial norms can have a perverse negative relationship with prosocial behavior. In laboratory experiments conducted in 10 countries across 5 continents, we measured the level of honest behavior and elicited injunctive norms of honesty. We find that individuals who hold very strict norms (i.e., those who perceive a small lie to be as socially unacceptable as a large lie) are more likely to lie to the maximal extent possible. This finding is consistent with a simple behavioral rationale. If the perceived norm does not differentiate between the severity of a lie, lying to the full extent is optimal for a norm violator since it maximizes the financial gain, while the perceived costs of the norm violation are unchanged. We show that the relation between very strict prosocial norms and high levels of rule violations generalizes to civic norms related to common moral dilemmas, such as tax evasion, cheating on government benefits, and fare dodging on public transportation. Those with very strict attitudes toward civic norms are more likely to lie to the maximal extent possible. A similar relation holds across countries. Countries with a larger fraction of people with very strict attitudes toward civic norms have a higher society-level prevalence of rule violations.

Significance

Much of the research in the experimental and behavioral sciences finds that stronger prosocial norms lead to higher levels of prosocial behavior. Here, we show that very strict prosocial norms are negatively correlated with prosocial behavior. Using laboratory experiments on honesty, we demonstrate that individuals who hold very strict norms of honesty are more likely to lie to the maximal extent. Further, countries with a larger fraction of people with very strict civic norms have proportionally more societal-level rule violations. We show that our findings are consistent with a simple behavioral rationale. If perceived norms are so strict that they do not differentiate between small and large violations, then, conditional on a violation occurring, a large violation is individually optimal.


In essence, very strict social norms can backfire: when a small lie is judged as unacceptable as a large one, people who decide to lie at all have an incentive to lie to the maximal extent, since the larger lie carries roughly the same perceived cost but a greater payoff.
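To make that payoff logic concrete, here is a minimal sketch in Python (hypothetical numbers, not the paper's model or data): when the perceived cost of violating the norm is flat in the size of the lie, the payoff-maximizing choice for someone willing to lie is the largest lie, whereas a graded norm can make honesty the better option.

```python
# Minimal illustration (hypothetical numbers) of the behavioral rationale above.

def payoff(lie_size, gain_per_unit=1.0, norm_cost=lambda s: 2.0 if s > 0 else 0.0):
    """Material gain from a lie of a given size, minus its perceived norm-violation cost."""
    return lie_size * gain_per_unit - norm_cost(lie_size)

# Very strict norm: any lie, small or large, carries the same fixed perceived cost.
strict_norm = {size: payoff(size) for size in range(6)}

# Graded norm: the perceived cost grows with the size of the lie.
graded_norm = {size: payoff(size, norm_cost=lambda s: 1.5 * s) for size in range(6)}

print("strict norm:", strict_norm)   # payoff is maximized by the largest lie (size 5)
print("graded norm:", graded_norm)   # payoff is maximized by not lying (size 0)
```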

Wednesday, February 3, 2021

Research on Non-verbal Signs of Lies and Deceit: A Blind Alley

T. Brennen & S. Magnussen
Front. Psychol., 14 December 2020

Introduction

Research on the detection of lies and deceit has a prominent place in the field of psychology and law, with a substantial research literature published in this field of inquiry during the last five to six decades (Vrij, 2000, 2008; Vrij et al., 2019). There are good reasons for this interest in lie detection. We are all everyday liars, some of us more prolific than others; we lie in personal and professional relationships (Serota et al., 2010; Halevy et al., 2014; Serota and Levine, 2015; Verigin et al., 2019), and lying in public by politicians and other public figures has a long and continuing history (Peters, 2015). However, despite the personal problems that serious everyday lies may cause and the human tragedies political lies may cause, it is lying in court that appears to have been the principal initial motivation for the scientific interest in lie detection.

Lying in court is a threat to fair trials and the rule of law. Lying witnesses may lead to the exoneration of guilty persons or to the conviction of innocent ones. In the US it is well-documented that innocent people have been convicted because witnesses were lying in court (Garrett, 2010, 2011; www.innocenceproject.com). In evaluating the reliability and the truthfulness of a testimony, the court considers other evidence presented to the court, the known facts about the case and the testimonies by other witnesses. Inconsistency with the physical evidence or the testimonies of other witnesses might indicate that the witness is untruthful, or it may simply reflect the fact that the witness has observed, interpreted, and later remembered the critical events incorrectly—normal human errors all too well known in the eyewitness literature (Loftus, 2005; Wells and Loftus, 2013; Howe and Knott, 2015).

(as it ends)

Is the rational course simply to drop this line of research? We believe it is. The creative studies carried out during the last few decades have been important in showing that psychological folklore, the ideas we share about behavioral signals of lies and deceit, is not correct. This debunking function of science is extremely important. But we now have sufficient evidence that there are no specific non-verbal behavioral signals that accompany lying or deceitful behavior. We can safely recommend that courts disregard such behavioral signals when appraising the credibility of victims, witnesses, and suspected offenders. For psychology and law researchers it may be time to move on.

Monday, September 14, 2020

Trump lied about science

H. Holden Thorp
Science
Originally published 11 Sept 20

When President Donald Trump began talking to the public about coronavirus disease 2019 (COVID-19) in February and March, scientists were stunned at his seeming lack of understanding of the threat. We assumed that he either refused to listen to the White House briefings that must have been occurring or that he was being deliberately sheltered from information to create plausible deniability for federal inaction. Now, because famed Washington Post journalist Bob Woodward recorded him, we can hear Trump’s own voice saying that he understood precisely that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) was deadly and spread through the air. As he was playing down the virus to the public, Trump was not confused or inadequately briefed: He flat-out lied, repeatedly, about science to the American people. These lies demoralized the scientific community and cost countless lives in the United States.

Over the years, this page has commented on the scientific foibles of U.S. presidents. Inadequate action on climate change and environmental degradation during both Republican and Democratic administrations has been criticized frequently. Editorials have bemoaned endorsements by presidents on teaching intelligent design, creationism, and other antiscience in public schools. These matters are still important. But now, a U.S. president has deliberately lied about science in a way that was imminently dangerous to human health and directly led to widespread deaths of Americans.

This may be the most shameful moment in the history of U.S. science policy.

In an interview with Woodward on 7 February 2020, Trump said he knew that COVID-19 was more lethal than the flu and that it spread through the air. “This is deadly stuff,” he said. But on 9 March, he tweeted that the “common flu” was worse than COVID-19, while economic advisor Larry Kudlow and presidential counselor Kellyanne Conway assured the public that the virus was contained. On 19 March, Trump told Woodward that he did not want to level with the American people about the danger of the virus. “I wanted to always play it down,” he said, “I still like playing it down.” Playing it down meant lying about the fact that he knew the country was in grave danger.

The info is here.

Tuesday, June 9, 2020

Intending to deceive versus deceiving intentionally in indifferent lies

Alex Wiegmann & Ronja Rutschmann
Philosophical Psychology (2020)
DOI: 10.1080/09515089.2020.1761544

Abstract

Indifferent lies have been proposed as a counterexample to the claim that lying requires an intention to deceive. In indifferent lies, the speaker says something she believes to be false (in a truth-warranting context) but does not really care about whether the addressee believes what she says. Krstić (2019) argues that in such cases, the speaker deceives the addressee intentionally and, therefore, indifferent lies do not show that lying does not require an intention to deceive. While we agree that the speaker deceives the addressee intentionally, we resist Krstić’s conclusion by pointing out that there is a difference between deceiving intentionally and intending to deceive. To this aim, we presented 268 participants with a new variant of an indifferent lie and asked whether the speaker lied, whether she had an intention to deceive, and whether she deceived intentionally. Whereas the majority of participants considered the speaker to have deceived the addressee intentionally, most denied that the speaker had an intention to deceive the addressee. Hence, indifferent lies still challenge widely accepted definitions of lying.

The research is here.

Thursday, February 27, 2020

Liar, Liar, Liar

S. Vedantam, M. Penmann, & T. Boyle
Hidden Brain - NPR.org
Originally posted 17 Feb 20

When we think about dishonesty, we mostly think about the big stuff.

We see big scandals, big lies, and we think to ourselves, I could never do that. We think we're fundamentally different from Bernie Madoff or Tiger Woods.

But behind big lies are a series of small deceptions. Dan Ariely, a professor of psychology and behavioral economics at Duke University, writes about this in his book The Honest Truth about Dishonesty.

"One of the frightening conclusions we have is that what separates honest people from not-honest people is not necessarily character, it's opportunity," he said.

These small lies are quite common. When we lie, it's not always a conscious or rational choice. We want to lie and we want to benefit from our lying, but we want to be able to look in the mirror and see ourselves as good, honest people. We might go a little too fast on the highway, or pocket extra change at a gas station, but we're still mostly honest ... right?

That's why Ariely describes honesty as something of a state of mind. He thinks the IRS should have people sign a pledge committing to be honest when they start working on their taxes, not when they're done. Setting the stage for honesty is more effective than asking someone after the fact whether or not they lied.

The info is here.

There is a 30-minute audio file worth listening to.

Tuesday, April 23, 2019

4 Ways Lying Becomes the Norm at a Company

Ron Carucci
Harvard Business Review
Originally published February 15, 2019

Many of the corporate scandals in the past several years — think Volkswagen or Wells Fargo — have been cases of wide-scale dishonesty. It’s hard to fathom how lying and deceit permeated these organizations. Some researchers point to group decision-making processes or psychological traps that snare leaders into justification of unethical choices. Certainly those factors are at play, but they largely explain dishonest behavior at an individual level, and I wondered about systemic factors that might influence whether or not people in organizations distort or withhold the truth from one another.

This is what my team set out to understand through a 15-year longitudinal study. We analyzed 3,200 interviews that were conducted as part of 210 organizational assessments to see whether there were factors that predicted whether or not people inside a company will be honest. Our research yielded four factors — not individual character traits, but organizational issues — that played a role. The good news is that these factors are completely within a corporation’s control and improving them can make your company more honest, and help avert the reputation and financial disasters that dishonesty can lead to.

The stakes here are high. Accenture’s Competitive Agility Index — a 7,000-company, 20-industry analysis — for the first time tangibly quantified how a decline in stakeholder trust impacts a company’s financial performance. The analysis reveals more than half (54%) of companies on the index experienced a material drop in trust — from incidents such as product recalls, fraud, data breaches and c-suite missteps — which equates to a minimum of $180 billion in missed revenues. Worse, following a drop in trust, a company’s index score drops 2 points on average, negatively impacting revenue growth by 6% and EBITDA by 10% on average.

The info is here.

Friday, March 15, 2019

4 Ways Lying Becomes the Norm at a Company

Ron Carucci
Harvard Business Review
Originally posted February 15, 2019

Here is an excerpt:

Unjust accountability systems. When an organization’s processes for measuring employee contributions are perceived as unfair or unjust, we found it is 3.77 times more likely to have people withhold or distort information. We intentionally excluded compensation in our research, because incentive structures can sometimes play disproportionate roles in influencing behavior, and simply looked at how contribution was measured and evaluated through performance management systems, routine feedback processes, and cultural recognition. One interviewee captured a pervasive sentiment about how destructive these systems can be: “I don’t know why I work so hard. My boss doesn’t have a clue what I do. I fill out the appraisal forms at the end of the year, he signs them and sends them to HR. We pretend to have a discussion, and then we start over. It’s a rigged system.” Our study showed that when accountability processes are seen as unfair, people feel forced to embellish their accomplishments and hide, or make excuses for, their shortfalls. That sets the stage for dishonest behavior. Research on organizational injustice shows a direct correlation between an employee’s sense of fairness and a conscious choice to sabotage the organization. And more recent research confirms that unfair comparison among employees leads directly to unethical behavior.

Fortunately, our statistical models show that even a 20% improvement in performance management consistency, as evidenced by employees’ belief that their contributions have been fairly assessed against known standards, can improve truth-telling behavior by 12%.

The info is here.

Sunday, November 18, 2018

Bornstein claims Trump dictated the glowing health letter

Alex Marquardt and Lawrence Crook
CNN.com
Originally posted May 2, 2018

When Dr. Harold Bornstein described in hyperbolic prose then-candidate Donald Trump's health in 2015, the language he used was eerily similar to the style preferred by his patient.

It turns out the patient himself wrote it, according to Bornstein.

"He dictated that whole letter. I didn't write that letter," Bornstein told CNN on Tuesday. "I just made it up as I went along."

The admission is an about-face from his answer more than two years ago, when the letter was released, and answers one of the lingering questions about the last presidential election. The letter thrust the eccentric Bornstein, with his shoulder-length hair and round eyeglasses, into public view.

"His physical strength and stamina are extraordinary," he crowed in the letter, which was released by Trump's campaign in December 2015. "If elected, Mr. Trump, I can state unequivocally, will be the healthiest individual ever elected to the presidency."

The missive didn't offer much medical evidence for those claims beyond citing a blood pressure of 110/65, described by Bornstein as "astonishingly excellent." It claimed Trump had lost 15 pounds over the preceding year. And it described his cardiovascular health as "excellent."

The info is here.

Wednesday, June 6, 2018

Welcome to America, where morality is judged along partisan lines

Joan Vennochi
Boston Globe
Originally posted May 8, 2018

Here are some excerpts:

“It’s OK to lie to the press?” asked Stephanopoulos. To which, Giuliani replied: “Gee, I don’t know — you know a few presidents who did that.”

(cut)

Twenty years later, special counsel Robert Mueller has been investigating allegations of collusion between the Trump campaign and the Russian government. Trump’s lawyer, Cohen, is now entangled in the collusion investigation, as well as with the payment to Daniels, which also entangles Trump — who, according to Giuliani, might invoke the Fifth Amendment to avoid testifying under oath. That must be tempting, given Trump’s well-established contempt for truthfulness and personal accountability.

(cut)

So it goes in American politics, where morality is judged strictly along partisan lines, and Trump knows it.

The information is here.

Friday, March 23, 2018

Facebook Woes: Data Breach, Securities Fraud, or Something Else?

Matt Levine
Bloomberg.com
Originally posted March 21, 2018

Here is an excerpt:

But the result is always "securities fraud," whatever the nature of the underlying input. An undisclosed data breach is securities fraud, but an undisclosed sexual-harassment problem or chicken-mispricing conspiracy will get you to the same place. There is an important practical benefit to a legal regime that works like this: It makes it easy to punish bad behavior, at least by public companies, because every sort of bad behavior is also securities fraud. You don't have to prove that the underlying chicken-mispricing conspiracy was illegal, or that the data breach was due to bad security procedures. All you have to prove is that it happened, and it wasn't disclosed, and the stock went down when it was. The evaluation of the badness is in a sense outsourced to the market: We know that the behavior was illegal, not because there was a clear law against it, but because the stock went down. Securities law is an all-purpose tool for punishing corporate badness, a one-size-fits-all approach that makes all badness commensurable using the metric of stock price. It has a certain efficiency.

On the other hand it sometimes makes me a little uneasy that so much of our law ends up working this way. "In a world of dysfunctional government and pervasive financial capitalism," I once wrote, "more and more of our politics is contested in the form of securities regulation." And: "Our government's duty to its citizens is mediated by their ownership of our public companies." When you punish bad stuff because it is bad for shareholders, you are making a certain judgment about what sort of stuff is bad and who is entitled to be protected from it.

Anyway Facebook Inc. wants to make it very clear that it did not suffer a data breach. When a researcher got data about millions of Facebook users without those users' explicit permission, and when the researcher turned that data over to Cambridge Analytica for political targeting in violation of Facebook's terms, none of that was a data breach. Facebook wasn't hacked. What happened was somewhere between a contractual violation and ... you know ... just how Facebook works? There is some splitting of hairs over this, and you can understand why -- consider that SEC guidance about when companies have to disclose data breaches -- but in another sense it just doesn't matter. You don't need to know whether the thing was a "data breach" to know how bad it was. You can just look at the stock price. The stock went down...

The article is here.

Friday, September 22, 2017

I Lie? We Lie! Why? Experimental Evidence on a Dishonesty Shift in Groups

Martin G. Kocher, Simeon Schudy, & Lisa Spantig
CESifo Working Paper Series No. 6008.

Abstract

Unethical behavior such as dishonesty, cheating and corruption occurs frequently in organizations or groups. Recent experimental evidence suggests that there is a stronger inclination to behave immorally in groups than individually. We ask if this is the case, and if so, why. Using a parsimonious laboratory setup, we study how individual behavior changes when deciding as a group member. We observe a strong dishonesty shift. This shift is mainly driven by communication within groups and turns out to be independent of whether group members face payoff commonality or not (i.e., whether other group members benefit from one’s lie). Group members come up with and exchange more arguments for being dishonest than for complying with the norm of honesty. Thereby, group membership shifts the perception of the validity of the honesty norm and of its distribution in the population.

The article is here.

Thursday, September 21, 2017

When is a lie acceptable? Work and private life lying acceptance depends on its beneficiary

Katarzyna Cantarero, Piotr Szarota, E. Stamkou, M. Navas & A. del Carmen Dominguez Espinosa
The Journal of Social Psychology 
Pages 1-16. Published online: 14 Aug 2017

ABSTRACT

In this article we show that when analyzing attitudes towards lying in a cross-cultural setting, both the beneficiary of the lie (self vs. other) and the context (private life vs. professional domain) should be considered. In a study conducted in Estonia, Ireland, Mexico, The Netherlands, Poland, Spain, and Sweden (N = 1345), in which participants evaluated stories presenting various types of lies, we found it useful to rely on these dimensions. Results showed that in the joint sample the most acceptable were other-oriented lies concerning private life, then other-oriented lies in the professional domain, followed by egoistic lies in the professional domain; and the least acceptance was shown for egoistic lies regarding one’s private life. We found a negative correlation between acceptance of a behavior and the evaluation of its deceitfulness.

Here is an excerpt:

Research shows differences in reactions to moral transgressions depending on the culture of the respondent, as culture influences our moral judgments (e.g., Gold, Colman, & Pulford, 2014; Graham, Meindl, Beall, Johnson, & Zhang, 2016). For example, when analyzing transgressions of community (e.g., hearing children talking with their teacher the same way as they do towards their peers), Indian participants showed more moral outrage than British participants (Laham, Chopra, Lalljee, & Parkinson, 2010). Importantly, one of the main reasons why we can observe cross-cultural differences in reactions to moral transgressions is that culture influences our perception of whether an act itself constitutes a moral transgression at all (Haidt, 2001; Haidt & Joseph, 2004; Shweder, Mahapatra, & Miller, 1987; Shweder, Much, Mahapatra, & Park, 1997). Haidt, Koller and Dias (1993) showed that Brazilian participants would perceive some acts of victimless yet offensive actions more negatively than did Americans. The authors argue that for American students some of the acts that were being evaluated (e.g., using an old flag of one’s country to clean the bathroom) fall outside the moral domain and are only a matter of social convention, whereas Brazilians would perceive them as morally wrong.

The paper is here.

Saturday, August 26, 2017

Liars, Damned Liars, and Zealots: The Effect of Moral Mandates on Transgressive Advocacy Acceptance

Allison B. Mueller, Linda J. Skitka
Social Psychological and Personality Science 
First published date: July-25-2017

Abstract

This research explored people’s reactions to targets who “went too far” to support noble causes. We hypothesized that observers’ moral mandates would shape their perceptions of others’ advocacy, even when that advocacy was transgressive, that is, when it used norm-violating means (i.e., lying) to achieve a preferred end. Observers were expected to accept others’ advocacy, independent of its credibility, to a greater extent when it bolstered their strong (vs. weak) moral mandate. Conversely, observers with strong (vs. weak) moral conviction for the cause were expected to condemn others’ advocacy—independent of its credibility—to a greater degree when it represented progress for moral opponents. Results supported these predictions. When evaluating a target in a persuasive communication setting, people’s judgments were uniquely shaped by the degree to which the target bolstered or undermined a cherished moral mandate.

Here is part of the Discussion Section:

These findings expand our knowledge of the moral mandate effect in two key ways. First, this work suggests that the moral mandate effect extends to specific individuals, not just institutions and authorities. Moral mandates may shape people’s perceptions of any target who engages in norm-violating behaviors that uphold moralized causes: co-workers, politicians, or CEOs. Second, this research suggests that, although people are not comfortable excusing others for heinous crimes that serve a moralized end (Mullen & Skitka, 2006), they appear comparatively tolerant of norm violations like lying.

A troubling and timely implication of these findings is that political figures may be able to act in corrupt ways without damaging their images (at least in the eyes of their supporters).

The article is here.

Wednesday, July 26, 2017

Everybody lies: how Google search reveals our darkest secrets

Seth Stephens-Davidowitz
The Guardian
Originally published July 9, 2017

Everybody lies. People lie about how many drinks they had on the way home. They lie about how often they go to the gym, how much those new shoes cost, whether they read that book. They call in sick when they’re not. They say they’ll be in touch when they won’t. They say it’s not about you when it is. They say they love you when they don’t. They say they’re happy while in the dumps. They say they like women when they really like men. People lie to friends. They lie to bosses. They lie to kids. They lie to parents. They lie to doctors. They lie to husbands. They lie to wives. They lie to themselves. And they damn sure lie to surveys. Here’s my brief survey for you:

Have you ever cheated in an exam?

Have you ever fantasised about killing someone?

Were you tempted to lie?

Many people underreport embarrassing behaviours and thoughts on surveys. They want to look good, even though most surveys are anonymous. This is called social desirability bias. An important paper in 1950 provided powerful evidence of how surveys can fall victim to such bias. Researchers collected data, from official sources, on the residents of Denver: what percentage of them voted, gave to charity, and owned a library card. They then surveyed the residents to see if the percentages would match. The results were, at the time, shocking. What the residents reported to the surveys was very different from the data the researchers had gathered. Even though nobody gave their names, people, in large numbers, exaggerated their voter registration status, voting behaviour, and charitable giving.

The article is here.

Thursday, June 15, 2017

How the Science of “Blue Lies” May Explain Trump’s Support

Jeremy Adam Smith
Scientific American
Originally posted on March 24, 2017

Here are two excerpts:

This has led many people to ask themselves: How does the former reality-TV star get away with it? How can he tell so many lies and still win support from many Americans?

Journalists and researchers have suggested many answers, from a hyperbiased, segmented media to simple ignorance on the part of GOP voters. But there is another explanation that no one seems to have entertained. It is that Trump is telling “blue lies”—a psychologist’s term for falsehoods, told on behalf of a group, that can actually strengthen bonds among the members of that group.

(cut)

This research—and these stories—highlights a difficult truth about our species: we are intensely social creatures, but we are prone to divide ourselves into competitive groups, largely for the purpose of allocating resources. People can be prosocial—compassionate, empathetic, generous, honest—in their group and aggressively antisocial toward out-groups. When we divide people into groups, we open the door to competition, dehumanization, violence—and socially sanctioned deceit.

“People condone lying against enemy nations, and since many people now see those on the other side of American politics as enemies, they may feel that lies, when they recognize them, are appropriate means of warfare,” says George Edwards, a political scientist at Texas A&M University and one of the country’s leading scholars of the presidency.

The article is here.

Tuesday, January 17, 2017

When telling the truth is actually dishonest

By Jena McGregor
The Washington Post
Originally published December 29, 2016

Here is an excerpt:

The type of lie known as a lie of omission might be thought of as being similar to paltering. In both cases, the deceiver isn't telling the whole truth. But they're different, says Rogers: One is the passive failure to disclose something a negotiation counterpart doesn't know, while paltering is the active use of truthful statements to mislead.

Say you're negotiating with a buyer over a used car you're trying to sell. If the buyer says "I presume the car is in excellent shape and the engine runs well," simply failing to correct him if the engine has had problems is a lie of omission, Rogers says. But if you say "I drove it yesterday in 10-below temperatures and it drove well," even if you know it's been to the shop twice in the past month, that's paltering. Opportunities to lie by omission, Rogers says, actually "don't arise all that often."

Of course, classifying whether voters or negotiation counterparts will see "paltering" as ethical is vastly complicated by an election in which the usual standards for truth and political rhetoric seemed to be ignored. Seventy percent of the statements by President-elect Donald Trump examined by the nonpartisan fact-checking outlet Politifact have been rated mostly false, false or "pants on fire."

The article is here.

Wednesday, January 11, 2017

People Don’t Consider Lying by Omission to Be Any More Honest Than Plain Old Lying

By Cari Romm
New York Magazine: The Science of Us
Originally published December 15, 2016

Here is an excerpt:

Past research has shown that people are more willing to lie by omission than they are to tell an outright falsehood, and over a series of six experiments, the researchers found that paltering is no different — to the teller, it feels more ethical, like something between the truth and a total lie. (They also found that it’s incredibly common: In one survey administered to Harvard business students, roughly half admitted that they had previously used paltering as a negotiation strategy.)

The problem is, those on the receiving end don’t feel the same way: Across the various experiments, people who learned that their conversation partner had paltered to them said they considered the move to be just as ethically rotten as telling a bald-faced lie.

The article is here.

Monday, November 21, 2016

From porkies to whoppers: over time lies may desensitise brain to dishonesty

Hannah Devlin
The Guardian
Originally posted October 24, 2016

Here is an excerpt:

Now scientists have uncovered an explanation for why telling a few porkies has the tendency to spiral out of control. The study suggests that telling small, insignificant lies desensitises the brain to dishonesty, meaning that lying gradually feels more comfortable over time.

Tali Sharot, a neuroscientist at University College London and senior author, said: “Whether it’s evading tax, infidelity, doping in sports, making up data in science or financial fraud, deceivers often recall how small acts of dishonesty snowballed over time and they suddenly found themselves committing quite large crimes.”

Sharot and colleagues suspected that this phenomenon was due to changes in the brain’s response to lying, rather than simply being a case of one lie necessitating another to maintain a story.

The article is here.