Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Truth. Show all posts

Thursday, March 10, 2022

Biden Team Gets It Right on Inadmissibility of Torture Evidence

Tess Bridgeman
Originally posted 1 February 22

The Biden administration just took an important step to restore the rule of law in the Al-Nashiri case at the Guantanamo military commissions: it categorically rejected the use of statements obtained through torture at any stage in the proceedings and promised that the government will not seek to admit any statements the petitioner made while in CIA custody. This should be unremarkable, as it clearly reflects U.S. domestic and international legal obligations and Biden administration policy, but the position the Department of Justice (DOJ) took in its brief filed in the D.C. Circuit Court of Appeals on Monday is actually an about-face from the position prosecutors took before the military commission judge. The Al-Nashiri case has a long history, but this most recent controversy stems from prosecutors’ decision to seek to admit statements obtained through torture in pre-trial proceedings in the capital case of Abd Al-Rahim Hussein Al-Nashiri, the “alleged mastermind” of the U.S.S. Cole bombing. Although the prosecution eventually withdrew the particular statements at issue, it had essentially reserved the right to rely on torture-obtained evidence in future proceedings. 

In October of last year, Al-Nashiri filed a petition for a writ of mandamus in the U.S. Court of Appeals for the District of Columbia Circuit that sought “to enjoin the government from offering, and the military commission judge from considering, torture-derived evidence.” The much-awaited U.S. government response — called a “moment of truth” for the Biden administration on torture — came yesterday. 


The government is taking the issue seriously in this case; but what about the other cases? 

The government brief states that it has “conducted a search of this case’s voluminous record, including the prosecution’s ex parte submissions” to determine whether there have been any “past orders predicated on evidence admitted in violation of” the Military Commissions Act’s prohibition of the admission of statements obtained through torture or CIDT. It found one, and has committed to “move promptly to correct” the error. This shows the administration is taking the issue seriously. 

But given al-Nashiri isn’t the only petitioner who was in the CIA’s black sites, and that the prosecution regularly makes ex parte submissions in commission proceedings, there may be instances in other cases pending before the military commissions where the same problem is lurking and could compromise the prosecution. If it isn’t doing so already, the government would be wise to undertake a thorough review of all commissions cases and withdraw any submissions it might find that contain information obtained from torture or CIDT.

Friday, May 28, 2021

‘Belonging Is Stronger Than Facts’: The Age of Misinformation

Max Fisher
The New York Times
Originally published 7 May 21

Here is an excerpt:

We are in an era of endemic misinformation — and outright disinformation. Plenty of bad actors are helping the trend along. 

But the real drivers, some experts believe, are social and psychological forces that make people prone to sharing and believing misinformation in the first place. And those forces are on the rise.

“Why are misperceptions about contentious issues in politics and science seemingly so persistent and difficult to correct?” Brendan Nyhan, a Dartmouth College political scientist, posed in a new paper in Proceedings of the National Academy of Sciences.

It’s not for want of good information, which is ubiquitous. Exposure to good information does not reliably instill accurate beliefs anyway. Rather, Dr. Nyhan writes, a growing body of evidence suggests that the ultimate culprits are “cognitive and memory limitations, directional motivations to defend or support some group identity or existing belief, and messages from other people and political elites.”

Put more simply, people become more prone to misinformation when three things happen. First, and perhaps most important, is when conditions in society make people feel a greater need for what social scientists call ingrouping — a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems.

As much as we like to think of ourselves as rational beings who put truth-seeking above all else, we are social animals wired for survival. In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict putting our righteous ingroup against a nefarious outgroup.

This need can emerge especially out of a sense of social destabilization. As a result, misinformation is often prevalent among communities that feel destabilized by unwanted change or, in the case of some minorities, powerless in the face of dominant forces.

Framing everything as a grand conflict against scheming enemies can feel enormously reassuring. And that’s why perhaps the greatest culprit of our era of misinformation may be, more than any one particular misinformer, the era-defining rise in social polarization.

“At the mass level, greater partisan divisions in social identity are generating intense hostility toward opposition partisans,” which has “seemingly increased the political system’s vulnerability to partisan misinformation,” Dr. Nyhan wrote in an earlier paper.

Wednesday, February 3, 2021

Research on Non-verbal Signs of Lies and Deceit: A Blind Alley

T. Brennen & S. Magnussen
Front. Psychol., 14 December 2020


Research on the detection of lies and deceit has a prominent place in the field of psychology and law, with a substantial research literature published in this field of inquiry during the last five to six decades (Vrij, 2000, 2008; Vrij et al., 2019). There are good reasons for this interest in lie detection. We are all everyday liars, some of us more prolific than others; we lie in personal and professional relationships (Serota et al., 2010; Halevy et al., 2014; Serota and Levine, 2015; Verigin et al., 2019), and lying in public by politicians and other public figures has a long and continuing history (Peters, 2015). However, despite the personal problems that serious everyday lies may cause and the human tragedies political lies may cause, it is lying in court that appears to have been the principal initial motivation for the scientific interest in lie detection.

Lying in court is a threat to fair trials and the rule of law. Lying witnesses may lead to the exoneration of guilty persons or to the conviction of innocent ones. In the US it is well-documented that innocent people have been convicted because witnesses were lying in court (Garrett, 2010, 2011; www.innocenceproject.com). In evaluating the reliability and the truthfulness of a testimony, the court considers other evidence presented to the court, the known facts about the case and the testimonies by other witnesses. Inconsistency with the physical evidence or the testimonies of other witnesses might indicate that the witness is untruthful, or it may simply reflect the fact that the witness has observed, interpreted, and later remembered the critical events incorrectly—normal human errors all too well known in the eyewitness literature (Loftus, 2005; Wells and Loftus, 2013; Howe and Knott, 2015).

Here is how the article ends:

Is the rational course simply to drop this line of research? We believe it is. The creative studies carried out during the last few decades have been important in showing that psychological folklore, the ideas we share about behavioral signals of lies and deceit, is not correct. This debunking function of science is extremely important. But we now have sufficient evidence that there are no specific non-verbal behavioral signals that accompany lying or deceitful behavior. We can safely recommend that courts disregard such behavioral signals when appraising the credibility of victims, witnesses, and suspected offenders. For psychology and law researchers it may be time to move on.

Friday, June 26, 2020

Debunking the Secular Case for Religion

Gurwinder Bhogal
Originally published 28 April 20

Here is an excerpt:

Could we, perhaps, identify the religious traditions that protect civilizations by looking at our history and finding the practices common to all long-lived civilizations? After all, Taleb has claimed that religion is “Lindy;” that is to say it has endured for a long time and therefore must be robust. But the main reason religious teachings have been robust is not that they’ve stood the test of time, but that those who tried to change them tended to be killed. Taleb also doesn’t explain what happens when religious practices differ or clash. Should people follow the precepts of the hardline Wahhabi brand of Islam, or those of a more moderate one? If the Abrahamic religions agree that usury leads to recessions, which of them do we consult on eating pork? Do we follow the Old Testament’s no or the New Testament’s yes, the green light of Christianity or the red light of Islam and Judaism?

Neither Taleb nor Peterson appear to answer these questions. But many evolutionary psychologists have: they say we should not blindly accept any religious edict, because none contain any inherent wisdom. The dominant view among evolutionary psychologists is that religion is not an evolutionary adaptation but a “spandrel,” a by-product of other adaptations. Richard Dawkins has compared religion to the tendency of moths to fly into flames: the moth did not evolve to fly into flames; it evolved to navigate by the light of the moon. Since it’s unable to distinguish between moonlight and candlelight, its attempt to keep a candle-flame in a fixed ommatidium (unit of a compound eye) causes it to keep veering around the flame, until it spirals into it. Dawkins argues that religion didn’t evolve for a purpose; it merely exploits the actual systems we evolved to navigate the world. An example of such a system might be what psychologist Justin Barrett calls the Hyperactive Agent Detection Device, the propensity to see natural phenomena as products of design. Basically, in our evolutionary history, mistaking a natural phenomenon for an artifact was far less risky than mistaking an artifact for a natural phenomenon, so our brains erred toward the former.

The info is here.

Friday, May 8, 2020

Repetition increases Perceived Truth even for Known Falsehoods

Lisa Fazio
Originally posted 23 March 20

Repetition increases belief in false statements. This illusory truth effect occurs with many different types of statements (e.g., trivia facts, news headlines, advertisements), and even occurs when the false statement contradicts participants’ prior knowledge. However, existing studies of the effect of prior knowledge on the illusory truth effect share a common flaw; they measure participants’ knowledge after the experimental manipulation and thus conditionalize responses on posttreatment variables. In the current study, we measure prior knowledge prior to the experimental manipulation and thus provide a cleaner measurement of the causal effect of repetition on belief. We again find that prior knowledge does not protect against the illusory truth effect. Repeated false statements were given higher truth ratings than novel statements, even when they contradicted participants’ prior knowledge.

From the Discussion

As in previous research (Brashier et al., 2017; Fazio et al., 2015), prior knowledge did not protect participants from the illusory truth effect. Repeated falsehoods were rated as being more true than novel falsehoods, even when they both contradicted participants’ prior knowledge. By measuring prior knowledge before the experimental session, this study avoids conditioning on posttreatment variables and provides cleaner evidence for the effect (Montgomery et al., 2018). Whether prior knowledge is measured before or after the manipulation, it is clear that repetition increases belief in falsehoods that contradict existing knowledge.

The research is here.

Wednesday, May 6, 2020

The coming battle for the COVID-19 narrative

Samuel Bowles & Wendy Carlin
Originally posted 10 April 20

The COVID-19 pandemic is a blow to self-interest as a value orientation and laissez-faire as a policy paradigm, both already reeling amid mounting public concerns about climate change.  Will the pandemic change our economic narrative, expressing new everyday understandings of how the economy works and how it should work? 

We think so. But it will not be simply a shift to the left on the now anachronistic one-dimensional markets-versus-government continuum shown in Figure 1. A position along the blue line represents a mix of public policies – nationalisation of the railways, for example, towards the left; deregulation of labour markets, for example, towards the right.

COVID-19, for better or worse, brings into focus a third pole in the debate: call it community or civil society. In the absence of this third pole, the conventional language of economics and public policy misses the contribution of social norms and of institutions that are neither governments nor markets – like families, relationships within firms, and community organisations.

There are precedents for the scale of changes that we anticipate. The Great Depression and WWII changed the way we talked about the economy: left to its own devices it would wreak havoc on people’s lives (massive unemployment), “heedless self-interest [is] bad economics” (FDR),1 and governments can effectively pursue the public good (defeat fascism, provide economic security). As the memories of that era faded along with the social solidarity and confidence in collective action that it had fostered, another vernacular took over: “there is no such thing as society” (Thatcher) – you get what you pay for, government is just another special interest group.

Another opportunity for a long-needed fundamental shift in the economic vernacular is now unfolding. COVID-19, along with climate change, could be the equivalent of the Great Depression and WWII in forcing a sea change in economic thinking and policy.

The info is here.

Friday, May 1, 2020

The therapist's dilemma: Tell the whole truth?

Jackson, D.
J. Clin. Psychol. 2020; 76: 286– 291.


Honest communication between therapist and client is foundational to good psychotherapy. However, while past research has focused on client honesty, the topic of therapist honesty remains almost entirely untouched. Our lab's research seeks to explore the role of therapist honesty, how and why therapists make decisions about when to be completely honest with clients (and when to abstain from telling the whole truth), and the perceived consequences of these decisions. This article reviews findings from our preliminary research, presents a case study of the author's honest disclosure dilemma, and discusses the role of therapeutic tact and its function in the therapeutic process.

Here is an excerpt:

Based on our preliminary research, one of the most common topics of overt dishonesty among therapists was their feelings of frustration or disappointment toward their clients. For example, a therapist working with a client with a diagnosis of avoidant personality disorder may find herself increasingly frustrated by the client’s continual resistance to discussing emotional topics or engaging in activities that would broaden his or her world. Such a client —let’s assume male—is also likely to feel preoccupied with concerns about whether the therapist “likes” him or feels as frustrated with him as he does with himself. Should this client apologize for his behavior and ask if the therapist is frustrated with him, the therapist may feel compelled to reduce the discomfort he is already experiencing by dispelling his concern: “No, it’s okay, I’m not frustrated.”

But either at this moment or at a later point in therapy, once rapport (i.e., the therapeutic alliance) has been more firmly established, a more honest answer to this question might be fruitful: “Yes, I am feeling frustrated that we haven’t been able to find ways for you to implement the changes we discuss here, outside of session. How does it feel for you to hear that I am feeling frustrated?” Or, arguably, an even more honest answer: “Yes, I am sometimes frustrated. I sometimes think we could go deeper here—I think it’d be helpful.” Or, an honest answer that is somewhat less critical of the patient and more self‐focused: “I do feel frustrated that I haven’t been able to be more helpful.” Clearly, there are many ways for a therapist to be honest and/or dishonest, and there are also gradations in whichever direction a therapist chooses.

Thursday, August 22, 2019

Repetition increases perceived truth equally for plausible and implausible statements

Lisa Fazio, David Rand, & Gordon Pennycook
Originally created February 28, 2019


Repetition increases the likelihood that a statement will be judged as true. This illusory truth effect is well-established; however, it has been argued that repetition will not affect belief in unambiguous statements. When individuals are faced with obviously true or false statements, repetition should have no impact. We report a simulation study and a preregistered experiment that investigate this idea. Contrary to many intuitions, our results suggest that belief in all statements is increased by repetition. The observed illusory truth effect is largest for ambiguous items, but this can be explained by the psychometric properties of the task, rather than an underlying psychological mechanism that blocks the impact of repetition for implausible items. Our results indicate that the illusory truth effect is highly robust and occurs across all levels of plausibility. Therefore, even highly implausible statements will become more plausible with enough repetition.

The research is here.

The conclusion:

In conclusion, our findings are consistent with the hypothesis that repetition increases belief in all statements equally, regardless of their plausibility. However, there is an important difference between this internal mechanism (equal increase across plausibility) and the observable effect. The observable effect of repetition on truth ratings is greatest for items near the midpoint of perceived truth, and small or nonexistent for items at the extremes. While repetition effects are difficult to observe for very high and very low levels of perceived truth, our results suggest that repetition increases participants’ internal representation of truth equally for all statements. These findings have large implications for daily life where people are often repeatedly exposed to both plausible and implausible falsehoods. Even implausible falsehoods may slowly become more plausible with repetition.
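The psychometric point in the conclusion above can be illustrated with a toy model (this is an illustrative sketch, not the authors' actual simulation; the logistic mapping and the size of the repetition boost are assumptions). If repetition adds the same constant boost to an unbounded latent truth signal, but ratings are squashed onto a bounded scale, the observable change is largest for ambiguous items and nearly invisible at the extremes:

```python
import math

def rating(latent):
    # Map an unbounded latent truth signal onto a bounded 1-7 rating
    # scale via a logistic squash (a common psychometric assumption).
    return 1 + 6 / (1 + math.exp(-latent))

DELTA = 0.5  # hypothetical equal latent boost from one repetition

# Implausible, ambiguous, and plausible items get the same boost,
# but the observable shift in ratings differs sharply.
for latent in (-4.0, 0.0, 4.0):
    change = rating(latent + DELTA) - rating(latent)
    print(f"latent={latent:+.1f}  observed rating change={change:.3f}")
```

Running this shows the midpoint item shifting by far the most, even though every item received an identical latent boost, which is exactly the distinction the authors draw between the internal mechanism and the observable effect.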

Tuesday, February 12, 2019

How to tell the difference between persuasion and manipulation

Robert Noggle
Originally published August 1, 2018

Here is an excerpt:

It appears, then, that whether an influence is manipulative depends on how it is being used. Iago’s actions are manipulative and wrong because they are intended to get Othello to think and feel the wrong things. Iago knows that Othello has no reason to be jealous, but he gets Othello to feel jealous anyway. This is the emotional analogue to the deception that Iago also practises when he arranges matters (eg, the dropped handkerchief) to trick Othello into forming beliefs that Iago knows are false. Manipulative gaslighting occurs when the manipulator tricks another into distrusting what the manipulator recognises to be sound judgment. By contrast, advising an angry friend to avoid making snap judgments before cooling off is not acting manipulatively, if you know that your friend’s judgment really is temporarily unsound. When a conman tries to get you to feel empathy for a non-existent Nigerian prince, he acts manipulatively because he knows that it would be a mistake to feel empathy for someone who does not exist. Yet a sincere appeal to empathy for real people suffering undeserved misery is moral persuasion rather than manipulation. When an abusive partner tries to make you feel guilty for suspecting him of the infidelity that he just committed, he is acting manipulatively because he is trying to induce misplaced guilt. But when a friend makes you feel an appropriate amount of guilt over having deserted him in his hour of need, this does not seem manipulative.

The info is here.

Friday, November 9, 2018

Believing without evidence is always morally wrong

Francisco Mejia Uribe
Originally posted November 5, 2018

Here are two excerpts:

But it is not only our own self-preservation that is at stake here. As social animals, our agency impacts on those around us, and improper believing puts our fellow humans at risk. As Clifford warns: ‘We all suffer severely enough from the maintenance and support of false beliefs and the fatally wrong actions which they lead to …’ In short, sloppy practices of belief-formation are ethically wrong because – as social beings – when we believe something, the stakes are very high.


Translating Clifford’s warning to our interconnected times, what he tells us is that careless believing turns us into easy prey for fake-news peddlers, conspiracy theorists and charlatans. And letting ourselves become hosts to these false beliefs is morally wrong because, as we have seen, the error cost for society can be devastating. Epistemic alertness is a much more precious virtue today than it ever was, since the need to sift through conflicting information has exponentially increased, and the risk of becoming a vessel of credulity is just a few taps of a smartphone away.

Clifford’s third and final argument as to why believing without evidence is morally wrong is that, in our capacity as communicators of belief, we have the moral responsibility not to pollute the well of collective knowledge. In Clifford’s time, the way in which our beliefs were woven into the ‘precious deposit’ of common knowledge was primarily through speech and writing. Because of this capacity to communicate, ‘our words, our phrases, our forms and processes and modes of thought’ become ‘common property’. Subverting this ‘heirloom’, as he called it, by adding false beliefs is immoral because everyone’s lives ultimately rely on this vital, shared resource.

The info is here.

Wednesday, July 25, 2018

Descartes was wrong: ‘a person is a person through other persons’

Abeba Birhane
Originally published April 7, 2017

Here is an excerpt:

So reality is not simply out there, waiting to be uncovered. ‘Truth is not born nor is it to be found inside the head of an individual person, it is born between people collectively searching for truth, in the process of their dialogic interaction,’ Bakhtin wrote in Problems of Dostoevsky’s Poetics (1929). Nothing simply is itself, outside the matrix of relationships in which it appears. Instead, being is an act or event that must happen in the space between the self and the world.

Accepting that others are vital to our self-perception is a corrective to the limitations of the Cartesian view. Consider two different models of child psychology. Jean Piaget’s theory of cognitive development conceives of individual growth in a Cartesian fashion, as the reorganisation of mental processes. The developing child is depicted as a lone learner – an inventive scientist, struggling independently to make sense of the world. By contrast, ‘dialogical’ theories, brought to life in experiments such as Lisa Freund’s ‘doll house study’ from 1990, emphasise interactions between the child and the adult who can provide ‘scaffolding’ for how she understands the world.

A grimmer example might be solitary confinement in prisons. The punishment was originally designed to encourage introspection: to turn the prisoner’s thoughts inward, to prompt her to reflect on her crimes, and to eventually help her return to society as a morally cleansed citizen. A perfect policy for the reform of Cartesian individuals.

The information is here.

Friday, July 20, 2018

How to Look Away

Megan Garber
The Atlantic
Originally published June 20, 2018

Here is an excerpt:

It is a dynamic—the democratic alchemy that converts seeing things into changing them—that the president and his surrogates have been objecting to, as they have defended their policy. They have been, this week (with notable absences), busily appearing on cable-news shows and giving disembodied quotes to news outlets, insisting that things aren’t as bad as they seem: that the images and the audio and the evidence are wrong not merely ontologically, but also emotionally. Don’t be duped, they are telling Americans. Your horror is incorrect. The tragedy is false. Your outrage about it, therefore, is false. Because, actually, the truth is so much more complicated than your easy emotions will allow you to believe. Actually, as Fox News host Laura Ingraham insists, the holding pens that seem to house horrors are “essentially summer camps.” And actually, as Fox & Friends’ Steve Doocy instructs, the pens are not cages so much as “walls” that have merely been “built … out of chain-link fences.” And actually, Kirstjen Nielsen wants you to remember, “We provide food, medical, education, all needs that the child requests.” And actually, too—do not be fooled by your own empathy, Tom Cotton warns—think of the child-smuggling. And of MS-13. And of sexual assault. And of soccer fields. There are so many reasons to look away, so many other situations more deserving of your outrage and your horror.

It is a neat rhetorical trick: the logic of not in my backyard, invoked not merely despite the fact that it is happening in our backyard, but because of it. With seed and sod that we ourselves have planted.

Yes, yes, there are tiny hands, reaching out for people who are not there … but those are not the point, these arguments insist and assure. To focus on those images—instead of seeing the system, a term that Nielsen and even Trump, a man not typically inclined to think in networked terms, have been invoking this week—is to miss the larger point.

The article is here.

Tuesday, June 5, 2018

Is There Such a Thing as Truth?

Errol Morris
Boston Review
Originally posted April 30, 2018

Here is an excerpt:

In fiction, we are often given an imaginary world with seemingly real objects—horses, a coach, a three-cornered hat and wig. But what about the objects of science—positrons, neutrinos, quarks, gravity waves, Higgs bosons? How do we reckon with their reality?

And truth. Is there such a thing? Can we speak of things as unambiguously true or false? In history, for example, are there things that actually happened? Louis XVI guillotined on January 21, 1793, at what has become known as the Place de la Concorde. True or false? Details may be disputed—a more recent example: how large, comparatively, was Donald Trump’s victory in the electoral college in 2016, or the crowd at his inauguration the following January? But do we really doubt that Louis’s bloody head was held up before the assembled crowd? Or doubt the existence of the curved path of a positron in a bubble chamber? Even though we might not know the answers to some questions—“Was Louis XVI decapitated?” or “Are there positrons?”—we accept that there are answers.

And yet, we read about endless varieties of truth. Coherence theories of truth. Pragmatic, relative truths. Truths for me, truths for you. Dog truths, cat truths. Whatever. I find these discussions extremely distasteful and unsatisfying. To say that a philosophical system is “coherent” tells me nothing about whether it is true. Truth is not hermetic. I cannot hide out in a system and assert its truth. For me, truth is about the relation between language and the world. A correspondence idea of truth. Coherence theories of truth are of little or no interest to me. Here is the reason: they are about coherence, not truth. We are talking about whether a sentence or a paragraph or group of paragraphs is true when set up against the world. Thackeray, introducing the fictional world of Vanity Fair, evokes the objects of a world he is familiar with—“a large family coach, with two fat horses in blazing harnesses, driven by a fat coachman in a three-cornered hat and wig, at the rate of four miles an hour.”

The information is here.

Friday, May 18, 2018

You don’t have a right to believe whatever you want to

Daniel DeNicola
Originally published May 14, 2018

Here is the conclusion:

Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.

Believing, like willing, seems fundamental to autonomy, the ultimate ground of one’s freedom. But, as Clifford also remarked: ‘No one man’s belief is in any case a private matter which concerns himself alone.’ Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.

The information is here.

Wednesday, May 16, 2018

Escape the Echo Chamber

C Thi Nguyen
Originally posted April 12, 2018

Something has gone wrong with the flow of information. It’s not just that different people are drawing subtly different conclusions from the same evidence. It seems like different intellectual communities no longer share basic foundational beliefs. Maybe nobody cares about the truth anymore, as some have started to worry. Maybe political allegiance has replaced basic reasoning skills. Maybe we’ve all become trapped in echo chambers of our own making — wrapping ourselves in an intellectually impenetrable layer of likeminded friends and web pages and social media feeds.

But there are two very different phenomena at play here, each of which subvert the flow of information in very distinct ways. Let’s call them echo chambers and epistemic bubbles. Both are social structures that systematically exclude sources of information. Both exaggerate their members’ confidence in their beliefs. But they work in entirely different ways, and they require very different modes of intervention. An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.

Current usage has blurred this crucial distinction, so let me introduce a somewhat artificial taxonomy. An ‘epistemic bubble’ is an informational network from which relevant voices have been excluded by omission. That omission might be purposeful: we might be selectively avoiding contact with contrary views because, say, they make us uncomfortable. As social scientists tell us, we like to engage in selective exposure, seeking out information that confirms our own worldview. But that omission can also be entirely inadvertent. Even if we’re not actively trying to avoid disagreement, our Facebook friends tend to share our views and interests. When we take networks built for social reasons and start using them as our information feeds, we tend to miss out on contrary views and run into exaggerated degrees of agreement.

The information is here.

Monday, April 16, 2018

The Seth Rich lawsuit matters more than the Stormy Daniels case

Jill Abramson
The Guardian
Originally published March 20, 2018

Here is an excerpt:

I’ve previously written about Fox News’ shameless coverage of the 2016 unsolved murder of a young former Democratic National Committee staffer named Seth Rich. Last week, ABC News reported that his family has filed a lawsuit against Fox, charging that several of its journalists fabricated a vile story attempting to link the hacked emails from Democratic National Committee computers to Rich, who worked there.

After the fabricated story ran on the Fox website, it was retracted, but not before various on-air stars, especially Trump mouthpiece Sean Hannity, flogged the bogus conspiracy theory suggesting Rich had something to do with the hacked messages.

This shameful episode demonstrated, once again, that Rupert Murdoch’s favorite network, and Trump’s, has no ethical compass and had no hesitation about what grief this manufactured story caused to the 26-year-old murder victim’s family. It’s good to see them striking back, since that is the only tactic that the Murdochs and Trumps of the world will respect or, perhaps, will force them to temper the calumny they spread on a daily basis.

Of course, the Rich lawsuit does not have the sex appeal of the Stormy case. The rightwing echo chamber will brazenly ignore its self-inflicted wounds. And, for the rest of the cable pundit brigades, the DNC emails and Rich are old news.

The article is here.

Monday, April 2, 2018

The Grim Conclusions of the Largest-Ever Study of Fake News

Robinson Meyer
The Atlantic
Originally posted March 8, 2018

Here is an excerpt:

“It seems to be pretty clear [from our study] that false information outperforms true information,” said Soroush Vosoughi, a data scientist at MIT who has studied fake news since 2013 and who led this study. “And that is not just because of bots. It might have something to do with human nature.”

The study has already prompted alarm from social scientists. “We must redesign our information ecosystem for the 21st century,” write a group of 16 political scientists and legal scholars in an essay also published Thursday in Science. They call for a new drive of interdisciplinary research “to reduce the spread of fake news and to address the underlying pathologies it has revealed.”

“How can we create a news ecosystem … that values and promotes truth?” they ask.

The new study suggests that it will not be easy. Though Vosoughi and his colleagues only focus on Twitter—the study was conducted using exclusive data which the company made available to MIT—their work has implications for Facebook, YouTube, and every major social network. Any platform that regularly amplifies engaging or provocative content runs the risk of amplifying fake news along with it.

The article is here.

Thursday, February 15, 2018

Declining Trust in Facts, Institutions Imposes Real-World Costs on U.S. Society

Rand Corporation
Released on January 16, 2018

Americans' reliance on facts to discuss public issues has declined significantly in the past two decades, leading to political paralysis and collapse of civil discourse, according to a RAND Corporation report.

This phenomenon, referred to as “Truth Decay,” is defined by increasing disagreement about facts, a blurring between opinion and fact, an increase in the relative volume of opinion and personal experience over fact, and declining trust in formerly respected sources of factual information.

While there is evidence of similar phenomena in earlier eras of U.S. history, the current manifestation of Truth Decay is exacerbated by changes in the ways Americans consume information—particularly via social media and cable news. Other influences that may make Truth Decay more intense today include political, economic, and social polarization that segments and divides the citizenry, the study finds.

These factors lead to Truth Decay's damaging consequences, such as political paralysis and uncertainty in national policy, which incur real costs. The government shutdown of 2013, which lasted 16 days, resulted in a $20 billion loss to the U.S. economy, according to estimates cited in the study.

The press release is here.

Tuesday, August 29, 2017

Must science be testable?

Massimo Pigliucci
Originally published August 10, 2016

Here is an excerpt:

That said, the publicly visible portion of the physics community nowadays seems split between people who are openly dismissive of philosophy and those who think they got the pertinent philosophy right but their ideological opponents haven’t. At stake isn’t just the usually tiny academic pie, but public appreciation of and respect for both the humanities and the sciences, not to mention millions of dollars in research grants (for the physicists, not the philosophers). Time, therefore, to take a more serious look at the meaning of Popper’s philosophy and why it is still very much relevant to science, when properly understood.

As we have seen, Popper’s message is deceptively simple, and – when repackaged in a tweet – has in fact deceived many a smart commentator in underestimating the sophistication of the underlying philosophy. If one were to turn that philosophy into a bumper sticker slogan it would read something like: ‘If it ain’t falsifiable, it ain’t science, stop wasting your time and money.’

But good philosophy doesn’t lend itself to bumper sticker summaries, so one cannot stop there and pretend that there is nothing more to say. Popper himself changed his mind throughout his career about a number of issues related to falsification and demarcation, as any thoughtful thinker would do when exposed to criticisms and counterexamples from his colleagues. For instance, he initially rejected any role for verification in establishing scientific theories, thinking that it was far too easy to ‘verify’ a notion if one were actively looking for confirmatory evidence. Sure enough, modern psychologists have a name for this tendency, common to laypeople as well as scientists: confirmation bias.

Nonetheless, later on Popper conceded that verification – especially of very daring and novel predictions – is part of a sound scientific approach. After all, the reason Einstein became a scientific celebrity overnight after the 1919 total eclipse is precisely because astronomers had verified the predictions of his theory all over the planet and found them in satisfactory agreement with the empirical data.

The article is here.

Thursday, June 22, 2017

Teaching Humility in an Age of Arrogance

Michael Patrick Lynch
The Chronicle of Higher Education
Originally published June 5, 2017

Here is an excerpt:

Our cultural embrace of epistemic or intellectual arrogance is the result of a toxic mix of technology, psychology, and ideology. To combat it, we have to reconnect with some basic values, including ones that philosophers have long thought were essential both to serious intellectual endeavors and to politics.

One of those ideas, as I just noted, is belief in objective truth. But another, less-noted concept is intellectual humility. By intellectual humility, I refer to a cluster of attitudes that we can take toward ourselves — recognizing your own fallibility, realizing that you don’t really know as much as you think, and owning your limitations and biases.

But being intellectually humble also means taking an active stance. It means seeing your worldview as open to improvement by the evidence and experience of other people. Being open to improvement is more than just being open to change. And it isn’t just a matter of self-improvement — using your genius to know even more. It is a matter of seeing your view as capable of improvement because of what others contribute.

Intellectual humility is not the same as skepticism. Improving your knowledge must start from a basis of rational conviction. That conviction allows you to know when to stop inquiring, when to realize that you know enough — that the earth really is round, the climate is warming, the Holocaust happened, and so on. That, of course, is tricky, and many a mistake in science and politics has been made because someone stopped inquiring before they should have. Hence the emphasis on evidence; being intellectually humble requires being responsive to the actual evidence, not to flights of fancy or conspiracy theories.

The article is here.