Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Thursday, April 19, 2018

Common Sense for A.I. Is a Great Idea

Carissa Veliz
www.slate.com
Originally posted March 19, 2018

At the moment, artificial intelligences may have perfect memories and be better at arithmetic than us, but they are clueless. It takes a few seconds of interaction with any digital assistant to realize one is not in the presence of a very bright interlocutor. Among the unexpected items users have found in their shopping lists after talking to (or near) Amazon’s Alexa are 150,000 bottles of shampoo, sled dogs, “hunk of poo,” and a girlfriend.

The mere exasperation of talking to a digital assistant can be enough to miss human companionship, feel nostalgia for all things analog and dumb, and forswear any future attempts at communicating with mindless pieces of metal inexplicably labelled “smart.” (Not to mention all the privacy issues.) A.I. not understanding what a shopping list is, and the kinds of items that are appropriate to such lists, is evidence of a much broader problem: It lacks common sense.

The Allen Institute for Artificial Intelligence, or AI2, created by Microsoft co-founder Paul Allen, has announced it is embarking on a new $125 million research initiative to try to change that. “To make real progress in A.I., we have to overcome the big challenges in the area of common sense,” Allen told the New York Times. AI2 takes common sense to include the “infinite set of facts, heuristics, observations … that we bring to the table when we address a problem, but the computer doesn’t.” Researchers will use a combination of crowdsourcing, machine learning, and machine vision to create a huge “repository of knowledge” that will bring about common sense. Of paramount importance among its uses is to get A.I. to “understand what’s harmful to people.”

The information is here.

Monday, March 12, 2018

Train PhD students to be thinkers not just specialists

Gundula Bosch
nature.com
Originally posted February 14, 2018

Under pressure to turn out productive lab members quickly, many PhD programmes in the biomedical sciences have shortened their courses, squeezing out opportunities for putting research into its wider context. Consequently, most PhD curricula are unlikely to nurture the big thinkers and creative problem-solvers that society needs.

That means students are taught every detail of a microbe’s life cycle but little about the life scientific. They need to be taught to recognize how errors can occur. Trainees should evaluate case studies derived from flawed real research, or use interdisciplinary detective games to find logical fallacies in the literature. Above all, students must be shown the scientific process as it is — with its limitations and potential pitfalls as well as its fun side, such as serendipitous discoveries and hilarious blunders.

This is exactly the gap that I am trying to fill at Johns Hopkins University in Baltimore, Maryland, where a new graduate science programme is entering its second year. Microbiologist Arturo Casadevall and I began pushing for reform in early 2015, citing the need to put the philosophy back into the doctorate of philosophy: that is, the ‘Ph’ back into the PhD.

The article is here.

Monday, March 5, 2018

Donald Trump and the rise of tribal epistemology

David Roberts
Vox.com
Originally posted May 19, 2017 and still extremely important

Here is an excerpt:

Over time, this leads to what you might call tribal epistemology: Information is evaluated based not on conformity to common standards of evidence or correspondence to a common understanding of the world, but on whether it supports the tribe’s values and goals and is vouchsafed by tribal leaders. “Good for our side” and “true” begin to blur into one.

Now tribal epistemology has found its way to the White House.

Donald Trump and his team represent an assault on almost every American institution — they make no secret of their desire to “deconstruct the administrative state” — but their hostility toward the media is unique in its intensity.

It is Trump’s obsession and favorite target. He sees himself as waging a “running war” on the mainstream press, which his consigliere Steve Bannon calls “the opposition party.”

The article is here.

Tuesday, February 27, 2018

After long battle, mental health will be part of New York's school curriculum

Bethany Bump
Times Union
Originally published January 27, 2018

Here is an excerpt:

The idea of teaching young people about mental health is not a new one.

The mental hygiene movement of the early 1900s introduced society to the concept that mental wellness could be just as important as physical wellness.

In 1928, a nationwide group of superintendents recommended that mental hygiene be included in the teaching of health education, but it was not.

"When you talk about mental health and mental illness, people are still, because of the stigma, in the closet about it," Liebman said. "People just don't talk about it like they talk about physical illness."

Social media has strengthened the movement to de-stigmatize mental illness, he said. "People are being more candid about their mental health issues and seeking support and using social media as kind of a fulcrum for gaining support, peers and friends in their recovery," Liebman said.

Making the case

Advocates of the law want people to know they are not pushing for students or schoolteachers to become diagnosticians. They say that is best left to professionals.

Adding mental health literacy to the curriculum will give youth the knowledge to prevent mental disorders, recognize when a disorder is developing, know how and where to seek help and treatment, develop strategies for dealing with milder issues, and support others who are struggling.

The information is here.

Monday, February 26, 2018

Business ethics: am I boring you?

Katherine Bradshaw
The Guardian
Originally published November 8, 2012

Here is an excerpt:

We need to bridge the gap between ethics programmes and daily worklife – and stories can help us do that.

No matter how sophisticated we are as a society, stories continue to be our preferred way of communicating and sharing our experiences of life. From a book at bedtime to the latest cliffhanger of our favourite soap, stories help us connect and communicate our emotions and values with each other.

Business ethics training at its worst can include material which seems distant to staff and how they do their day-to-day job. A set of compliance diktats communicated with slides animated with clip art, or an eLearning programme with easy multiple choice questions conducted in isolation, is unlikely to engage anyone with what really matters.

Ethical values need to be embedded into company culture so that they are reflected in the way that business is actually done. This requires an ethics programme with objectives beyond just imparting knowledge and raising awareness of expected standards – the challenge is to communicate their relevance and importance at all levels and locations in a way that impacts on understanding, decisions and behaviours.

The article is here.

Friday, February 16, 2018

The Scientism of Psychiatry

Sami Timimi
Mad in America
Originally posted January 10, 2018

Here is an excerpt:

Mainstream psychiatry has been afflicted by at least two types of scientism. Firstly, it parodies science as ideology, liking to talk in scientific language, using the language of EBM, and carrying out research that ‘looks’ scientific (such as brain scanning). Psychiatry wants to be seen as residing in the same scientific cosmology as the rest of medicine. Yet the cupboard of actual clinically relevant findings remains pretty empty. Secondly, it ignores much of the genuine science there is and goes on supporting and perpetuating concepts and treatments that have little scientific support. This is a more harmful and deceptive form of scientism; it means that psychiatry likes to talk in the language of science and treats this as more important than the actual science.

I have had debates with fellow psychiatrists on many aspects of the actual evidence base. Two ‘defences’ have become familiar to me. The first is use of anecdote — such and such a patient got better with such and such a treatment, therefore, this treatment ‘works.’ Anecdote is precisely what EBM was trying to get away from. The second is an appeal for me to take a ‘balanced’ perspective. Of course each person’s idea of what is a ‘balanced’ position depends on where they are sitting. We get our ideas on what is ‘balanced’ from what is culturally dominant, not from what the science is telling us. At one point, to many people, Nelson Mandela was a violent terrorist; later, to many more people, he became the embodiment of peaceful reconciliation and forgiveness. What were considered ‘balanced’ views on him were almost polar opposites, depending on where and when you were examining him from. Furthermore, in science facts are simply that. Our interpretations are of course based on our reading of these facts. Providing an interpretation consistent with the facts is more important than any one person’s notion of what a ‘balanced’ position should look like.

The article is here.

Thursday, February 15, 2018

Declining Trust in Facts, Institutions Imposes Real-World Costs on U.S. Society

Rand Corporation
Press release
Released on January 16, 2018

Americans' reliance on facts to discuss public issues has declined significantly in the past two decades, leading to political paralysis and collapse of civil discourse, according to a RAND Corporation report.

This phenomenon, referred to as “Truth Decay,” is defined by increasing disagreement about facts, a blurring between opinion and fact, an increase in the relative volume of opinion and personal experience over fact, and declining trust in formerly respected sources of factual information.

While there is evidence of similar phenomena in earlier eras in U.S. history, the current manifestation of Truth Decay is exacerbated by changes in the ways Americans consume information—particularly via social media and cable news. Other influences that may make Truth Decay more intense today include political, economic and social polarization that segment and divide the citizenry, the study finds.

These factors lead to Truth Decay's damaging consequences, such as political paralysis and uncertainty in national policy, which incur real costs. The government shutdown of 2013, which lasted 16 days, resulted in a $20 billion loss to the U.S. economy, according to estimates cited in the study.

The press release is here.

Thursday, February 1, 2018

How to Counter the Circus of Pseudoscience

Lisa Pryor
The New York Times
Originally published January 5, 2018

Here are two excerpts:

In the face of such doubt, it is not surprising that some individuals, even those who are intelligent and well educated, are swept away by the breezy confidence of health gurus, who are full of passionate intensity while the qualified lack all conviction, to borrow from Yeats.

It is a cognitive bias known in psychology as the Dunning-Kruger Effect. In short, the less you know, the less able you are to recognize how little you know, so the less likely you are to recognize your errors and shortcomings. For the highly skilled, like trained scientists, the opposite is true: The more you know, the more likely you are to see how little you know. This is truly a cognitive bias for our time.

(cut)

Engaging is difficult when the alternative-health proponents are on such a different astral plane that it is a challenge even to find common language for a conversation, especially when they promote spurious concepts such as “pyrrole disease,” which they can speak about in great, false detail, drawing the well-informed physician, dietitian or scientist into a vortex of personal anecdote and ancient wisdom, with quips about big pharma thrown in for good measure.

The information is here.

Friday, January 26, 2018

Power Causes Brain Damage

Jerry Useem
The Atlantic
Originally published July 2017

Here is an excerpt:

This is a depressing finding. Knowledge is supposed to be power. But what good is knowing that power deprives you of knowledge?

The sunniest possible spin, it seems, is that these changes are only sometimes harmful. Power, the research says, primes our brain to screen out peripheral information. In most situations, this provides a helpful efficiency boost. In social ones, it has the unfortunate side effect of making us more obtuse. Even that is not necessarily bad for the prospects of the powerful, or the groups they lead. As Susan Fiske, a Princeton psychology professor, has persuasively argued, power lessens the need for a nuanced read of people, since it gives us command of resources we once had to cajole from others. But of course, in a modern organization, the maintenance of that command relies on some level of organizational support. And the sheer number of examples of executive hubris that bristle from the headlines suggests that many leaders cross the line into counterproductive folly.

Less able to make out people’s individuating traits, they rely more heavily on stereotype. And the less they’re able to see, other research suggests, the more they rely on a personal “vision” for navigation. John Stumpf saw a Wells Fargo where every customer had eight separate accounts. (As he’d often noted to employees, eight rhymes with great.) “Cross-selling,” he told Congress, “is shorthand for deepening relationships.”

The article is here.

Friday, December 29, 2017

Freud in the scanner

M. M. Owen
aeon.co
Originally published December 7, 2017

Here is an excerpt:

This is why Freud is less important to the field than what Freud represents. Researching this piece, I kept wondering: why hang on to Freud? He is an intensely polarising figure, so polarising that through the 1980s and ’90s there raged the so-called Freud Wars, fighting on one side of which were a whole team of authors driven (as the historian of science John Forrester put it in 1997) by the ‘heartfelt wish that Freud might never have been born or, failing to achieve that end, that all his works and influence be made as nothing’. Indeed, a basic inability to track down anyone with a dispassionate take on psychoanalysis was a frustration of researching this essay. The certitude that whatever I write here will enrage some readers hovers at the back of my mind as I think ahead to skimming the comments section. Preserve subjectivity, I thought, fine, I’m onboard. But why not eschew the heavily contested Freudianism for the psychotherapy of Irvin D Yalom, which takes an existentialist view of the basic challenges of life? Why not embrace Viktor Frankl’s logotherapy, which prioritises our fundamental desire to give life meaning, or the philosophical tradition of phenomenology, whose first principle is that subjectivity precedes all else?

Within neuropsychoanalysis, though, Freud symbolises the fact that, to quote the neuroscientist Ramachandran’s Phantoms in the Brain (1998), you can ‘look for laws of mental life in much the same way that a cardiologist might study the heart or an astronomer study planetary motion’. And on the clinical side, it is simply a fact that before Freud there was really no such thing as therapy, as we understand that word today. In Yalom’s novel When Nietzsche Wept (1992), Josef Breuer, Freud’s mentor, is at a loss for how to counsel the titular German philosopher out of his despair: ‘There is no medicine for despair, no doctor for the soul,’ he says. All Breuer can recommend are therapeutic spas, ‘or perhaps a talk with a priest’.

The article is here.

Saturday, November 25, 2017

Rather than being free of values, good science is transparent about them

Kevin Elliott
The Conversation
Originally published November 8, 2017

Scientists these days face a conundrum. As Americans are buffeted by accounts of fake news, alternative facts and deceptive social media campaigns, how can researchers and their scientific expertise contribute meaningfully to the conversation?

There is a common perception that science is a matter of hard facts and that it can and should remain insulated from the social and political interests that permeate the rest of society. Nevertheless, many historians, philosophers and sociologists who study the practice of science have come to the conclusion that trying to kick values out of science risks throwing the baby out with the bathwater.

Ethical and social values – like the desire to promote economic development, public health or environmental protection – often play integral roles in scientific research. By acknowledging this, scientists might seem to give away their authority as a defense against the flood of misleading, inaccurate information that surrounds us. But I argue in my book “A Tapestry of Values: An Introduction to Values in Science” that if scientists take appropriate steps to manage and communicate about their values, they can promote a more realistic view of science as both value-laden and reliable.

The article is here.

Wednesday, November 22, 2017

Many Academics Are Eager to Publish in Worthless Journals

Gina Kolata
The New York Times
Originally published October 30, 2017

Here is an excerpt:

Yet “every university requires some level of publication,” said Lawrence DiPaolo, vice president of academic affairs at Neumann University in Aston, Pa.

Recently a group of researchers invented a fake academic: Anna O. Szust. The name in Polish means fraudster. Dr. Szust applied to legitimate and predatory journals asking to be an editor. She supplied a résumé in which her publications and degrees were total fabrications, as were the names of the publishers of the books she said she had contributed to.

The legitimate journals rejected her application immediately. But 48 out of 360 questionable journals made her an editor. Four made her editor in chief. One journal sent her an email saying, “It’s our pleasure to add your name as our editor in chief for the journal with no responsibilities.”

The lead author of the Dr. Szust sting operation, Katarzyna Pisanski, a psychologist at the University of Sussex in England, said the question of what motivates people to publish in such journals “is a touchy subject.”

“If you were tricked by spam email you might not want to admit it, and if you did it wittingly to increase your publication counts you might also not want to admit it,” she said in an email.

The consequences of participating can be more than just a résumé freckled with poor-quality papers and meeting abstracts.

Publications become part of the body of scientific literature.

There are indications that some academic institutions are beginning to wise up to the dangers.

Dewayne Fox, an associate professor of fisheries at Delaware State University, sits on a committee at his school that reviews job applicants. One recent applicant, he recalled, listed 50 publications in such journals and is on the editorial boards of some of them.

A few years ago, he said, no one would have noticed. But now he and others on search committees at his university have begun scrutinizing the publications closely to see if the journals are legitimate.

The article is here.

Monday, November 20, 2017

Why we pretend to know things, explained by a cognitive scientist

Sean Illing
Vox.com
Originally posted November 3, 2017

Why do people pretend to know things? Why does confidence so often scale with ignorance? Steven Sloman, a professor of cognitive science at Brown University, has some compelling answers to these questions.

“We're biased to preserve our sense of rightness,” he told me, “and we have to be.”

The author of The Knowledge Illusion: Why We Never Think Alone, Sloman’s research focuses on judgment, decision-making, and reasoning. He’s especially interested in what’s called “the illusion of explanatory depth.” This is how cognitive scientists refer to our tendency to overestimate our understanding of how the world works.

We do this, Sloman says, because of our reliance on other minds.

“The decisions we make, the attitudes we form, the judgments we make, depend very much on what other people are thinking,” he said.

If the people around us are wrong about something, there’s a good chance we will be too. Proximity to truth compounds in the same way.

In this interview, Sloman and I talk about the problem of unjustified belief. I ask him about the political implications of his research, and if he thinks the rise of “fake news” and “alternative facts” has amplified our cognitive biases.

The interview/article is here.

Thursday, September 28, 2017

What’s Wrong With Voyeurism?

David Boonin
What's Wrong?
Originally posted August 31, 2017

The publication last year of The Voyeur’s Motel, Gay Talese’s controversial account of a Denver area motel owner who purportedly spent several decades secretly observing the intimate lives of his customers, raised a number of difficult ethical questions.  Here I want to focus on just one: does the peeping Tom who is never discovered harm his victims?

The peeping Tom profiled in Talese’s book certainly doesn’t think so.  In an excerpt that appeared in the New Yorker in advance of the book’s publication, Talese reports that Gerald Foos, the proprietor in question, repeatedly insisted that his behavior was “harmless” on the grounds that his “guests were unaware of it.”  Talese himself does not contradict the subject of his account on this point, and Foos’s assertion seems to be grounded in a widely accepted piece of conventional wisdom, one that often takes the form of the adage that “what you don’t know can’t hurt you”.  But there’s a problem with this view of harm, and thus a problem with the view that voyeurism, when done successfully, is a harmless vice.

The blog post is here.

Wednesday, August 16, 2017

What Does Patient Autonomy Mean for Doctors and Drug Makers?

Christina Sandefur
The Conversation
Originally published July 26, 2017

Here is an excerpt:

Although Bateman-House fears that deferring to patients comes at the expense of physician autonomy, she also laments that physicians currently abuse the freedom they have, failing to spend enough time with their patients, which she says undermines a patient’s ability to make informed medical decisions.

Even if it’s true that physician consultations aren’t as thorough as they once were, patients today have better access to health care information than ever before. According to the Pew Research Center, two-thirds of U.S. adults have broadband internet in their homes, and 13 percent who lack it can access the internet through a smartphone. Pew reports that more than half of adult internet users go online to get information on medical conditions, 43 percent on treatments, and 16 percent on drug safety. Yet despite their desire to research these issues online, 70 percent still sought out additional information from a doctor or other professional.

In other words, people are making greater efforts to learn about health care on their own. True, not all such information on the internet is accurate. But encouraging patients to seek out information from multiple sources is a good thing. In fact, requiring government approval of treatments may lull patients into a false sense of security. As Connor Boyack, president of the Libertas Institute, points out, “Instead of doing their own due diligence and research, the overwhelming majority of people simply concern themselves with whether or not the FDA says a certain product is okay to use.” But blind reliance on a government bureaucracy is rarely a good idea.

The article can be found here.

Thursday, June 22, 2017

Teaching Humility in an Age of Arrogance

Michael Patrick Lynch
The Chronicle of Higher Education
Originally published June 5, 2017

Here is an excerpt:

Our cultural embrace of epistemic or intellectual arrogance is the result of a toxic mix of technology, psychology, and ideology. To combat it, we have to reconnect with some basic values, including ones that philosophers have long thought were essential both to serious intellectual endeavors and to politics.

One of those ideas, as I just noted, is belief in objective truth. But another, less-noted concept is intellectual humility. By intellectual humility, I refer to a cluster of attitudes that we can take toward ourselves — recognizing your own fallibility, realizing that you don’t really know as much as you think, and owning your limitations and biases.

But being intellectually humble also means taking an active stance. It means seeing your worldview as open to improvement by the evidence and experience of other people. Being open to improvement is more than just being open to change. And it isn’t just a matter of self-improvement — using your genius to know even more. It is a matter of seeing your view as capable of improvement because of what others contribute.

Intellectual humility is not the same as skepticism. Improving your knowledge must start from a basis of rational conviction. That conviction allows you to know when to stop inquiring, when to realize that you know enough — that the earth really is round, the climate is warming, the Holocaust happened, and so on. That, of course, is tricky, and many a mistake in science and politics has been made because someone stopped inquiring before they should have. Hence the emphasis on evidence; being intellectually humble requires being responsive to the actual evidence, not to flights of fancy or conspiracy theories.

The article is here.

Monday, May 15, 2017

Cassandra’s Regret: The Psychology of Not Wanting to Know

Gigerenzer, Gerd; Garcia-Retamero, Rocio
Psychological Review, Vol 124(2), Mar 2017, 179-196.

Abstract

Ignorance is generally pictured as an unwanted state of mind, and the act of willful ignorance may raise eyebrows. Yet people do not always want to know, demonstrating a lack of curiosity at odds with theories postulating a general need for certainty, ambiguity aversion, or the Bayesian principle of total evidence. We propose a regret theory of deliberate ignorance that covers both negative feelings that may arise from foreknowledge of negative events, such as death and divorce, and positive feelings of surprise and suspense that may arise from foreknowledge of positive events, such as knowing the sex of an unborn child. We conduct the first representative nationwide studies to estimate the prevalence and predictability of deliberate ignorance for a sample of 10 events. Its prevalence is high: Between 85% and 90% of people would not want to know about upcoming negative events, and 40% to 70% prefer to remain ignorant of positive events. Only 1% of participants consistently wanted to know. We also deduce and test several predictions from the regret theory: Individuals who prefer to remain ignorant are more risk averse and more frequently buy life and legal insurance. The theory also implies the time-to-event hypothesis, which states that for the regret-prone, deliberate ignorance is more likely the nearer the event approaches. We cross-validate these findings using 2 representative national quota samples in 2 European countries. In sum, we show that deliberate ignorance exists, is related to risk aversion, and can be explained as avoiding anticipatory regret.

The article is here.

Sunday, April 30, 2017

Why Expertise Matters

Adam Frank
npr.org
Originally posted on April 7, 2017

Here is an excerpt:

The attack on expertise was given its most visceral form by British politician Michael Gove during the Brexit campaign last year when he famously claimed, "people in this country have had enough of experts." The same kinds of issues, however, are also at stake here in the U.S. in our discussions about "alternative facts," "fake news" and "denial" of various kinds. That issue can be put as a simple question: When does one opinion count more than another?

By definition, an expert is someone whose learning and experience lets them understand a subject deeper than you or I do (assuming we're not an expert in that subject, too). The weird thing about having to write this essay at all is this: Who would have a problem with that? Doesn't everyone want their brain surgery done by an expert surgeon rather than the guy who fixes their brakes? On the other hand, doesn't everyone want their brakes fixed by an expert auto mechanic rather than a brain surgeon who has never fixed a flat?

Every day, all of us entrust our lives to experts from airline pilots to pharmacists. Yet, somehow, we've come to a point where people can put their ignorance on a subject of national importance on display for all to see — and then call it a virtue.

Here at 13.7, we've seen this phenomenon many times. When we had a section for comments, it would quickly fill up with statements like "the climate is always changing" or "CO2 is a trace gas so it doesn't matter" when we posted pieces on the science of climate change.

The article is here.

Tuesday, March 28, 2017

Why We Believe Obvious Untruths

Philip Fernbach & Steven Sloman
The New York Times
Originally published March 3, 2017

How can so many people believe things that are demonstrably false? The question has taken on new urgency as the Trump administration propagates falsehoods about voter fraud, climate change and crime statistics that large swaths of the population have bought into. But collective delusion is not new, nor is it the sole province of the political right. Plenty of liberals believe, counter to scientific consensus, that G.M.O.s are poisonous, and that vaccines cause autism.

The situation is vexing because it seems so easy to solve. The truth is obvious if you bother to look for it, right? This line of thinking leads to explanations of the hoodwinked masses that amount to little more than name calling: “Those people are foolish” or “Those people are monsters.”

Such accounts may make us feel good about ourselves, but they are misguided and simplistic: They reflect a misunderstanding of knowledge that focuses too narrowly on what goes on between our ears. Here is the humbler truth: On their own, individuals are not well equipped to separate fact from fiction, and they never will be. Ignorance is our natural state; it is a product of the way the mind works.

What really sets human beings apart is not our individual mental capacity. The secret to our success is our ability to jointly pursue complex goals by dividing cognitive labor. Hunting, trade, agriculture, manufacturing — all of our world-altering innovations — were made possible by this ability. Chimpanzees can surpass young children on numerical and spatial reasoning tasks, but they cannot come close on tasks that require collaborating with another individual to achieve a goal. Each of us knows only a little bit, but together we can achieve remarkable feats.

Saturday, March 4, 2017

How ‘Intellectual Humility’ Can Make You a Better Person

Cindy Lamothe
The Science of Us
Originally posted February 3, 2017

There’s a well-known Indian parable about six blind men who argue at length about what an elephant feels like. Each has a different idea, and each holds fast to his own view. “It’s like a rope,” says the man who touched the tail. “Oh no, it’s more like the solid branch of a tree,” contends the one who touched the trunk. And so on and so forth, and round and round they go.

The moral of the story: We all have a tendency to overestimate how much we know — which, in turn, means that we often cling stubbornly to our beliefs while tuning out opinions different from our own. We generally believe we’re better or more correct than everyone else, or at least better than most people — a psychological quirk that’s as true for politics and religion as it is for things like fashion and lifestyles. And in a time when it seems like we’re all more convinced than ever of our own rightness, social scientists have begun to look more closely at an antidote: a concept called intellectual humility.

Unlike general humility — which is defined by traits like sincerity, honesty, and unselfishness — intellectual humility has to do with understanding the limits of one’s knowledge. It’s a state of openness to new ideas, a willingness to be receptive to new sources of evidence, and it comes with significant benefits: People with intellectual humility are both better learners and better able to engage in civil discourse. Google’s VP in charge of hiring, Laszlo Bock, has claimed it as one of the top qualities he looks for in a candidate: Without intellectual humility, he has said, “you are unable to learn.”

The article is here.