Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Intellectual Humility. Show all posts

Saturday, September 16, 2023

A Metacognitive Blindspot in Intellectual Humility Measures

Costello, T. H., Newton, C., Lin, H., & Pennycook, G.
(2023, August 6).

Abstract

Intellectual humility (IH) is commonly defined as recognizing the limits of one’s knowledge and abilities. However, most research has relied entirely on self-report measures of IH, without testing whether these instruments capture the metacognitive core of the construct. Across two studies (Ns = 898; 914), using generalized additive mixed models to detect complex non-linear interactions, we evaluated the correspondence between widely used IH self-reports and performance on calibration and resolution paradigms designed to model the awareness of one’s mental capabilities (and their fallibility). On an overconfidence paradigm (N observations per model = 2,692–2,742), none of five IH measures attenuated the Dunning-Kruger effect, whereby poor performers overestimate their abilities and high performers underestimate them. On a confidence-accuracy paradigm (N observations per model = 7,223–12,706), most IH measures were associated with inflated confidence regardless of accuracy, or were specifically related to confidence when participants were correct but not when they were incorrect. The sole exception was the “Lack of Intellectual Overconfidence” subscale of the Comprehensive Intellectual Humility Scale, which uniquely predicted lower confidence for incorrect responses. Meanwhile, measures of Actively Open-minded Thinking reliably predicted calibration and resolution. These findings reveal substantial discrepancies between IH self-reports and metacognitive abilities, suggesting most IH measures lack validity. It may not be feasible to assess IH via self-report, as indicating a great deal of humility may, itself, be a sign of a failure in humility.

General Discussion

IH represents the ability to identify the constraints of one’s psychological, epistemic, and cultural perspective — to conduct lay phenomenology, acknowledging that the default human perspective is (literally) self-centered (Wallace, 2009) — and thereby cultivate an awareness of the limits of a single person, theory, or ideology to describe the vast and searingly complex universe. It is a process that presumably involves effortful and vigilant noticing: tallying one’s epistemic track record, and especially one’s fallibility (Ballantyne, 2021).

IH, therefore, manifests dynamically in individuals as a boundary between one’s informational environment and one’s model of reality. This portrait of IH-as-boundary appears repeatedly in philosophical and psychological treatments of IH, which frequently frame awareness of (epistemic) limitations as IH’s conceptual, metacognitive core (Leary et al., 2017; Porter, Elnakouri, et al., 2022). Yet as with a limit in mathematics, epistemic limits are appropriately defined as functions: their value depends on inputs (e.g., information environment, access to knowledge) that vary across contexts and individuals. In particular, measuring IH requires identifying at least two quantities — one’s epistemic capabilities and one’s appraisal of said capabilities — from which a third, IH-qua-metacognition, can be derived as the distance between the two.

Contemporary IH self-reports tend not to account for either parameter, seeming to rest instead on an auxiliary assumption: That people who are attuned to, and “own”, their epistemic limitations will generate characteristic, intellectually humble patterns of thinking and behavior. IH questionnaires then target these patterns, rather than the shared propensity for IH which the patterns ostensibly reflect.

We sought to both test and circumvent this assumption (and mono-method measurement limitation) in the present research. We did so by defining IH’s metacognitive core, functionally and statistically, in terms of calibration and resolution. We operationalized calibration as the convergence between participants’ performance on a series of epistemic tasks, on the one hand, and participants’ estimation of their own performance, on the other. Given that the relation between self-estimation and actual performance is non-linear (i.e., the Dunning-Kruger effect), there were several pathways by which IH might predict calibration: (1) decreased overestimation among low performers, (2) decreased underestimation among high performers, or (3) unilateral weakening of miscalibration among both low and high performers (for a visual representation, refer to Figure 1). Further, we operationalized epistemic resolution by assessing the relation between IH, on the one hand, and individuals’ item-by-item confidence judgments for correct versus incorrect answers, on the other. Thus, resolution represents the capacity to distinguish between one’s correct and incorrect judgments and beliefs (a seemingly necessary prerequisite for building an accurate and calibrated model of one’s knowledge).
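The two quantities described above can be made concrete with a minimal sketch. This is not the authors’ analysis code or data; the function names, the illustrative numbers, and the simple difference-score definitions are assumptions for exposition only (the paper itself uses generalized additive mixed models, not raw difference scores).

```python
# Illustrative definitions of calibration and resolution, using made-up data.

def calibration_gap(estimated_pct, actual_pct):
    """Signed gap between self-estimated and actual performance percentile.
    Positive values indicate overestimation; negative, underestimation."""
    return estimated_pct - actual_pct

def resolution(confidences, correct):
    """Mean confidence on correct items minus mean confidence on incorrect
    items. Larger values mean better discrimination of one's own accuracy."""
    right = [c for c, ok in zip(confidences, correct) if ok]
    wrong = [c for c, ok in zip(confidences, correct) if not ok]
    return sum(right) / len(right) - sum(wrong) / len(wrong)

# A low performer who overestimates (the Dunning-Kruger pattern):
gap = calibration_gap(estimated_pct=70, actual_pct=20)  # positive gap

# Item-level confidence ratings paired with correctness:
res = resolution([0.9, 0.8, 0.6, 0.5], [True, True, False, False])
```

On these toy numbers, the low performer shows a large positive calibration gap, and the confidence ratings yield positive resolution because confidence is higher on correct than incorrect items.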

Sunday, June 12, 2022

You Were Right About COVID, and Then You Weren’t

Olga Khazan
The Atlantic
Originally posted May 3, 2022

Here are two excerpts:

Tenelle Porter, a psychologist at UC Davis, studies so-called intellectual humility, or the recognition that we have imperfect information and thus our beliefs might be wrong. Practicing intellectual humility, she says, is harder when you’re very active on the internet, or when you’re operating in a cutthroat culture. That might be why it pains me—a very online person working in the very competitive culture of journalism—to say that I was incredibly wrong about COVID at first. In late February 2020, when Smith was sounding the alarm among his co-workers, I had drinks with a colleague who asked me if I was worried about “this new coronavirus thing.”

“No!” I said. After all, I had covered swine flu, which blew over quickly and wasn’t very deadly.

A few days later, my mom called and asked me the same question. “People in Italy are staying inside their houses,” she pointed out.

“Yeah,” I said. “But SARS and MERS both stayed pretty localized to the regions they originally struck.”

Then, a few weeks later, when we were already working from home and buying dried beans, a friend asked me if she should be worried about her wedding, which was scheduled for October 2020.

“Are you kidding?” I said. “They will have figured out a vaccine or something by then.” Her wedding finally took place this month.

(cut)

Thinking like a scientist, or a scout, means “recognizing that every single one of your opinions is a hypothesis waiting to be tested. And every decision you make is an experiment where you forgot to have a control group,” Grant said. The best way to hold opinions or make predictions is to determine what you think given the state of the evidence—and then decide what it would take for you to change your mind. Not only are you committing to staying open-minded; you’re committing to the possibility that you might be wrong.

Because the coronavirus has proved volatile and unpredictable, we should evaluate it as a scientist would. We can’t hold so tightly to prior beliefs that we allow them to guide our behavior when the facts on the ground change. This might mean that we lose our masks one month and don them again the next, or reschedule an indoor party until after case numbers decrease. It might mean supporting strict lockdowns in the spring of 2020 but not in the spring of 2022. It might even mean closing schools again, if a new variant seems to attack children. We should think of masks and other COVID precautions not as shibboleths but like rain boots and umbrellas, as Ashish Jha, the White House coronavirus-response coordinator, has put it. There’s no sense in being pro- or anti-umbrella. You just take it out when it’s raining.

Thursday, June 14, 2018

The Benefits of Admitting When You Don’t Know

Tenelle Porter
Behavioral Scientist
Originally published April 30, 2018

Here is an excerpt:

We found that the more intellectually humble students were more motivated to learn and more likely to use effective metacognitive strategies, like quizzing themselves to check their own understanding. They also ended the year with higher grades in math. We also found that the teachers, who hadn’t seen students’ intellectual humility questionnaires, rated the more intellectually humble students as more engaged in learning.

Next, we moved into the lab. Could temporarily boosting intellectual humility make people more willing to seek help in an area of intellectual weakness? We induced intellectual humility in half of our participants by having them read a brief article that described the benefits of admitting what you do not know. The other half read an article about the benefits of being very certain of what you know. We then measured their intellectual humility.

Those who read the benefits-of-humility article self-reported higher intellectual humility than those in the other group. What’s more, in a follow-up exercise, 85 percent of these same participants sought extra help for an area of intellectual weakness. By contrast, only 65 percent of the participants who read about the benefits of being certain sought the extra help that they needed. This experiment provided evidence that enhancing intellectual humility has the potential to affect students’ actual learning behavior.

Together, our findings illustrate that intellectual humility is associated with a host of outcomes that we think are important for learning in school, and they suggest that boosting intellectual humility may have benefits for learning.

The article is here.

Thursday, June 22, 2017

Teaching Humility in an Age of Arrogance

Michael Patrick Lynch
The Chronicle of Higher Education
Originally published June 5, 2017

Here is an excerpt:

Our cultural embrace of epistemic or intellectual arrogance is the result of a toxic mix of technology, psychology, and ideology. To combat it, we have to reconnect with some basic values, including ones that philosophers have long thought were essential both to serious intellectual endeavors and to politics.

One of those ideas, as I just noted, is belief in objective truth. But another, less-noted concept is intellectual humility. By intellectual humility, I refer to a cluster of attitudes that we can take toward ourselves — recognizing your own fallibility, realizing that you don’t really know as much as you think, and owning your limitations and biases.

But being intellectually humble also means taking an active stance. It means seeing your worldview as open to improvement by the evidence and experience of other people. Being open to improvement is more than just being open to change. And it isn’t just a matter of self-improvement — using your genius to know even more. It is a matter of seeing your view as capable of improvement because of what others contribute.

Intellectual humility is not the same as skepticism. Improving your knowledge must start from a basis of rational conviction. That conviction allows you to know when to stop inquiring, when to realize that you know enough — that the earth really is round, the climate is warming, the Holocaust happened, and so on. That, of course, is tricky, and many a mistake in science and politics has been made because someone stopped inquiring before they should have. Hence the emphasis on evidence; being intellectually humble requires being responsive to the actual evidence, not to flights of fancy or conspiracy theories.

The article is here.

Saturday, March 4, 2017

How ‘Intellectual Humility’ Can Make You a Better Person

Cindy Lamothe
The Science of Us
Originally posted February 3, 2017

There’s a well-known Indian parable about six blind men who argue at length about what an elephant feels like. Each has a different idea, and each holds fast to his own view. “It’s like a rope,” says the man who touched the tail. “Oh no, it’s more like the solid branch of a tree,” contends the one who touched the trunk. And so on and so forth, and round and round they go.

The moral of the story: We all have a tendency to overestimate how much we know — which, in turn, means that we often cling stubbornly to our beliefs while tuning out opinions different from our own. We generally believe we’re better or more correct than everyone else, or at least better than most people — a psychological quirk that’s as true for politics and religion as it is for things like fashion and lifestyles. And in a time when it seems like we’re all more convinced than ever of our own rightness, social scientists have begun to look more closely at an antidote: a concept called intellectual humility.

Unlike general humility — which is defined by traits like sincerity, honesty, and unselfishness — intellectual humility has to do with understanding the limits of one’s knowledge. It’s a state of openness to new ideas, a willingness to be receptive to new sources of evidence, and it comes with significant benefits: People with intellectual humility are both better learners and better able to engage in civil discourse. Google’s VP in charge of hiring, Laszlo Bock, has claimed it as one of the top qualities he looks for in a candidate: Without intellectual humility, he has said, “you are unable to learn.”

The article is here.