Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Overestimation.

Tuesday, June 14, 2022

Minority salience and the overestimation of individuals from minority groups in perception and memory

R. Kadosh, A. Y. Sklar, et al.
PNAS (2022), 119(12), 1-10.

Abstract

Our cognitive system is tuned toward spotting the uncommon and unexpected. We propose that individuals coming from minority groups are, by definition, just that—uncommon and often unexpected. Consequently, they are psychologically salient in perception, memory, and visual awareness. This minority salience creates a tendency to overestimate the prevalence of minorities, leading to an erroneous picture of our social environments—an illusion of diversity. In 12 experiments with 942 participants, we found evidence that the presence of minority group members is indeed overestimated in memory and perception and that masked images of minority group members are prioritized for visual awareness. These findings held whether participants were members of the majority group or of the minority group. Moreover, this overestimated prevalence of minorities led to decreased support for diversity-promoting policies. We discuss the theoretical implications of the illusion of diversity and how it may inform more equitable and inclusive decision-making.

Significance

Our minds are tuned to the uncommon or unexpected in our environment. In most environments, members of minority groups are just that—uncommon. Therefore, the cognitive system is tuned to spotting their presence. Our results indicate that individuals from minority groups are salient in perception, memory, and visual awareness. As a result, we consistently overestimate their presence—leading to an illusion of diversity: the environment seems to be more diverse than it actually is, decreasing our support for diversity-promoting measures. As we try to make equitable decisions, it is important that private individuals and decision-makers alike become aware of this biased perception. While these sorts of biases can be counteracted, one must first be aware of the bias.

Discussion

Taken together, our results from 12 experiments and 942 participants indicate that minority salience and overestimation are robust phenomena. We consistently overestimate the prevalence of individuals from minority groups and underestimate the prevalence of members of the majority group, thus perceiving our social environments as more diverse than they truly are. Our experiments also indicate that this effect may be found at the level of priority for visual awareness and that it is social in nature: our social knowledge, our representation of the overall composition of our social environment, shapes this effect. Importantly, this illusion of diversity is consequential in that it leads to less support for measures to increase diversity.

Tuesday, September 22, 2020

How to be an ethical scientist

W. A. Cunningham, J. J. Van Bavel, & L. H. Somerville
Science Magazine
Originally posted August 5, 2020

True discovery takes time, has many stops and starts, and is rarely neat and tidy. For example, news that the Higgs boson was finally observed in 2012 came 48 years after its original proposal by Peter Higgs. The slow pace of science helps ensure that research is done correctly, but it can come into conflict with the incentive structure of academic progress, as publications—the key marker of productivity in many disciplines—depend on research findings. Even Higgs recognized this problem with the modern academic system: “Today I wouldn't get an academic job. It's as simple as that. I don't think I would be regarded as productive enough.”

It’s easy to forget about the “long view” when there is constant pressure to produce. So, in this column, we’re going to focus on the type of long-term thinking that advances science. For example, are you going to cut corners to get ahead, or take a slow, methodical approach? What will you do if your experiment doesn’t turn out as expected? Without reflecting on these deeper issues, we can get sucked into the daily goals necessary for success while failing to see the long-term implications of our actions.

Thinking carefully about these issues will not only shape your own career outcomes, but it can also affect others. Your own decisions and actions affect those around you, including your labmates, your collaborators, and your academic advisers. Our goal is to help you avoid pitfalls and find an approach that will allow you to succeed without impairing the broader goals of science.

Be open to being wrong

Science often advances through accidental (but replicable) findings. The logic is simple: If studies always came out exactly as you anticipated, then nothing new would ever be learned. Our previous theories of the world would be just as good as they ever were. This is why scientific discovery is often most profound when you stumble on something entirely new. Isaac Asimov put it best when he said, “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny ... .’”

The info is here.

Friday, July 6, 2018

People who think their opinions are superior to others are most prone to overestimating their relevant knowledge and ignoring chances to learn more

Tom Stafford
Blog Post: Research Digest
Originally posted May 31, 2018

Here is an excerpt:

Finally, and more promisingly, the researchers found some evidence that belief superiority can be dented by feedback. If participants were told that people with beliefs like theirs tended to score poorly on topic knowledge, or if they were directly told that their score on the topic knowledge quiz was low, this not only reduced their belief superiority but also caused them to seek out the kind of challenging information they had previously neglected in the headlines task (though the evidence for this behavioural effect was mixed).

The studies all involved participants accessed via Amazon’s Mechanical Turk, allowing the researchers to work with large samples of Americans for each experiment. Their findings mirror the well-known Dunning-Kruger effect – Kruger and Dunning showed that for domains such as judgments of grammar, humour or logic, the most skilled tend to underestimate their ability, while the least skilled overestimate it. Hall and Raimi’s research extends this to the realm of political opinions (where objective assessment of correctness is not available), showing that the belief your opinion is better than other people’s tends to be associated with overestimation of your relevant knowledge.

The article is here.

Friday, July 22, 2016

What This White-Collar Felon Can Teach You About Your Temptation To Cross That Ethical Line

Ron Carucci
Forbes.com
Originally posted June 28, 2016

The sobering truth of Law Professor Donald Langevoort’s words silenced the room like a loud mic-drop: “We’re not as ethical as we think we are.” Participants at Ethical Systems’ recent Ethics By Design conference were visibly uncomfortable…because they all knew it was true.

Research strongly indicates that people overestimate how strong their ethics are. I wanted to learn more about why genuinely honest people can be lured to cross lines about which they surely would have predicted, “I would never do that!”

The article is here.