Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Welcome to the nexus of ethics, psychology, morality, technology, health care, and philosophy
Showing posts with label Belief.

Thursday, April 6, 2023

People recognize and condone their own morally motivated reasoning

Cusimano, C., & Lombrozo, T. (2023).
Cognition, 234, 105379.

Abstract

People often engage in biased reasoning, favoring some beliefs over others even when the result is a departure from impartial or evidence-based reasoning. Psychologists have long assumed that people are unaware of these biases and operate under an “illusion of objectivity.” We identify an important domain of life in which people harbor little illusion about their biases – when they are biased for moral reasons. For instance, people endorse and feel justified believing morally desirable propositions even when they think they lack evidence for them (Study 1a/1b). Moreover, when people engage in morally desirable motivated reasoning, they recognize the influence of moral biases on their judgment, but nevertheless evaluate their reasoning as ideal (Studies 2–4). These findings overturn longstanding assumptions about motivated reasoning and identify a boundary condition on Naïve Realism and the Bias Blind Spot. People’s tendency to be aware and proud of their biases provides both new opportunities and new challenges for resolving ideological conflict and improving reasoning.

Highlights

• Dominant theories assume people form beliefs only under an illusion of objectivity.

• We document a boundary condition on this illusion: morally desirable biases.

• People endorse beliefs they regard as evidentially weak but morally desirable.

• People realize when they have just engaged in morally motivated reasoning.

• Accurate self-attributions of moral bias fully attenuate the ‘bias blind spot’.

From the General discussion

Our beliefs about our beliefs – including whether they are biased or justified – play a crucial role in guiding inquiry, shaping belief revision, and navigating disagreement. One line of research suggests that these judgments are almost universally characterized by an illusion of objectivity: people consciously reason with the goal of being objective and basing their beliefs on evidence, and because of this, people nearly always assume that their current beliefs meet those standards. Another line of work suggests that people sometimes think that values legitimately bear on whether someone is justified in holding a belief (Cusimano & Lombrozo, 2021b). These findings raise the possibility, consistent with some prior theoretical proposals (Cusimano & Lombrozo, 2021a; Tetlock, 2002), that people will knowingly violate norms of impartiality, or knowingly maintain beliefs that lack evidential support, when doing so advances what they consider to be morally laudable goals. Two predictions follow. First, people should evaluate their beliefs in part based on their perceived moral value. And second, in situations in which people engage in morally motivated reasoning, they should recognize that they have done so and should evaluate their morally motivated reasoning as appropriate. We document support for these predictions across four studies (Table 1).

Conclusion

A great deal of work has assumed that people treat objectivity and evidence-based reasoning as cardinal norms governing their belief formation. This assumption has grown increasingly tenuous in light of recent work highlighting the importance of moral concerns in almost all facets of life. Consistent with this recent work, we find evidence that people’s evaluations of the moral quality of a proposition predict their subjective confidence that it is true, their likelihood of claiming that they believe it and know it, and the extent to which they take their belief to be justified. Moreover, people exhibit metacognitive awareness of this fact and approve of morality’s influence on their reasoning. People often want to be right, but they also want to be good – and they know it.

Wednesday, July 20, 2022

Knowledge before belief

Phillips, J., Buckwalter, W., et al. (2021).
Behavioral and Brain Sciences, 44, E140.
doi:10.1017/S0140525X20000618

Abstract

Research on the capacity to understand others' minds has tended to focus on representations of beliefs, which are widely taken to be among the most central and basic theory of mind representations. Representations of knowledge, by contrast, have received comparatively little attention and have often been understood as depending on prior representations of belief. After all, how could one represent someone as knowing something if one does not even represent them as believing it? Drawing on a wide range of methods across cognitive science, we ask whether belief or knowledge is the more basic kind of representation. The evidence indicates that nonhuman primates attribute knowledge but not belief, that knowledge representations arise earlier in human development than belief representations, that the capacity to represent knowledge may remain intact in patient populations even when belief representation is disrupted, that knowledge (but not belief) attributions are likely automatic, and that explicit knowledge attributions are made more quickly than equivalent belief attributions. Critically, the theory of mind representations uncovered by these various methods exhibit a set of signature features clearly indicative of knowledge: they are not modality-specific, they are factive, they are not just true belief, and they allow for representations of egocentric ignorance. We argue that these signature features elucidate the primary function of knowledge representation: facilitating learning from others about the external world. This suggests a new way of understanding theory of mind – one that is focused on understanding others' minds in relation to the actual world, rather than independent from it.

From the last section

Learning from others, cultural evolution, and what is special about humans

A capacity for reliably learning from others is critically important not only within a single lifespan, but also across them—at the level of human societies. Indeed, this capacity to reliably learn from others has been argued to be essential for humans’ unique success in the accumulation and transmission of cultural knowledge (e.g., Henrich, 2015; Heyes, 2018). Perhaps unsurprisingly, the argument we’ve made about the primary role of knowledge representations in cognition fits nicely with this broad view of why humans have been so successful: this capacity for learning from others is likely supported by our comparatively basic theory of mind representations.

At the same time, this suggestion cuts against another common proposal about which ability underwrites the wide array of ways in which humans have been uniquely successful, namely their ability to represent others’ beliefs (Baron-Cohen, 1999; Call & Tomasello, 2008; Pagel, 2012; Povinelli & Preuss, 1995; Tomasello, 1999; Tomasello et al., 1993). While the ability to represent others’ beliefs may indeed turn out to be unique to humans and critically important for some purposes, it does not seem to underwrite humans’ capacity for the accumulation of cultural knowledge. After all, precisely at the time in human development when the vast majority of critical learning occurs (infancy and early childhood), we find robust evidence for a capacity for knowledge representation rather than belief representation (§4.2).

Thursday, September 2, 2021

Reconciling scientific and commonsense values to improve reasoning

Cusimano, C., & Lombrozo, T. (2021).
Trends in Cognitive Sciences
Available online July 2021

Abstract

Scientific reasoning is characterized by commitments to evidence and objectivity. New research suggests that under some conditions, people are prone to reject these commitments, and instead sanction motivated reasoning and bias. Moreover, people’s tendency to devalue scientific reasoning likely explains the emergence and persistence of many biased beliefs. However, recent work in epistemology has identified ways in which bias might be legitimately incorporated into belief formation. Researchers can leverage these insights to evaluate when commonsense affirmation of bias is justified and when it is unjustified and therefore a good target for intervention.

Highlights

• People espouse a ‘lay ethics of belief’ that defines standards for how beliefs should be evaluated and formed.

• People vary in the extent to which they endorse scientific norms of reasoning, such as evidentialism and impartiality, in their own norms of belief. In some cases, people sanction motivated or biased thinking.

• Variation in endorsement of scientific norms predicts belief accuracy, suggesting that interventions that target norms could lead to more accurate beliefs.

• Normative theories in epistemology vary in whether, and how, they regard reasoning and belief formation as legitimately impacted by moral or pragmatic considerations.

• Psychologists can leverage knowledge of people’s lay ethics of belief, and normative arguments about when and whether bias is appropriate, to develop interventions to improve reasoning that are both ethical and effective.

Concluding remarks

It is no secret that humans are biased reasoners. Recent work suggests that these departures from scientific reasoning are not simply the result of unconscious bias, but are also a consequence of endorsing norms for belief that place personal, moral, or social good above truth. The link between devaluing the ‘scientific ethos’ and holding biased beliefs suggests that, in some cases, interventions on the perceived value of scientific reasoning could lead to better reasoning and to better outcomes. In this spirit, we have offered a strategy for value debiasing.

Wednesday, May 20, 2020

People judge others to have more control over beliefs than they themselves do.

Cusimano, C., & Goodwin, G. (2020, April 3).
Journal of Personality and Social Psychology. Advance online publication.
https://doi.org/10.1037/pspa0000198

Abstract

People attribute considerable control to others over what those individuals believe. However, no work to date has investigated how people judge their own belief control, nor whether such judgments diverge from their judgments of others. We addressed this gap in seven studies and found that people judge others to be more able to voluntarily change what they believe than they themselves are. This occurs when people judge others who disagree with them (Study 1) as well as others who agree with them (Studies 2–5, 7), and it occurs when people judge strangers (Studies 1–2, 4–5) as well as close others (Studies 3, 7). It appears not to be explained by impression management or self-enhancement motives (Study 3). Rather, there is a discrepancy between the evidentiary constraints on belief change that people access via introspection, and their default assumptions about the ease of voluntary belief revision. That is, people spontaneously tend to think about the evidence that supports their beliefs, which leads them to judge their beliefs as outside their control. But they apparently fail to generalize this feeling of constraint to others, and similarly fail to incorporate it into their generic model of beliefs (Studies 4–7). We discuss the implications of our findings for theories of ideology-based conflict, actor-observer biases, naïve realism, and ongoing debates regarding people’s actual capacity to voluntarily change what they believe.

Conclusion

The present paper uncovers an important discrepancy in how people think about their own and others’ beliefs; namely, that people judge that others have a greater capacity to voluntarily change their beliefs than they themselves do. Put succinctly, when someone says, “You can choose to believe in God, or you can choose not to believe in God,” they may often mean that you can choose but they cannot. We have argued that this discrepancy derives from two distinct ways people reason about belief control: either by consulting their default theory of belief, or by introspecting and reporting what they feel when they consider voluntarily changing a belief. When people apply their default theory of belief, they judge that they and others have considerable control over what they believe. But, when people consider the possibility of trying to change a particular belief, they tend to report that they have less control. Because people do not have access to the experiences of others, they rely on their generic theory of beliefs when judging others’ control. Discrepant attributions of control for self and other emerge as a result. This may in turn have important downstream effects on people’s behavior during disagreements. More work is needed to explore these downstream effects, as well as to understand how much control people actually have over what they believe. Predictably, we find the results from these studies compelling, but admit that readers may believe whatever they please.

The research is here.

Thursday, February 27, 2020

The cultural evolution of prosocial religions

Norenzayan, A., et al. (2016).
Behavioral and Brain Sciences, 39, E1.
doi:10.1017/S0140525X14001356

Abstract

We develop a cultural evolutionary theory of the origins of prosocial religions and apply it to resolve two puzzles in human psychology and cultural history: (1) the rise of large-scale cooperation among strangers and, simultaneously, (2) the spread of prosocial religions in the last 10–12 millennia. We argue that these two developments were importantly linked and mutually energizing. We explain how a package of culturally evolved religious beliefs and practices characterized by increasingly potent, moralizing, supernatural agents, credible displays of faith, and other psychologically active elements conducive to social solidarity promoted high fertility rates and large-scale cooperation with co-religionists, often contributing to success in intergroup competition and conflict. In turn, prosocial religious beliefs and practices spread and aggregated as these successful groups expanded, or were copied by less successful groups. This synthesis is grounded in the idea that although religious beliefs and practices originally arose as nonadaptive by-products of innate cognitive functions, particular cultural variants were then selected for their prosocial effects in a long-term, cultural evolutionary process. This framework (1) reconciles key aspects of the adaptationist and by-product approaches to the origins of religion, (2) explains a variety of empirical observations that have not received adequate attention, and (3) generates novel predictions. Converging lines of evidence drawn from diverse disciplines provide empirical support while at the same time encouraging new research directions and opening up new questions for exploration and debate.

The paper is here.

Tuesday, November 13, 2018

Delusions and Three Myths of Irrational Belief

Bortolotti, L. (2018). Delusions and Three Myths of Irrational Belief.
In: Bortolotti, L. (ed.), Delusions in Context. Palgrave Macmillan, Cham.

Abstract

This chapter addresses the contribution that the delusion literature has made to the philosophy of belief. Three conclusions will be drawn: (1) a belief does not need to be epistemically rational to be used in the interpretation of behaviour; (2) a belief does not need to be epistemically rational to have significant psychological or epistemic benefits; (3) beliefs exhibiting the features of epistemic irrationality exemplified by delusions are not infrequent, and they are not an exception in a largely rational belief system. What we learn from the delusion literature is that there are complex relationships between rationality and interpretation, rationality and success, and rationality and knowledge.

The chapter is here.

Here is a portion of the Conclusion:

Second, it is not obvious that epistemically irrational beliefs should be corrected, challenged, or regarded as a glitch in an otherwise rational belief system. The whole attitude towards such beliefs should change. We all have many epistemically irrational beliefs, and they are not always a sign that we lack credibility or that we are mentally unwell. Rather, they are predictable features of human cognition (Puddifoot and Bortolotti, 2018). We are not unbiased in the way we weigh up evidence, and we tend to be conservative once we have adopted a belief, making it hard for new contrary evidence to unsettle our existing convictions. Some delusions are just a vivid illustration of a general tendency that is widely shared and hard to counteract. Delusions, just like more common epistemically irrational beliefs, may be a significant obstacle to the achievement of our goals and may cause a rift between our way of seeing the world and other people’s. That is why it is important to develop a critical attitude towards their content.

Tuesday, September 4, 2018

Belief in God: Why People Believe, and Why They Don’t

Brett Mercier, Stephanie R. Kramer, & Azim F. Shariff
Current Directions in Psychological Science
First Published July 31, 2018

Abstract

Belief in a god or gods is a central feature in the lives of billions of people and a topic of perennial interest within psychology. However, research over the past half decade has achieved a new level of understanding regarding both the ultimate and proximate causes of belief in God. Ultimate causes—the evolutionary influences on a trait—shed light on the adaptive value of belief in God and the reasons why a tendency toward this belief exists in humans. Proximate causes—the immediate influences on the expression of a trait—explain variation and changes in belief. We review this research and discuss remaining barriers to a fuller understanding of belief in God.

The article is here.


Friday, December 25, 2015

Scientific Faith Is Different From Religious Faith

By Paul Bloom
The Atlantic
Originally published November 24, 2015

Here is an excerpt:

It’s better to get a cancer diagnosis from a radiologist than from a Ouija board. It’s better to learn about the age of the universe from an astrophysicist than from a rabbi. The New England Journal of Medicine is a more reliable source about vaccines than the actress Jenny McCarthy. These preferences are not ideological. We’re not talking about Fox News versus The Nation. They are rational, because the methods of science are demonstrably superior at getting at truths about the natural world.

I don’t want to fetishize science. Sociologists and philosophers deserve a lot of credit for reminding us that scientific practice is permeated by groupthink, bias, and financial, political, and personal motivations. The physicist Richard Feynman once wrote that the essence of science was “bending over backwards to prove ourselves wrong.” But he was talking about the collective cultural activity of science, not scientists as individuals, most of whom prefer to be proven right, and who are highly biased to see the evidence in whatever light most favors their preferred theory.

The entire article is here.

Friday, December 18, 2015

The Centrality of Belief and Reflection in Knobe Effect Cases: A Unified Account of the Data

By Mark Alfano, James R. Beebe, and Brian Robinson
The Monist
April 2012

Abstract

Recent work in experimental philosophy has shown that people are more likely to attribute intentionality, knowledge, and other psychological properties to someone who causes a bad side-effect than to someone who causes a good one. We argue that all of these asymmetries can be explained in terms of a single underlying asymmetry involving belief attribution, because the belief that one’s action would result in a certain side-effect is a necessary component of each of the psychological attitudes in question. We argue further that this belief-attribution asymmetry is rational because it mirrors a belief-formation asymmetry, and that the belief-formation asymmetry is also rational because it is more useful to form some beliefs than others.

Friday, December 11, 2015

Why do we intuitively believe we have free will?

By Tom Stafford
BBC.com
Originally published 7 August 2015

It is perhaps the most famous experiment in neuroscience. In 1983, Benjamin Libet sparked controversy with his demonstration that our sense of free will may be an illusion, a controversy that has only increased ever since.

Libet’s experiment has three vital components: a choice, a measure of brain activity and a clock.

The choice is to move either your left or right arm. In the original version of the experiment this is done by flicking your wrist; in some versions it is to raise your left or right finger. Libet’s participants were instructed to “let the urge [to move] appear on its own at any time without any pre-planning or concentration on when to act”. The precise time at which you move is recorded from the muscles of your arm.

The article is here.

Tuesday, September 15, 2015

Explanatory Judgment, Moral Offense and Value-Free Science

By Matteo Colombo, Leandra Bucher, & Yoel Inbar
Review of Philosophy and Psychology
August 2015

Abstract

A popular view in philosophy of science contends that scientific reasoning is objective to the extent that the appraisal of scientific hypotheses is not influenced by moral, political, economic, or social values, but only by the available evidence. A large body of results in the psychology of motivated reasoning has put pressure on the empirical adequacy of this view. The present study extends this body of results by providing direct evidence that the moral offensiveness of a scientific hypothesis biases explanatory judgment along several dimensions, even when prior credence in the hypothesis is controlled for. Furthermore, it is shown that this bias is insensitive to an economic incentive to be accurate in the evaluation of the evidence. These results help call into question the attainability of the ideal of a value-free science.

The entire article is here.

Wednesday, August 5, 2015

Daniel Kahneman: ‘What would I eliminate if I had a magic wand? Overconfidence’

The psychologist and bestselling author of Thinking, Fast and Slow reveals his new research and talks about prejudice, fleeing the Nazis, and how to hold an effective meeting

By David Shariatmadari
The Guardian
Originally posted on July 18, 2015

Here is an excerpt:

What’s fascinating is that Kahneman’s work explicitly swims against the current of human thought. Not even he believes that the various flaws that bedevil decision-making can be successfully corrected. The most damaging of these is overconfidence: the kind of optimism that leads governments to believe that wars are quickly winnable and capital projects will come in on budget despite statistics predicting exactly the opposite. It is the bias he says he would most like to eliminate if he had a magic wand. But it “is built so deeply into the structure of the mind that you couldn’t change it without changing many other things”.

The entire article is here.

Tuesday, December 30, 2014

The Dark Side of Free Will

Published on Dec 9, 2014

This talk was given at a local TEDx event, produced independently of the TED Conferences. What would happen if we all believed free will didn’t exist? As a free will skeptic, Dr. Gregg Caruso contends that our society would be better off believing there is no such thing as free will.

Tuesday, December 2, 2014

Attributions to God and Satan About Life-Altering Events.

Ray, Shanna D.; Lockman, Jennifer D.; Jones, Emily J.; Kelly, Melanie H.
Psychology of Religion and Spirituality, Sep 22, 2014. Advance online publication. http://dx.doi.org/10.1037/a0037884

Abstract

When faced with negative life events, people often interpret the events by attributing them to the actions of God or Satan (Lupfer, Tolliver, & Jackson, 1996; Ritzema, 1979). To explore these attributions, we conducted a mixed-method study of Christians who were college freshmen. Participants read vignettes depicting a negative life event that had a beginning and an end that was systematically varied. Participants assigned a larger role to God in vignettes where an initially negative event (e.g., relationship breakup) led to a positive long-term outcome (e.g., meeting someone better) than when the long-term outcome was negative (e.g., depression and loneliness) or unspecified. Participants attributed a lesser role to Satan when there was a positive outcome rather than a negative or unspecified one. Participants also provided their own narratives, recounting personal experiences that they attributed to the actions of God or Satan. Participant-supplied narratives often demonstrated “theories” about the actions of God, depicting God as being involved in negative events as a rescuer, comforter, or one who brings positive out of the negative. Satan-related narratives were often lacking in detail or a clear theory of how Satan worked. Participants who did provide this information depicted Satan as acting primarily through influencing one’s thoughts and/or using other people to encourage one’s negative behavior.

The entire article is here.

Thursday, November 6, 2014

Does Everything Happen for a Reason?

By Konika Banerjee and Paul Bloom
The New York Times Sunday Review
Originally posted on October 17, 2014

Here is an excerpt:

This tendency to see meaning in life events seems to reflect a more general aspect of human nature: our powerful drive to reason in psychological terms, to make sense of events and situations by appealing to goals, desires and intentions. This drive serves us well when we think about the actions of other people, who actually possess these psychological states, because it helps us figure out why people behave as they do and to respond appropriately. But it can lead us into error when we overextend it, causing us to infer psychological states even when none exist. This fosters the illusion that the world itself is full of purpose and design.

The entire article is here.

Friday, June 27, 2014

Psychology Can Make the Country Healthier

Insights can improve public health campaigns — and keep them from backfiring

By Crystal Hoyt and Jeni Burnette
Scientific American
Originally published June 10, 2014

Public health communications are designed to tackle significant medical issues such as obesity, AIDS, and cancer. For example, what message can best combat the growing obesity epidemic? Are educational messages effective at increasing condom use? Should cancer prevention messages stress the health risks of too much sun exposure? These are not just medical problems. These are fundamentally questions about perception, beliefs, and behavior. Psychologists bring a unique expertise to these questions and are finding consequential, and often non-intuitive, answers.

The entire article is here.