Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Rational.

Sunday, October 18, 2020

Beliefs have a social purpose. Does this explain delusions?

Anna Greenburgh
psyche.co
Originally published 

Here is an excerpt:

Of course, just because a delusion has logical roots doesn’t mean it’s helpful for the person once it takes hold. Indeed, this is why delusions are an important clinical issue. Delusions are often conceptualised as sitting at the extreme end of a continuum of belief, but how can they be distinguished from other beliefs? If not irrationality, then what demarcates a delusion?

Delusions are fixed, unchanging in the face of contrary evidence, and not shared by the person’s peers. In light of the social function of beliefs, these preconditions have added significance. The coalitional model underlines that beliefs arising from adaptive cognitive processes should show some sensitivity to social context and enable successful social coordination. Delusions lack this social function and adaptability. Clinical psychologists have documented the fixity of delusional beliefs: they are more resistant to change than other types of belief, and are intensely preoccupying, regardless of the social context or interpersonal consequences. In both ‘The Yellow Wallpaper’ and the novel Don Quixote (1605-15) by Miguel de Cervantes, the protagonists’ beliefs about their surroundings are unchangeable and, if anything, become increasingly intense and disruptive. It is this inflexibility to social context, once they take hold, that sets delusions apart from other beliefs.

Across the field of mental health, research showing the importance of the social environment has spurred a great shift in the way that clinicians interact with patients. For example, research exposing the link between trauma and psychosis has resulted in more compassionate, person-centred approaches. The coalitional model of delusions can now contribute to this movement. It opens up promising new avenues of research, which integrate our fundamental social nature and the social function of belief formation. It can also deepen how people experiencing delusions are understood – instead of contributing to stigma by dismissing delusions as irrational, it considers the social conditions that gave rise to such intensely distressing beliefs.

Tuesday, February 19, 2019

How Our Attitude Influences Our Sense Of Morality

Konrad Bocian
Science Trends
Originally posted January 18, 2019

Here is an excerpt:

People think that their moral judgment is as rational and objective as scientific statements, but science does not confirm that belief. Within the last two decades, scholars interested in moral psychology have discovered that people produce moral judgments based on fast, automatic intuitions rather than rational, controlled reasoning. For example, moral cognition research showed that moral judgments arise in approximately 250 milliseconds, and even then we are not able to explain them. Developmental psychologists have shown that babies as young as three months, who have no language skills yet, can distinguish a good protagonist (a helping one) from a bad one (a hindering one). But this does not mean that people’s moral judgments are based solely on intuitions. We can use deliberative processes when conditions are favorable – when we are both motivated to engage in conscious reasoning and capable of it.

When we imagine how we would morally judge other people in a specific situation, we refer to actual rules and norms. If the rules are violated, the act itself is immoral. But we forget that intuitive reasoning also plays a role in forming a moral judgment. It is easy to condemn the librarian when our interest is involved only on paper, but the whole picture changes when real money is on the table. We have known this for a very long time, but we still forget it when we predict our own moral judgments.

Based on previous research on the intuitive nature of moral judgment, we decided to test how far our attitudes can impact our perception of morality. In our daily life, we meet a lot of people who are to some degree familiar, and we either have a positive or negative attitude toward these people.


Wednesday, February 13, 2019

The Art of Decision-Making

Joshua Rothman
The New Yorker
Originally published January 21, 2019

Here is an excerpt:

For centuries, philosophers have tried to understand how we make decisions and, by extension, what makes any given decision sound or unsound, rational or irrational. “Decision theory,” the destination on which they’ve converged, has tended to hold that sound decisions flow from values. Faced with a choice—should we major in economics or in art history?—we first ask ourselves what we value, then seek to maximize that value.

From this perspective, a decision is essentially a value-maximizing equation. If you’re going out and can’t decide whether to take an umbrella, you could come to a decision by following a formula that assigns weights to the probability of rain, the pleasure you’ll feel in strolling unencumbered, and the displeasure you’ll feel if you get wet. Most decisions are more complex than this, but the promise of decision theory is that there’s a formula for everything, from launching a raid in Abbottabad to digging an oil well in the North Sea. Plug in your values, and the right choice pops out.
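The value-maximizing view described above can be made concrete with a minimal sketch of the umbrella decision. All numbers here are illustrative assumptions, not figures from the article: each option's expected value is just the probability-weighted sum of the utilities of its possible outcomes.

```python
# A sketch of decision theory's "value-maximizing equation" for the
# umbrella example. The probability and utilities are made-up inputs.

P_RAIN = 0.3  # assumed probability of rain

# Hypothetical utilities for each (choice, weather) outcome.
utility = {
    ("umbrella", "rain"): 5,       # stayed dry, but encumbered
    ("umbrella", "no_rain"): -1,   # carried it for nothing
    ("no_umbrella", "rain"): -10,  # got wet
    ("no_umbrella", "no_rain"): 3, # strolled unencumbered
}

def expected_value(choice: str) -> float:
    """Probability-weighted utility of a choice across weather outcomes."""
    return (P_RAIN * utility[(choice, "rain")]
            + (1 - P_RAIN) * utility[(choice, "no_rain")])

# "Plug in your values, and the right choice pops out":
best = max(["umbrella", "no_umbrella"], key=expected_value)
# → "umbrella" with these example numbers
```

Changing `P_RAIN` or any utility changes the recommendation, which is exactly the philosophers' point: the formula is only as sound as the values fed into it.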

In recent decades, some philosophers have grown dissatisfied with decision theory. They point out that it becomes less useful when we’re unsure what we care about, or when we anticipate that what we care about might shift.


Friday, November 17, 2017

Going with your gut may mean harsher moral judgments

Jeff Sossamon
www.futurity.org
Originally posted November 2, 2017

Going with your intuition could make you judge others’ moral transgressions more harshly and keep you from changing your mind, even after considering all the facts, a new study suggests.

The findings show that people who strongly rely on intuition automatically condemn actions they perceive to be morally wrong, even if there is no actual harm.

In psychology, intuition, or “gut instinct,” is defined as the ability to understand something immediately, without the need for reasoning.

“It is now widely acknowledged that intuitive processing influences moral judgment,” says Sarah Ward, a doctoral candidate in social and personality psychology at the University of Missouri.

“We thought people who were more likely to trust their intuition would be more likely to condemn things that are shocking, whereas people who don’t rely on gut feelings would not condemn these same actions as strongly,” Ward says.

Ward and Laura King, professor of psychological sciences, had study participants read through a series of scenarios and judge whether the action was wrong, such as an individual giving a partner a gift that had previously been purchased for an ex.


Wednesday, December 7, 2016

Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue

Tomas Ståhl, Maarten P. Zaal, Linda J. Skitka
PLOS One
Published: November 16, 2016

Abstract

In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. Eight studies are reported in which an instrument to measure individual differences in moralized rationality is validated. Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1–3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5). We further demonstrate that the MRS predicts morally laden reactions, such as a desire for punishment, of people who rely on irrational (vs. rational) ways of forming and evaluating beliefs (Studies 6 and 7). Finally, we show that the MRS uniquely predicts motivation to contribute to a charity that works to prevent the spread of irrational beliefs (Study 8). We conclude that (1) there are stable individual differences in the extent to which people moralize a reliance on rationality in the formation and evaluation of beliefs, (2) that these individual differences do not reduce to the personal importance attached to rationality, and (3) that individual differences in moralized rationality have important motivational and interpersonal consequences.

Friday, October 7, 2016

The Difference Between Rationality and Intelligence

By David Hambrick and Alexander Burgoyne
The New York Times
Originally published September 16, 2016

Here is an excerpt:

Professor Morewedge and colleagues found that the computer training led to statistically large and enduring decreases in decision-making bias. In other words, the subjects were considerably less biased after training, even after two months. The decreases were larger for the subjects who received the computer training than for those who received the video training (though decreases were also sizable for the latter group). While there is scant evidence that any sort of “brain training” has any real-world impact on intelligence, it may well be possible to train people to be more rational in their decision making.
