Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Rationality.

Friday, April 28, 2017

How rational is our rationality?

Interview by Richard Marshall
3:AM Magazine
Originally posted March 18, 2017

Here is an excerpt:

As I mentioned earlier, I think that the point of the study of rationality, and of normative epistemology more generally, is to help us figure out how to inquire, and the aim of inquiry, I believe, is to get at the truth. This means that there had better be a close connection between what we conclude about what’s rational to believe, and what we expect to be true. But it turns out to be very tricky to say what the nature of this connection is! For example, we know that sometimes evidence can mislead us, and so rational beliefs can be false. This means that there’s no guarantee that rational beliefs will be true. The goal of the paper is to get clear about why, and to what extent, it nonetheless makes sense to expect that rational beliefs will be more accurate than irrational ones. One reason this should be of interest to non-philosophers is that if it turns out that there isn’t some close connection between rationality and truth, then we should be much less critical of people with irrational beliefs. They may reasonably say: “Sure, my belief is irrational – but I care about the truth, and since my irrational belief is true, I won’t abandon it!” It seems like there’s something wrong with this stance, but to justify why it’s wrong, we need to get clear on the connection between a judgment about a belief’s rationality and a judgment about its truth. The account I give is difficult to summarize in just a few sentences, but I can say this much: what we say about the connection between what’s rational and what’s true will depend on whether we think it’s rational to doubt our own rationality. If it can be rational to doubt our own rationality (which I think is plausible), then the connection between rationality and truth is, in a sense, surprisingly tenuous.

The interview is here.
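A toy Bayesian calculation makes the "evidence can mislead" point concrete. The numbers below are my own illustration, not from the interview: a rare condition, a fairly accurate test, and an update that is perfectly rational yet false for some of the people who make it.

```python
# Illustrative only: rational updating on misleading evidence.
prior = 0.01          # P(condition): the condition is rare
sensitivity = 0.95    # P(positive | condition)
false_pos = 0.05      # P(positive | no condition)

# Bayes' rule: P(condition | positive)
p_positive = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / p_positive
print(f"P(condition | positive) = {posterior:.3f}")   # ~0.161

# Even after a positive test, the rational belief is still
# "probably no condition" (0.161 < 0.5). For the 1-in-100 who do
# have the condition, that rational belief is false: rationality
# tracks the evidence, not the truth of each individual case.
```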

Tuesday, April 4, 2017

Illusions in Reasoning

Sangeet S. Khemlani & P. N. Johnson-Laird
Minds & Machines
DOI 10.1007/s11023-017-9421-x

Abstract

Some philosophers argue that the principles of human reasoning are impeccable, and that mistakes are no more than momentary lapses in "information processing." This article makes a case to the contrary. It shows that human reasoners commit systematic fallacies. The theory of mental models predicts these errors. It postulates that individuals construct mental models of the possibilities to which the premises of an inference refer. But their models usually represent what is true in a possibility, not what is false. This procedure reduces the load on working memory, and for the most part it yields valid inferences. However, as a computer program implementing the theory revealed, it leads to fallacious conclusions for certain inferences—those for which it is crucial to represent what is false in a possibility. Experiments demonstrate the variety of these fallacies and contrast them with control problems, which reasoners tend to get right. The fallacies can be compelling illusions, and they occur in reasoning based on sentential connectives such as "if" and "or", quantifiers such as "all the artists" and "some of the artists", deontic relations such as "permitted" and "obligated", and causal relations such as "causes" and "allows". After we have reviewed the principal results, we consider the potential for alternative accounts to explain these illusory inferences, and we show how the illusions illuminate the nature of human rationality.

Find it here.
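The authors' computer program is not reproduced here, but the core prediction is easy to sketch against a classic illusion from this literature: "If there is a king then there is an ace, or else if there is not a king then there is an ace. There is a king." Most people conclude there is an ace; exhaustive evaluation shows the opposite. Below is a minimal Python illustration of both routes (my own sketch, not the authors' implementation):

```python
from itertools import product

def implies(p, q):
    return (not p) or q

# Premise 1 (exclusive "or else"): exactly one conditional is true:
#   (king -> ace)  XOR  (not-king -> ace)
# Premise 2: there is a king.
def premises_hold(king, ace):
    return (implies(king, ace) != implies(not king, ace)) and king

# Route 1: exhaustive evaluation, tracking what is false as well
# as what is true in each possibility.
consistent = [(king, ace)
              for king, ace in product([True, False], repeat=2)
              if premises_hold(king, ace)]
print(consistent)   # [(True, False)] -- a king and NO ace

# Route 2: mental models represent only what is TRUE in each
# conditional's possibility, then keep the models compatible with
# "there is a king".
mental_models = [{"king": True, "ace": True},    # from conditional 1
                 {"king": False, "ace": True}]   # from conditional 2
kept = [m for m in mental_models if m["king"]]
print(kept)   # [{'king': True, 'ace': True}] -- "there is an ace"
```

The exhaustive route shows the valid conclusion is "there is not an ace"; the mental-model shortcut yields the compelling but invalid "there is an ace" that most reasoners draw.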

Wednesday, February 22, 2017

Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue

Ståhl T, Zaal MP, Skitka LJ (2016)
PLoS ONE 11(11): e0166332. doi:10.1371/journal.pone.0166332

Abstract

In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. Eight studies are reported in which an instrument to measure individual differences in moralized rationality is validated. Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1–3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5). We further demonstrate that the MRS predicts morally laden reactions, such as a desire for punishment, of people who rely on irrational (vs. rational) ways of forming and evaluating beliefs (Studies 6 and 7). Finally, we show that the MRS uniquely predicts motivation to contribute to a charity that works to prevent the spread of irrational beliefs (Study 8). We conclude that (1) there are stable individual differences in the extent to which people moralize a reliance on rationality in the formation and evaluation of beliefs, (2) that these individual differences do not reduce to the personal importance attached to rationality, and (3) that individual differences in moralized rationality have important motivational and interpersonal consequences.

The article is here.
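For readers unfamiliar with how a claim like "internally consistent" is cashed out, the standard statistic is Cronbach's alpha, computed from the item variances and the variance of the total score. A minimal sketch with fabricated responses (not the actual MRS items or data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 4-item Likert scale (1-7) driven by one shared trait;
# purely illustrative, not MRS data.
rng = np.random.default_rng(0)
trait = rng.normal(4, 1, size=200)
items = np.clip(trait[:, None] + rng.normal(0, 0.8, (200, 4)), 1, 7)
print(f"alpha = {cronbach_alpha(items):.2f}")   # high: items cohere
```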

Saturday, January 7, 2017

The Irrationality Within Us

By Elly Vintiadis
Scientific American blog
Originally published on December 12, 2016

We like to think of ourselves as special because we can reason and we like to think that this ability expresses the essence of what it is to be human. In many ways this belief has formed our civilization; throughout history, we have used supposed differences in rationality to justify moral and political distinctions between different races, genders, and species, as well as between “healthy” and “diseased” individuals. Even to this day, people often associate mental disorder with irrationality and this has very real effects on people living with mental disorders.

But are we really that rational? And is rationality really what distinguishes people who live with mental illness from those who do not? It seems not. After decades of research, there is compelling evidence that we are not as rational as we think we are and that, rather than irrationality being the exception, it is part of who we normally are.

So what does it mean to be rational? We usually distinguish between two kinds of rationality: epistemic rationality, which is involved in acquiring true beliefs about the world and which sets the standard for what we ought to believe, and instrumental rationality, which is involved in decision-making and behavior and is the standard for how we ought to act.

The article is here.
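The two standards can be made concrete. Instrumental rationality is usually modeled as expected-utility maximization: given your beliefs, choose the action that best serves your goals. A minimal sketch with an invented decision problem:

```python
# Instrumental rationality as expected-utility maximization.
# The probabilities and payoffs below are illustrative only.
p_rain = 0.3
utilities = {
    "take umbrella":  {"rain": 5,   "dry": 3},   # mild nuisance
    "leave umbrella": {"rain": -10, "dry": 5},   # soaked if wrong
}

def expected_utility(action: str) -> float:
    u = utilities[action]
    return p_rain * u["rain"] + (1 - p_rain) * u["dry"]

best = max(utilities, key=expected_utility)
print(best)   # 'take umbrella' (EU 3.6 vs 0.5)

# Epistemic rationality governs whether the belief p_rain = 0.3
# fits the evidence; instrumental rationality governs the choice
# given that belief.
```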

Tuesday, November 8, 2016

The Illusion of Moral Superiority

Ben M. Tappin and Ryan T. McKay
Social Psychological and Personality Science
2016, 1-9

Abstract

Most people strongly believe they are just, virtuous, and moral; yet regard the average person as distinctly less so. This invites accusations of irrationality in moral judgment and perception—but direct evidence of irrationality is absent. Here, we quantify this irrationality and compare it against the irrationality in other domains of positive self-evaluation. Participants (N = 270) judged themselves and the average person on traits reflecting the core dimensions of social perception: morality, agency, and sociability. Adapting new methods, we reveal that virtually all individuals irrationally inflated their moral qualities, and the absolute and relative magnitude of this irrationality was greater than that in the other domains of positive self-evaluation. Inconsistent with prevailing theories of overly positive self-belief, irrational moral superiority was not associated with self-esteem. Taken together, these findings suggest that moral superiority is a uniquely strong and prevalent form of "positive illusion," but the underlying function remains unknown.

The article is here.

Thursday, October 20, 2016

Cognitive biases can affect moral intuitions about cognitive enhancement

Lucius Caviola, Adriano Mannino, Julian Savulescu and Nadira Faulmüller
Frontiers in Systems Neuroscience. 2014; 8: 195.
Published online 2014 Oct 15.

Abstract

Research into cognitive biases that impair human judgment has mostly been applied to the area of economic decision-making. Ethical decision-making has been comparatively neglected. Since ethical decisions often involve very high individual as well as collective stakes, analyzing how cognitive biases affect them can be expected to yield important results. In this theoretical article, we consider the ethical debate about cognitive enhancement (CE) and suggest a number of cognitive biases that are likely to affect moral intuitions and judgments about CE: status quo bias, loss aversion, risk aversion, omission bias, scope insensitivity, nature bias, and optimistic bias. We find that there are more well-documented biases that are likely to cause irrational aversion to CE than biases in the opposite direction. This suggests that common attitudes about CE are predominantly negatively biased. Within this new perspective, we hope that subsequent research will be able to elaborate this hypothesis and develop effective de-biasing techniques that can help increase the rationality of the public CE debate and thus improve our ethical decision-making.

The article is here.

Friday, October 2, 2015

You're not irrational, you're just quantum probabilistic

Science Daily
Originally posted September 15, 2015

Here is an excerpt:

Their work suggests that thinking in a quantum-like way--essentially not following a conventional approach based on classical probability theory--enables humans to make important decisions in the face of uncertainty, and lets us confront complex questions despite our limited mental resources.

When researchers try to study human behavior using only classical mathematical models of rationality, some aspects of human behavior do not compute. From the classical point of view, those behaviors seem irrational, Wang explained.

For instance, scientists have long known that the order in which questions are asked on a survey can change how people respond--an effect previously thought to be due to vaguely labeled effects, such as "carry-over effects" and "anchoring and adjustment," or noise in the data. Survey organizations normally change the order of questions between respondents, hoping to cancel out this effect. But in the Proceedings of the National Academy of Sciences last year, Wang and collaborators demonstrated that the effect can be precisely predicted and explained by a quantum-like aspect of people's behavior.

The entire article is here.
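The mechanism is easy to demonstrate: model two survey questions as projections that do not commute, and the probability of a "yes, yes" answer pair depends on which question comes first. The sketch below shows the general quantum effect with arbitrary angles; it is not Wang's fitted model:

```python
import numpy as np

def projector(theta: float) -> np.ndarray:
    """Projector onto the 'yes' direction at angle theta."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])     # initial belief state (illustrative)
A = projector(np.pi / 5)       # question A's 'yes' subspace
B = projector(np.pi / 3)       # question B's 'yes' subspace

def p_yes_yes(first, second, state):
    """P(yes to first, then yes to second): project, renormalize
    (answering the first question updates the state), project again."""
    s1 = first @ state
    p1 = s1 @ s1
    s2 = second @ (s1 / np.sqrt(p1))
    return p1 * (s2 @ s2)

print("P(A yes, then B yes):", p_yes_yes(A, B, psi))   # ~0.546
print("P(B yes, then A yes):", p_yes_yes(B, A, psi))   # ~0.209

# A classical joint probability P(A and B) could not depend on
# order; non-commuting projections reproduce the order effect.
```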

Saturday, November 10, 2012

Caregiving as moral experience

By Arthur Kleinman
The Lancet
Volume 380, Issue 9853, Pages 1550-1551
3 November 2012
doi:10.1016/S0140-6736(12)61870-4


Everyone who has been in love or built a family knows that there are things, essential things, that money can't buy. Patients with serious illness and their network of caregivers know this too, because those things that really matter to us are threatened and must be defended. And many clinicians, reflecting on what is at stake in health care not only for patients but for themselves, know the same thing: the market has an important role in health-care financing and health systems reform, but it should not reach into those quintessentials of caregiving that speak to what is most deeply human in medicine and in living. This is the moral limit of an economic paradigm. Or at least it should be.

But we live in a truly confused age. The market model seems to have infiltrated so thoroughly into human lives and medicine that in certain circles—policy making and analysis, hospital and clinic administration, and even clinical work—economic rationality with its imperative of containing costs and maximising efficiency has come to mute the moral, emotional, religious, and aesthetic expressions of patients and caregivers. Most take it for granted and accept its implications. Models from economic psychology, behavioural economics, and business studies, based on the narrowest calculations of what a “rational” person would choose as most cost-effective, are now routinely applied to clinical decision making and the organisation of care.

(cut)

The great failure of contemporary medicine to promote caregiving as an existential practice and moral vision that resists reduction to the market model or the clarion call of efficiency has diminished professionals, patients, and family caregivers alike. It has enabled a noisy and ubiquitous market to all but silence different motives, ideals, hopes, and behaviours that must be expressed, because they are as much who we are as economic rationality.

The entire piece is here.



Monday, October 1, 2012

Spontaneous giving and calculated greed


D. G. Rand, J. D. Greene & M. A. Nowak
Nature 489, pp 427-430 – doi:10.1038/nature11467

Abstract

Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring ‘rational’ self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.


Here is a portion of a review of this article:

The researchers wanted to know whether people's first impulse is cooperative or selfish. To find out, they started by looking at how quickly different people made their choices, and found that faster deciders were more likely to contribute to the common good. 

Next they forced people to go fast or to stop and think, and found the same thing: Faster deciders tended to be more cooperative, and the people who had to stop and think gave less.

Finally, the researchers tested their hypothesis by manipulating people's mindsets. They asked some people to think about the benefits of intuition before choosing how much to contribute. Others were asked to think about the virtues of careful reasoning. Once again, intuition promoted cooperation, and deliberation did the opposite.

While some might interpret the results as suggesting that cooperation is "innate" or "hard-wired," if anything they highlight the role of experience. People who had better opinions of those around them in everyday life showed more cooperative impulses in these experiments, and previous experience with these kinds of studies eroded those impulses.
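The games behind these results include standard one-shot public goods games, where the conflict between intuition and reflection has an exact arithmetic form: pooled contributions are multiplied by r and split among n players, so each contributed unit returns only r/n < 1 to the contributor. A minimal sketch of that payoff structure (parameters are illustrative, not the paper's exact stakes):

```python
# One-shot public goods game: contributions are pooled, multiplied
# by r, and split equally among n players.
def payoffs(contributions, endowment=10.0, r=2.0):
    n = len(contributions)
    share = r * sum(contributions) / n
    return [endowment - c + share for c in contributions]

print(payoffs([10, 10, 10, 10]))   # [20.0, 20.0, 20.0, 20.0]
print(payoffs([10, 10, 10, 0]))    # [15.0, 15.0, 15.0, 25.0]

# With r/n = 0.5, each unit you keep nets you 0.5 more than one
# you contribute -- reflective self-interest says give nothing,
# though the group earns most when everyone gives. This is the
# wedge that, per the paper, deliberation widens and intuition
# (trained by everyday repeated interactions) ignores.
```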