Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Heuristics.

Thursday, March 12, 2015

Cognitive biases can affect moral intuitions about cognitive enhancement

By Lucius Caviola, Adriano Mannino, Julian Savulescu, Nadira Faulmüller
Front. Syst. Neurosci., 15 October 2014 | doi: 10.3389/fnsys.2014.00195

Research into cognitive biases that impair human judgment has mostly been applied to the area of economic decision-making. Ethical decision-making has been comparatively neglected. Since ethical decisions often involve very high individual as well as collective stakes, analyzing how cognitive biases affect them can be expected to yield important results. In this theoretical article, we consider the ethical debate about cognitive enhancement (CE) and suggest a number of cognitive biases that are likely to affect moral intuitions and judgments about CE: status quo bias, loss aversion, risk aversion, omission bias, scope insensitivity, nature bias, and optimistic bias. We find that there are more well-documented biases that are likely to cause irrational aversion to CE than biases in the opposite direction. This suggests that common attitudes about CE are predominantly negatively biased. Within this new perspective, we hope that subsequent research will be able to elaborate this hypothesis and develop effective de-biasing techniques that can help increase the rationality of the public CE debate and thus improve our ethical decision-making.

The entire article is here.

Sunday, December 21, 2014

"End-of-life" biases in moral evaluations of others

By George E. Newman, Kristi L. Lockhart, Frank C. Keil
Cognition, in press

Abstract

When evaluating the moral character of others, people show a strong bias to more heavily weigh behaviors at the end of an individual’s life, even if those behaviors arise in light of an overwhelmingly longer duration of contradictory behavior. Across four experiments, we find that this "end-of-life" bias uniquely applies to intentional changes in behavior that immediately precede death, and appears to result from the inference that the behavioral change reflects the emergence of the individual’s "true self".

The entire article is here.

Wednesday, December 10, 2014

"How Do You Change People's Minds About What Is Right And Wrong?"

By David Rand
Edge Video
Originally posted November 18, 2014

I'm a professor of psychology, economics and management at Yale. The thing that I'm interested in, and that I spend pretty much all of my time thinking about, is cooperation—situations where people have the chance to help others at a cost to themselves. The questions that I'm interested in are how do we explain the fact that, by and large, people are quite cooperative, and even more importantly, what can we do to get people to be more cooperative, to be more willing to make sacrifices for the collective good?

There's been a lot of work on cooperation in different fields, and certain basic themes have emerged, what you might call mechanisms for promoting cooperation: ways that you can structure interactions so that people learn to cooperate. In general, if you imagine that most people in a group are doing the cooperative thing, paying costs to help the group as a whole, but there's some subset that's decided "Oh, we don't feel like it; we're just going to look out for ourselves," the selfish people will be better off. Then, either through an evolutionary process or an imitation process, that selfish behavior will spread.

The entire video and transcript are here.

Tuesday, December 9, 2014

Cognitive enhancement, legalising opium, and cognitive biases

By Joao Fabiano
Practical Ethics Blog
Originally published November 18, 2014

Suppose you want to enhance your cognition. A scientist hands you two drugs. Drug X has at least 19 controlled studies on healthy individuals showing it is effective, and while a handful of studies report a slight increase in blood pressure, another dozen conclude it is safe and non-addictive. Drug Y is also effective, but it increases mortality, has addiction potential, and causes withdrawal symptoms. Which one do you choose? Great. Before you reach for Drug X, the scientist warns you, “I should add, however, that Drug Y has been used by certain primitive communities for centuries, while Drug X has not.” Which one do you choose? Should this information have any bearing on your choice? I don’t think so. You probably conclude that primitive societies do all sorts of crazy things and you would be better off with actual, double-blind, controlled studies.

The entire blog post is here.

Thursday, October 16, 2014

It’s All for Your Own Good

By Jeremy Waldron
The New York Review of Books
Originally published on October 9, 2014

Here is an excerpt:

Nudging is an attractive strategy. People are faced with choices all the time, from products to pensions, from vacations to voting, from requests for charity to ordering meals in a restaurant, and many of these choices have to be made quickly or life would be overwhelming. For most cases the sensible thing is not to agonize but to use a rule of thumb—a heuristic is the technical term—to make the decision quickly. “If it ain’t broke don’t fix it,” “Choose a round number,” “Always order the special,” and “Vote the party line” are all heuristics. But the ones people use are good for some decisions and not others, and they have evolved over a series of past situations that may or may not resemble the important choices people currently face.

The entire article is here.

Sunday, August 31, 2014

Fast, Frugal, and (Sometimes) Wrong

Cass R. Sunstein
University of Chicago Law School and Department of Political Science
Originally published in 2005

Abstract

Do moral heuristics operate in the moral domain? If so, do they lead to moral errors? This brief essay offers an affirmative answer to both questions. In so doing, it responds to an essay by Gerd Gigerenzer on the nature of heuristics, moral and otherwise. While focused on morality, the discussion bears on the general debate between those who emphasize cognitive errors, sometimes produced by heuristics, and those who emphasize the frequent success of heuristics in producing sensible judgments in the real world. General claims are that it is contentious to see moral problems as ones of arithmetic, and that arguments about moral heuristics will often do well to steer clear of contentious arguments about what morality requires.

(cut)

But no one should deny that in many contexts, moral and other heuristics, in the form of simple rules of thumb, lead to moral error on any plausible view of morality. Consider, for example, the idea, emphasized by Gigerenzer, that one ought to do as the majority does, a source of massive moral blunders (see Sunstein, 2003). Or consider the fast and frugal idea that one ought not to distort the truth—a heuristic that generally works well, but that also leads (in my view) to moral error when, for example, the distortion is necessary to avoid significant numbers of deaths. Or consider the act-omission distinction, which makes moral sense in many domains, but which can lead to unsupportable moral judgments as well (Baron, 2004).

The entire article is here.

Monday, November 7, 2011

Nonrational processes in ethical decision making

By Rogerson, Mark D.; Gottlieb, Michael C.; Handelsman, Mitchell M.; Knapp, Samuel; Younggren, Jeffrey

American Psychologist, Vol 66(7), Oct 2011, 614-623.

Abstract

Most current ethical decision-making models provide a logical and reasoned process for making ethical judgments, but these models are empirically unproven and rely upon assumptions of rational, conscious, and quasilegal reasoning. Such models predominate despite the fact that many nonrational factors influence ethical thought and behavior, including context, perceptions, relationships, emotions, and heuristics. For example, a large body of behavioral research has demonstrated the importance of automatic intuitive and affective processes in decision-making and judgment. These processes profoundly affect human behavior and lead to systematic biases and departures from normative theories of rationality. Their influence represents an important but largely unrecognized component of ethical decision making. We selectively review this work; provide various illustrations; and make recommendations for scientists, trainers, and practitioners to aid them in integrating the understanding of nonrational processes with ethical decision-making.
