Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Intuition.

Monday, October 1, 2012

Spontaneous giving and calculated greed


D. G. Rand, J. D. Greene & M. A. Nowak
Nature 489, 427–430. doi:10.1038/nature11467

Abstract

Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring ‘rational’ self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.
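
The abstract does not spell out the payoff structure of the economic games used, so as an illustration only, here is a minimal Python sketch of a standard one-shot public goods game, the kind of cooperation dilemma the paper studies. The endowment, multiplier, and group size are assumed values chosen for clarity, not the paper's actual parameters.

```python
# Minimal sketch of a one-shot public goods game payoff (illustrative
# parameters, not the study's actual design).

def public_goods_payoffs(contributions, endowment=0.40, multiplier=2.0):
    """Each player keeps whatever they do not contribute; the pooled
    contributions are multiplied and split equally among the group."""
    n = len(contributions)
    shared = multiplier * sum(contributions) / n
    return [round(endowment - c + shared, 2) for c in contributions]

# Cooperation is personally costly: in a mixed group the free-rider
# earns more than the contributors, even though everyone does better
# when everyone contributes than when no one does.
print(public_goods_payoffs([0.40, 0.40, 0.40, 0.00]))  # free-rider earns most
print(public_goods_payoffs([0.40, 0.40, 0.40, 0.40]))  # everyone cooperates
print(public_goods_payoffs([0.00, 0.00, 0.00, 0.00]))  # no one cooperates
```

Because the multiplier (2.0) is smaller than the group size (4), each unit contributed returns less than a unit to the contributor, which is what makes giving a genuine social dilemma rather than simple self-interest.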


Here is a portion of a review of this article:

The researchers wanted to know whether people's first impulse is cooperative or selfish. To find out, they started by looking at how quickly different people made their choices, and found that faster deciders were more likely to contribute to the common good. 

Next they forced people to go fast or to stop and think, and found the same thing: Faster deciders tended to be more cooperative, and the people who had to stop and think gave less.

Finally, the researchers tested their hypothesis by manipulating people's mindsets. They asked some people to think about the benefits of intuition before choosing how much to contribute. Others were asked to think about the virtues of careful reasoning. Once again, intuition promoted cooperation, and deliberation did the opposite.

While some might interpret the results as suggesting that cooperation is "innate" or "hard-wired," if anything they highlight the role of experience. People who had better opinions of those around them in everyday life showed more cooperative impulses in these experiments, and previous experience with these kinds of studies eroded those impulses. 



Monday, February 13, 2012

The New Synthesis in Moral Psychology

Sunday, December 18, 2011

The Psychology of Moral Reasoning

This article is found in the public domain here.

Monday, November 7, 2011

Nonrational processes in ethical decision making

By Mark D. Rogerson, Michael C. Gottlieb, Mitchell M. Handelsman, Samuel Knapp, and Jeffrey Younggren

American Psychologist, 66(7), October 2011, 614–623.

Abstract
Most current ethical decision-making models provide a logical and reasoned process for making ethical judgments, but these models are empirically unproven and rely upon assumptions of rational, conscious, and quasilegal reasoning. Such models predominate despite the fact that many nonrational factors influence ethical thought and behavior, including context, perceptions, relationships, emotions, and heuristics. For example, a large body of behavioral research has demonstrated the importance of automatic intuitive and affective processes in decision-making and judgment. These processes profoundly affect human behavior and lead to systematic biases and departures from normative theories of rationality. Their influence represents an important but largely unrecognized component of ethical decision making. We selectively review this work; provide various illustrations; and make recommendations for scientists, trainers, and practitioners to aid them in integrating the understanding of nonrational processes with ethical decision-making.

