Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Sunday, February 25, 2018

The Moral Importance of Reflective Empathy

Ingmar Persson and Julian Savulescu
Neuroethics (2017). https://doi.org/10.1007/s12152-017-9350-7

Abstract

This is a reply to Jesse Prinz and Paul Bloom’s skepticism about the moral importance of empathy. It concedes that empathy is spontaneously biased toward individuals who are spatio-temporally close, as well as discriminatory in other ways, and incapable of accommodating large numbers of individuals. But it is argued that we could partly correct these shortcomings of empathy by the guidance of reason because empathy for others consists in imagining what they feel, and, importantly, such acts of imagination can be voluntary – and, thus, under the influence of reflection – as well as automatic. Since empathizing with others motivates concern for their welfare, a reflectively justified empathy will lead to a likewise justified altruistic concern. In addition, we argue that such concern supports another central moral attitude, namely a sense of justice or fairness.

From the Conclusion

All in all, the picture that emerges is this. We have beliefs about how other individuals feel and how we can help them to feel better. There are two sets of properties: (1) properties such that, if we believe individuals have any of them, spontaneous empathy with those individuals is facilitated, i.e. we are disposed to imagine spontaneously how they feel; and (2) properties such that, if we believe individuals have any of them, spontaneous empathy with them is hindered. In the former case, we will be spontaneously concerned about the well-being of these individuals; in the latter case, it will take voluntary reflection to empathize with, and be concerned about, the individuals in question. We also possess a sense of justice or fairness, which animates us not only to benefit those whom justice requires to be benefited, but also to harm those whom justice requires to be harmed.

The article can be accessed here.

Wednesday, February 1, 2017

Why It’s So Hard to Train Someone to Make an Ethical Decision

Eugene Soltes
Harvard Business Review
Originally posted January 11, 2017

Here is an excerpt:

The second factor distinguishing training exercises from real-life decision making is that training inevitably exposes different points of view and judgments. Although many organizations outwardly express a desire for a diversity of opinions, in practice those differing viewpoints are often stifled by the desire to agree or appease others. Even at the most senior levels of the organization, independent directors struggle to dissent. For instance, Dennis Kozlowski, the former CEO of Tyco who grew the firm from obscurity into a global conglomerate but later faced criminal charges for embezzlement, recalled the challenge of board members genuinely disagreeing and pushing back on him as CEO when the firm was performing well. “When the CEO is in the room, directors — even independent directors — tend to want to try to please him,” Kozlowski explained. “The board would give me anything I wanted. Anything.”

Finally, unlike in training, where a single decision might be given an hour of careful analysis, most actual decisions are made quickly and rely on intuition rather than careful, reflective reasoning. This can be especially problematic for moral decisions, which often rely on routine and intuitions that produce mindless judgments that don’t match up with how we’d desire to respond if we considered the decision with more time.

The article is here.

Editor's note: While I agree that it can be difficult to teach someone to make an ethical decision, perhaps we can develop alternative ways to teach ethical decision-making. Ethics education requires attention to how personal values blend with work responsibilities, to emotional reactions to ethical dilemmas, and to the biases and heuristics that shape decision-making in general and ethical decision-making in particular. If an individual feels pressure to make a decision, there are typically ways to slow the process down. Finally, ethics education can include quality-enhancement strategies, such as redundant protections and consultation, that improve the odds of better outcomes.

Monday, July 18, 2016

Cooperation, Fast and Slow: Meta-Analytic Evidence for a Theory of Social Heuristics and Self-Interested Deliberation

David G. Rand
Psychological Science (in press).

Abstract

Does cooperating require the inhibition of selfish urges? Or does “rational” self-interest constrain cooperative impulses? I investigated the role of intuition and deliberation in cooperation by meta-analyzing 67 studies in which cognitive-processing manipulations were applied to economic cooperation games (total N = 17,647; no indication of publication bias using Egger’s test, Begg’s test, or p-curve). My meta-analysis was guided by the Social Heuristics Hypothesis, which proposes that intuition favors behavior that typically maximizes payoffs, whereas deliberation favors behavior that maximizes one’s payoff in the current situation. Therefore, this theory predicts that deliberation will undermine pure cooperation (i.e., cooperation in settings where there are few future consequences for one’s actions, such that cooperating is never in one’s self-interest) but not strategic cooperation (i.e., cooperation in settings where cooperating can maximize one’s payoff). As predicted, the meta-analysis revealed 17.3% more pure cooperation when intuition was promoted relative to deliberation, but no significant difference in strategic cooperation between intuitive and deliberative conditions.

The article is here.
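
Editor's note: For readers curious about the publication-bias diagnostics named in the abstract, here is a minimal sketch of Egger's regression test in Python, one of the three checks the abstract mentions. The data below are simulated for illustration only; they are not the study-level effects from this meta-analysis, and the function is a generic textbook version of the test, not the author's analysis code.

import numpy as np
import statsmodels.api as sm

def eggers_test(effects, std_errors):
    # Egger's regression test for funnel-plot asymmetry: regress the
    # standardized effect (effect / SE) on precision (1 / SE). An intercept
    # significantly different from zero suggests small-study effects,
    # commonly read as a sign of publication bias.
    effects = np.asarray(effects, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    z = effects / std_errors              # standardized effect sizes
    precision = 1.0 / std_errors          # inverse standard errors
    X = sm.add_constant(precision)        # first column is the intercept
    fit = sm.OLS(z, X).fit()
    return fit.params[0], fit.pvalues[0]  # intercept estimate and its p-value

# Hypothetical inputs: simulated effect sizes and standard errors for 67
# studies (illustrative only, not data from the meta-analysis above).
rng = np.random.default_rng(0)
d = rng.normal(0.2, 0.1, size=67)
se = rng.uniform(0.05, 0.3, size=67)
intercept, p = eggers_test(d, se)
print(f"Egger intercept = {intercept:.3f}, p = {p:.3f}")

A non-significant intercept is consistent with the “no indication of publication bias” result reported in the abstract, though the published analysis was of course run on the actual studies.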

Thursday, April 28, 2016

The Visual Guide to Morality: Vision as an Integrative Analogy for Moral Experience, Variability and Mechanism

Chelsea Schein, Neil Hester and Kurt Gray
Social and Personality Psychology Compass 10/4 (2016): 231–251

Abstract

Analogies help organize, communicate and reveal scientific phenomena. Vision may be the best analogy for understanding moral judgment. Although moral psychology has long noted similarities between seeing and judging, we systematically review the “morality is like vision” analogy through three elements: experience, variability and mechanism. Both vision and morality are experienced as automatic, durable and objective. However, despite feelings of objectivity, both vision and morality show substantial variability across biology, culture and situation. The paradox of objective experience and cultural subjectivity is best understood through constructionism, as both vision and morality involve the flexible combination of more basic ingredients. Specifically, both vision and morality involve a mechanism that demonstrates Gestalt, combination and coherence. The “morality is like vision” analogy not only provides intuitive organization and compelling communication for moral psychology but also speaks to debates in the field, such as intuition versus reason, pluralism versus universalism and modularity versus constructionism.

The article is here.