Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Saturday, October 8, 2016

The Irrational Idea That Humans Are Mostly Irrational

Paul Bloom
The Atlantic
Originally posted September 16, 2016

Last summer I was at a moral psychology conference in Chile, listening to speaker after speaker discuss research into how people think about sexuality, crime, taxation, and other politically and socially fraught issues. The consensus was that human moral reasoning is a mess—irrational, contradictory, and incoherent.

And how could it be otherwise? The evolutionary psychologists in the room argued that our propensity to reason about right and wrong arises through social adaptations calibrated to enhance our survival and reproduction, not to arrive at consistent or objective truth. And according to the social psychologists, we are continually swayed by irrelevant factors, by gut feelings and unconscious motivations. As the primatologist Frans de Waal once put it, summing up the psychological consensus: “We celebrate rationality, but when push comes to shove we assign it little weight.”

I think that this is mistaken. Yes, our moral capacities are far from perfect. But—as I’ve argued elsewhere, including in my forthcoming book on empathy—we are often capable of objective moral reasoning. And so we can arrive at novel, sometimes uncomfortable, moral positions, as when men appreciate the wrongness of sexism or when people who really like the taste of meat decide that it’s better to go without.

The article is here.

Friday, October 7, 2016

The Difference Between Rationality and Intelligence

By David Hambrick and Alexander Burgoyne
The New York Times
Originally published September 16, 2016

Here is an excerpt:

Professor Morewedge and colleagues found that the computer training led to statistically large and enduring decreases in decision-making bias. In other words, the subjects were considerably less biased after training, even after two months. The decreases were larger for the subjects who received the computer training than for those who received the video training (though decreases were also sizable for the latter group). While there is scant evidence that any sort of “brain training” has any real-world impact on intelligence, it may well be possible to train people to be more rational in their decision making.

The article is here.

Three Ways To Prevent Getting Set Up For Ethical Failure

Ron Carucci
Forbes.com
Originally posted

Here are two excerpts:

To survive the injustice of unresolved competing goals, leaders, usually middle management, become self-protective, putting the focus of their team or department ahead of others. Such self-protection turns to self-interest as chronic pain persists from living in the gap between unrealistic demands and unfair resource allocation. Resentment turns to justification as people conclude, “I’m not going down with the ship.” And eventually, unfettered self-interest and its inherent justification become conscious choices to compromise, usually from a sense of entitlement. People simply conclude, “I have no choice” or “I deserve this.” Says Jonathan Haidt, Professor of Business Ethics at NYU and founder of Ethical Systems, “Good people will do terrible things when people around them are even gently encouraging them to do so.” In many cases, that “gentle encouragement” comes in the form of simply ignoring what might provoke poor choices.

(cut)

3. Clarify decision rights. Organizational governance – which is different from “Corporate Governance” – is the distribution of authority, resources, and decision rights across an organization. Carefully designed, it synchronizes an organization and ensures natural tensions are openly managed. Knowing which leaders are accountable for which decisions and resources removes the uncertainty many organizations suffer from. When there is confusion about decision rights, competing priorities proliferate, setting the stage for organizational contradictions to arise.

The article is here.

Thursday, October 6, 2016

How Morality Changes in a Foreign Language

By Julie Sedivy
Scientific American
Originally published September 14, 2016

Here is an excerpt:

Why does it matter whether we judge morality in our native language or a foreign one? According to one explanation, such judgments involve two separate and competing modes of thinking—one of these, a quick, gut-level “feeling,” and the other, careful deliberation about the greatest good for the greatest number. When we use a foreign language, we unconsciously sink into the more deliberate mode simply because the effort of operating in our non-native language cues our cognitive system to prepare for strenuous activity. This may seem paradoxical, but is in line with findings that reading math problems in a hard-to-read font makes people less likely to make careless mistakes (although these results have proven difficult to replicate).

An alternative explanation is that differences arise between native and foreign tongues because our childhood languages vibrate with greater emotional intensity than do those learned in more academic settings. As a result, moral judgments made in a foreign language are less laden with the emotional reactions that surface when we use a language learned in childhood.

How Unconscious Bias Is Affecting Our Ability To Listen

Vivian Giang
Fast Company
Originally published September 8, 2016

Here is an excerpt:

Meghan Sumner, an associate professor of linguistics at Stanford University, stumbled into the unconscious-bias realm after years of investigating how listeners extract information from voices and how those pieces of information are stored in memory. In study after study, she found that we all listen differently based on where we’re from and our feelings toward different accents. It’s not a conscious choice but the result of social biases that form unconscious stereotypes, which in turn influence the way we listen.

"It’s not always what someone said, it’s also how they said it," Sumner tells Fast Company. "How we view people socially from their voice, influences how we attend to them, how we listen to them."

For instance, in one experiment, Sumner found that the "average American listener" preferred a "Southern Standard British English" voice to one with a New York City accent, even when both voices said the same words. Consequently, listeners remembered more of what the British speaker said and judged that speaker to be smarter. All of this is shaped by the stereotypes we hold about British people and New Yorkers.

The article is here.

Wednesday, October 5, 2016

Can Morality Be Taught?

Ashley Lamb-Sinclair
The Atlantic
Originally published September 14, 2016

Here is an excerpt:

I am especially disheartened, as are many Americans, when I consider the events of this past summer alone—bombings, riots, shootings—every bit of which derives from a need to identify and destroy the other, or, at the very least, a refusal to understand each other’s perspective. Then there is the presidential campaign, with Donald Trump proclaiming “the other” as the source of many societal ills.

Arguments abound regarding laws to pass and policies to implement as solutions to these issues. And while passing bills might feel like a solution—and in some ways it would be—policy can only go so far in changing habits and perception. The only surefire solution to developing tolerance and openness to the perspectives of others is through educating young people.

I believe that the problem is not what is taught in schools, but how it is taught. It is not enough to simply offer curriculum about the ills of racism, homophobia, or bullying, and then expect lasting results from students who are entrenched in cultural beliefs that are reinforced by society.

The article is here.

Is Robust Moral Realism a kind of Religious Belief?

John Danaher
Philosophical Disquisitions
Originally posted September 11, 2016

Robust moral realism is the view that moral facts exist, but that they are not reducible to non-moral or natural facts. According to the robust realist, when I say something like ‘It is morally wrong to torture an innocent child for fun’, I am saying something that is true, but whose truth is not reducible to the non-moral properties of torture or children. Robust moral realism has become surprisingly popular in recent years, with philosophers like Derek Parfit, David Enoch, Erik Wielenberg and Russell Shafer-Landau all defending versions of it.

What is interesting about these philosophers is that they are all avowedly non-religious in their moral beliefs. They don’t think there is any connection between morality and the truths of any particular religion. Indeed, several of them are explicitly atheistic in their moral outlook. In a recent paper, however, David Killoren has argued that robust moral realism is a kind of religious belief: one that must be held on faith and that shares other properties with popular religions. At the same time, he argues that it is an ‘excellent’ kind of religious belief, one that could be attractive to the non-religious and religious alike.

Tuesday, October 4, 2016

Whatever you think, you don’t necessarily know your own mind

Keith Frankish
aeon.co
Originally published May 27, 2016

Do you think racial stereotypes are false? Are you sure? I’m not asking if you’re sure whether or not the stereotypes are false, but if you’re sure whether or not you think that they are. That might seem like a strange question. We all know what we think, don’t we?

Most philosophers of mind would agree, holding that we have privileged access to our own thoughts, which is largely immune from error. Some argue that we have a faculty of ‘inner sense’, which monitors the mind just as the outer senses monitor the world. There have been exceptions, however. The mid-20th-century behaviourist philosopher Gilbert Ryle held that we learn about our own minds, not by inner sense, but by observing our own behaviour, and that friends might know our minds better than we do. (Hence the joke: two behaviourists have just had sex and one turns to the other and says: ‘That was great for you, darling. How was it for me?’) And the contemporary philosopher Peter Carruthers proposes a similar view (though for different reasons), arguing that our beliefs about our own thoughts and decisions are the product of self-interpretation and are often mistaken.

Replacing the Moral Foundations: An Evolutionary-Coalitional Theory of Liberal-Conservative Differences

Jeffrey S. Sinn, Matthew W. Hayes
Political Psychology
First published: August 2016

Abstract

Moral Foundations Theory (MFT) explains liberal-conservative differences as arising from different moral intuitions, with liberals endorsing “individualizing” foundations (Harm and Fairness) and conservatives also endorsing “binding” foundations (Ingroup, Authority, and Purity). We argue these labels misconstrue ideological differences and propose Evolutionary-Coalitional Theory (ECT) as an alternative, explaining how competitive dynamics in the ancestral social environment could produce the observed ideological differences. We test ECT against MFT across three studies. Study 1 shows the so-called “binding” orientation entails the threat-sensitivity and outgroup antagonism predicted by ECT; that is, an authoritarian motive. Similarly, Study 2 shows the so-called “individualizing” orientation is better described as a universalizing motive, one reflecting a broader set of moral commitments (e.g., to nature) and a broader sociality than the egocentrism implied by MFT. Study 3 provides a factor analysis reducing “binding” to authoritarianism and “individualizing” to universalism, with the latter loading against social dominance orientation (SDO). A hierarchical regression then provides additional evidence for ECT, showing this dominating motive (SDO) accounts for variance in conservatism that MFT leaves unexplained. Collectively, these three studies suggest that ECT offers a more accurate and precise explanation of the key psychological differences between liberals and conservatives.
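For readers unfamiliar with the hierarchical-regression step described in Study 3, here is a minimal sketch of the technique in Python with NumPy. It uses entirely synthetic data and hypothetical variable names, not the authors' measures or analysis code: it fits conservatism on the MFT foundation scores alone, then adds SDO and compares the variance explained, which is what "accounts for variance that MFT leaves unexplained" means in practice.

```python
# Illustrative hierarchical regression (synthetic data, hypothetical names).
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors: MFT foundation scores plus social dominance orientation (SDO).
individualizing = rng.normal(size=n)
binding = rng.normal(size=n)
sdo = 0.4 * binding + rng.normal(size=n)

# Hypothetical outcome: self-reported conservatism.
conservatism = 0.5 * binding - 0.3 * individualizing + 0.4 * sdo + rng.normal(size=n)

def r_squared(X, y):
    """OLS R^2 with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Step 1: MFT foundations only.
r2_mft = r_squared(np.column_stack([individualizing, binding]), conservatism)
# Step 2: add SDO; the increase in R^2 is the additional variance the dominance motive explains.
r2_full = r_squared(np.column_stack([individualizing, binding, sdo]), conservatism)

print(f"R^2 (MFT only):  {r2_mft:.3f}")
print(f"R^2 (MFT + SDO): {r2_full:.3f}  (delta R^2 = {r2_full - r2_mft:.3f})")
```

A nonzero increase in R^2 at step 2 (typically tested with an F-test in published work) is the pattern the abstract reports as evidence for ECT.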

The article is here.