Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Behavior. Show all posts

Wednesday, October 12, 2016

Why psychology lost its soul: everything comes from the brain

George Paxinos
The Conversation
Originally published September 22, 2016

Many people today believe they possess a soul. While conceptions of the soul differ, many would describe it as an “invisible force that appears to animate us”.

It’s often believed the soul can survive death and is intimately associated with a person’s memories, passions and values. Some argue the soul has no mass, takes no space and is localised nowhere.

But as a neuroscientist and psychologist, I have no use for the soul. On the contrary, all functions attributable to this kind of soul can be explained by the workings of the brain.

Psychology is the study of behaviour. To carry out their work of modifying behaviour, such as in treating addiction, phobia, anxiety and depression, psychologists do not need to assume people have souls. For psychologists, it is not so much that souls do not exist as that there is no need for them.

It is said psychology lost its soul in the 1930s. By this time, the discipline had fully become a science, relying on experimentation and control rather than introspection.

The article is here.

Tuesday, June 14, 2016

The Immoral Landscape? Scientists Are Associated with Violations of Morality

Rutjens BT, Heine SJ
(2016). PLoS ONE 11(4): e0152798.
doi:10.1371/journal.pone.0152798

Abstract

Do people think that scientists are bad people? Although surveys find that science is a highly respected profession, a growing discourse has emerged regarding how science is often judged negatively. We report ten studies (N = 2328) that investigated morality judgments of scientists and compared those with judgments of various control groups, including atheists. A persistent intuitive association between scientists and disturbing immoral conduct emerged for violations of the binding moral foundations, particularly when this pertained to violations of purity. However, there was no association in the context of the individualizing moral foundations related to fairness and care. Other evidence found that scientists were perceived as similar to others in their concerns with the individualizing moral foundations of fairness and care, yet as departing from others on all of the binding foundations of loyalty, authority, and purity. Furthermore, participants stereotyped scientists particularly as robot-like and lacking emotions, as well as valuing knowledge over morality and being potentially dangerous. The observed intuitive immorality associations are partially due to these explicit stereotypes but do not correlate with any perceived atheism. We conclude that scientists are perceived not as inherently immoral, but as capable of immoral conduct.

The article is here.

Monday, April 18, 2016

The Benjamin Franklin Effect

David McRaney
You Are Not So Smart Blog: A Celebration of Self Delusion
October 5, 2011

The Misconception: You do nice things for the people you like and bad things to the people you hate.

The Truth: You grow to like people for whom you do nice things and hate people you harm.

(cut)

Sometimes you can’t find a logical, moral or socially acceptable explanation for your actions. Sometimes your behavior runs counter to the expectations of your culture, your social group, your family or even the person you believe yourself to be. In those moments you ask, “Why did I do that?” and if the answer damages your self-esteem, a justification is required. You feel like a bag of sand has ruptured in your head, and you want relief. You can see the proof in an MRI scan of someone presented with political opinions which conflict with their own. The brain scans of a person shown statements which oppose their political stance show the highest areas of the cortex, the portions responsible for providing rational thought, get less blood until another statement is presented which confirms their beliefs. Your brain literally begins to shut down when you feel your ideology is threatened. Try it yourself. Watch a pundit you hate for 15 minutes. Resist the urge to change the channel. Don’t complain to the person next to you. Don’t get online and rant. Try and let it go. You will find this is excruciatingly difficult.

The blog post is here.

Note: How do you perceive complex patients or those who do not respond well to psychotherapy?

Friday, March 25, 2016

Probing the relationship between brain activity and moral judgments of children

ScienceCodex News
Originally published March 9, 2016

Here is an excerpt:

To determine whether the early automatic or later controlled neural activity predicted actual moral behavior, the researchers then assessed the children's generosity based on how many stickers they were willing to share with an anonymous child. They then correlated the children's generosity with individual differences in brain activity generated during helping versus harming scenes. Only differences in brain signals associated with deliberate neural processing predicted the children's sharing behavior, suggesting that moral behavior in children depends more on controlled reflection than on an immediate emotional response.

The article is here.

Monday, February 22, 2016

Morality is a muscle. Get to the gym.

Pascal-Emmanuel Gobry
The Week
Originally published January 18, 2016

Here is an excerpt:

Take the furor over "trigger warnings" in college classes and textbooks. One side believes that, in order to protect the sensitivities of some students, professors or writers should warn readers or students at the beginning of an article or course about controversial topics. Another side says that if someone can’t handle rough material, then he can stop reading or step out of the room, and that trigger warnings are an unconscionable affront to freedom of thought. Interestingly, both schools clearly believe that there is one moral stance which takes the form of a rule that should be obeyed always and everywhere. Always and everywhere we should have trigger warnings to protect people's sensibilities, or always and everywhere we should not.

Both sides need a lecture in virtue ethics.

If I try to stretch my virtue of empathy, it doesn’t seem at all absurd to me to imagine that, say, a young woman who has been raped might be made quite uncomfortable by a class discussion of rape in literature, and that this is something to which we should be sensitive. But the trigger warning people maybe should think more about the moral imperative to develop the virtue of courage, including intellectual courage. Then it seems to me that if you just put aside grand moral questions about freedom of inquiry, simple basic human courtesy would mean a professor would try to take into account a trauma victim's sensibilities while teaching sensitive material, and students would understand that part of the goal of a college class is to challenge them. We don't need to debate universal moral values, we just need to be reminded to exercise virtue more.

The article is here.

Monday, February 15, 2016

If You’re Loyal to a Group, Does It Compromise Your Ethics?

By Francesca Gino
Harvard Business Review
Originally posted January 06, 2016

Here are two excerpts:

Most of us feel loyalty, whether to our clan, our comrades, an organization, or a cause. These loyalties are often important aspects of our social identity. Once a necessity for survival and propagation of the species, loyalty to one’s in-group is deeply rooted in human evolution.

But the incidents of wrongdoing that capture the headlines make it seem like loyalty is all too often a bad thing, corrupting many aspects of our personal and professional lives. My recent research, conducted in collaboration with Angus Hildreth of the University of California, Berkeley, and Max Bazerman of Harvard Business School, suggests that this concern about loyalty is largely misplaced. In fact, we found loyalty to a group can increase, rather than decrease, honest behavior.

(cut)

As our research shows, loyalty can be a driver of good behavior, but when competition among groups is high, it can lead us to behave unethically. When we are part of a group of loyal members, traits associated with loyalty — such as honor, honesty, and integrity — are very salient in our minds. But when loyalty seems to demand a different type of goal, such as competing with other groups and winning at any cost, behaving ethically becomes a less important goal.

The article is here.

Tuesday, February 2, 2016

What Makes Us Cheat? Experiment 2

by Simon Oxenham
BigThink
Originally published January 13, 2016

Dan Ariely, the psychologist who popularised behavioral economics, has made a fascinating documentary exploring what makes us dishonest. I’ve just finished watching it and it’s something of a masterpiece of psychological storytelling, delving deep into contemporary tales of dishonesty, and supporting its narrative with cunningly designed experiments that have been neatly reconstructed for the film camera.

Self-Deception



The article is here.

Monday, February 1, 2016

What Makes Us Cheat? Experiment 1

by Simon Oxenham
BigThink
Originally published January 13, 2016

Dan Ariely, the psychologist who popularised behavioral economics, has made a fascinating documentary exploring what makes us dishonest. I’ve just finished watching it and it’s something of a masterpiece of psychological storytelling, delving deep into contemporary tales of dishonesty, and supporting its narrative with cunningly designed experiments that have been neatly reconstructed for the film camera.

Matrix Experiments and Big Cheaters vs Little Cheaters




The article is here.

Friday, January 29, 2016

Research suggests morality can survive without religion

By Brooks Hays
UPI
Originally posted January 13, 2016

Results from a longitudinal survey suggest morality hasn't declined with the decline of organized religion. The findings were published in the journal Politics and Religion.

"Religion has been in sharp decline in many European countries," study author Ingrid Storm, a researcher at Manchester University, said in a press release. "Each new generation is less religious than the one before, so I was interested to find out if there is any reason to expect moral decline."

Between 1981 and 2008, respondents from 48 European nations shared their attitudes toward a variety of moral and cultural transgressions.

In analyzing the responses, Storm differentiated between two types of moral offenses. The first category encompasses behavior that offends tradition or cultural norms, such as abortion or homosexuality. The second category includes crimes against the state and those harmful to others -- lying, cheating, stealing.

The article is here.

Thursday, January 21, 2016

Intuition, deliberation, and the evolution of cooperation

Adam Bear and David G. Rand
PNAS (2016), 113(4): 936–941. doi:10.1073/pnas.1517780113

Abstract

Humans often cooperate with strangers, despite the costs involved. A long tradition of theoretical modeling has sought ultimate evolutionary explanations for this seemingly altruistic behavior. More recently, an entirely separate body of experimental work has begun to investigate cooperation’s proximate cognitive underpinnings using a dual-process framework: Is deliberative self-control necessary to rein in selfish impulses, or does self-interested deliberation restrain an intuitive desire to cooperate? Integrating these ultimate and proximate approaches, we introduce dual-process cognition into a formal game-theoretic model of the evolution of cooperation. Agents play prisoner’s dilemma games, some of which are one-shot and others of which involve reciprocity. They can either respond by using a generalized intuition, which is not sensitive to whether the game is one-shot or reciprocal, or pay a (stochastically varying) cost to deliberate and tailor their strategy to the type of game they are facing. We find that, depending on the level of reciprocity and assortment, selection favors one of two strategies: intuitive defectors who never deliberate, or dual-process agents who intuitively cooperate but sometimes use deliberation to defect in one-shot games. Critically, selection never favors agents who use deliberation to override selfish impulses: Deliberation only serves to undermine cooperation with strangers. Thus, by introducing a formal theoretical framework for exploring cooperation through a dual-process lens, we provide a clear answer regarding the role of deliberation in cooperation based on evolutionary modeling, help to organize a growing body of sometimes-conflicting empirical results, and shed light on the nature of human cognition and social decision making.
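The core mechanism in the abstract — an agent pays a stochastically varying cost to deliberate only when the expected benefit of tailoring its strategy outweighs that cost — can be illustrated with a toy simulation. This is a rough sketch of the general logic, not the authors' actual model: the parameter names (p_repeated, coop_cost, d_max), the uniform cost distribution, and the payoff assumptions are all illustrative.

```python
import random

def deliberation_threshold(p_repeated, coop_cost):
    # Deliberation only changes the agent's action in one-shot games
    # (probability 1 - p_repeated), where it defects instead of
    # intuitively cooperating, saving the cooperation cost.
    return (1.0 - p_repeated) * coop_cost

def simulate(p_repeated, coop_cost, d_max, n_rounds, seed=0):
    """Fraction of rounds in which the agent chooses to deliberate."""
    rng = random.Random(seed)
    threshold = deliberation_threshold(p_repeated, coop_cost)
    deliberated = 0
    for _ in range(n_rounds):
        d = rng.uniform(0.0, d_max)  # stochastically varying cost
        if d < threshold:
            deliberated += 1
    return deliberated / n_rounds

# Deliberation becomes rarer as repeated interactions dominate,
# since tailoring one's strategy pays off less often (rate ~ 1 - p):
for p in (0.2, 0.5, 0.8):
    rate = simulate(p, coop_cost=1.0, d_max=1.0, n_rounds=100_000)
    print(f"p_repeated={p}: deliberation rate ≈ {rate:.2f}")
```

The sketch captures why, in environments dominated by reciprocal interactions, selection can favor agents who intuitively cooperate and rarely bother to deliberate.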

The article is here.

The Role of Compassion in Altruistic Helping and Punishment Behavior

Helen Y. Weng, Andrew S. Fox, Heather C. Hessenthaler, Diane E. Stodola, Richard J. Davidson
PLOS One
Published: December 10, 2015
DOI: 10.1371/journal.pone.0143794

Abstract

Compassion, the emotional response of caring for another who is suffering and that results in motivation to relieve suffering, is thought to be an emotional antecedent to altruistic behavior. However, it remains unclear whether compassion enhances altruistic behavior in a uniform way or is specific to sub-types of behavior such as altruistic helping of a victim or altruistic punishment of a transgressor. We investigated the relationship between compassion and subtypes of altruistic behavior using third-party paradigms where participants 1) witnessed an unfair economic exchange between a transgressor and a victim, and 2) had the opportunity to spend personal funds to either economically a) help the victim or b) punish the transgressor. In Study 1, we examined whether individual differences in self-reported empathic concern (the emotional component of compassion) were associated with greater altruistic helping or punishment behavior in two independent samples. For participants who witnessed an unfair transaction, trait empathic concern was associated with greater helping of a victim and had no relationship to punishment. However, in those who decided to punish the transgressor, participants who reported greater empathic concern decided to punish less. In Study 2, we directly enhanced compassion using short-term online compassion meditation training to examine whether altruistic helping and punishment were increased after two weeks of training. Compared to an active reappraisal training control group, the compassion training group gave more to help the victim and did not differ in punishment of the transgressor. Together, these two studies suggest that compassion is related to greater altruistic helping of victims and is not associated with or may mitigate altruistic punishment of transgressors.

The article is here.

Tuesday, December 8, 2015

'Fallout 4' tackles morality in an interesting way

By Antonio Villas-Boas
Business Insider
Originally published November 19, 2015

Here is an excerpt:

But the companions that accompany you throughout the game have unique perks that give you useful advantages in certain situations. The thing is, you need to gain their respect with specific behaviours and actions if you want access to those perks.

You can check which behaviours each companion prefers, but overall, they prefer that you are not a “bad” person.

Don’t murder innocent people, don’t use drugs, don’t pick locks to places or things that don’t belong to you, don’t pickpocket, and don’t steal. While you can do all those things in the game, you won’t win over most of your companions and you’ll make it harder to access their perks.

The companions turn out to be “Fallout 4’s” moral arbiters!

The entire article is here.

Wednesday, October 7, 2015

Reducing Bounded Ethicality: How to Help Individuals Notice and Avoid Unethical Behavior

By T. Zhang, P. O. Fletcher, F. Gino, and M. H. Bazerman

Executive Summary

Research on ethics has focused on the factors that help individuals act ethically when they are tempted to cheat. However, we know little about how best to help individuals notice unethical behaviors in others and in themselves. This paper identifies a solution: instilling a mindset of vigilance. In an experiment, individuals playing the role of financial advisers recommended one of four possible investments to their clients. Unbeknown to these advisers, one of the funds under consideration was actually a fraudulent feeder fund of Madoff Investment Securities. Results from this empirical study demonstrate that instilling vigilance by asking individuals to indicate their suspicions prior to making a decision was critical to helping them notice fraudulent behavior and act on that information. In contrast, committing to a decision prior to contemplating suspicions precluded individuals from subsequently integrating critical information about the fund’s fraudulent activity. We extend these findings to other interventions aimed to help managers notice unethical behavior.

The entire paper is here.

Friday, August 28, 2015

Deconstructing intent to reconstruct morality

Fiery Cushman
Current Opinion in Psychology
Volume 6, December 2015, Pages 97–103

Highlights

• Mental state inference is a foundational element of moral judgment.
• Its influence is usually captured by contrasting intentional and accidental harm.
• The folk theory of intentional action comprises many distinct elements.
• Moral judgment shows nuanced sensitivity to these constituent elements.
• Future research will profit from attention to the constituents of intentional action.

Mental state representations are a crucial input to human moral judgment. This fact is often summarized by saying that we restrict moral condemnation to ‘intentional’ harms. This simple description is the beginning of a theory, however, not the end of one. There is rich internal structure to the folk concept of intentional action, which comprises a series of causal relations between mental states, actions and states of affairs in the world. Moral judgment shows nuanced patterns of sensitivity to all three of these elements: mental states (like beliefs and desires), the actions that a person performs, and the consequences of those actions. Deconstructing intentional action into its elemental fragments will enable future theories to reconstruct our understanding of moral judgment.

The entire article is here.

Tuesday, August 18, 2015

What Emotions Are (and Aren’t)

By Lisa Feldman Barrett
The New York Times
Originally published July 31, 2015

Here is an excerpt:

Brain regions like the amygdala are certainly important to emotion, but they are neither necessary nor sufficient for it. In general, the workings of the brain are not one-to-one, whereby a given region has a distinct psychological purpose. Instead, a single brain area like the amygdala participates in many different mental events, and many different brain areas are capable of producing the same outcome. Emotions like fear and anger, my lab has found, are constructed by multipurpose brain networks that work together.

If emotions are not distinct neural entities, perhaps they have a distinct bodily pattern — heart rate, respiration, perspiration, temperature and so on?

Again, the answer is no.

The entire article is here.

Monday, August 17, 2015

Hormones and Ethics: Understanding the Biological Basis of Unethical Conduct.

Lee, Jooa Julie, Francesca Gino, Ellie Shuo Jin, Leslie K. Rice, and Robert A. Josephs.
Journal of Experimental Psychology: General (in press).

Abstract


Globally, fraud has been rising sharply over the last decade, with current estimates placing financial losses at greater than $3.7 trillion annually. Unfortunately, fraud prevention has been stymied by lack of a clear and comprehensive understanding of its underlying causes and mechanisms. In this paper, we focus on an important but neglected topic—the biological antecedents and consequences of unethical conduct—using salivary collection of hormones (testosterone and cortisol). We hypothesized that pre-performance cortisol would interact with pre-performance levels of testosterone to regulate cheating behavior in two studies. Further, based on the previously untested cheating-as-stress-reduction hypothesis, we predicted a dose-response relationship between cheating and reductions in cortisol and negative affect. Taken together, this research marks the first foray into the possibility that endocrine system activity plays an important role in the regulation of unethical behavior.

The entire article is here.

Monday, August 3, 2015

Cheeseburger ethics

By Eric Schwitzgebel
Aeon Magazine
Originally published July 15, 2015

Here are two excerpts:

Ethicists do not appear to behave better. Never once have we found ethicists as a whole behaving better than our comparison groups of other professors, by any of our main planned measures. But neither, overall, do they seem to behave worse. (There are some mixed results for secondary measures.) For the most part, ethicists behave no differently from professors of any other sort – logicians, chemists, historians, foreign-language instructors.

(cut)

‘Furthermore,’ she continues, ‘if we demand that ethicists live according to the norms they espouse, that will put major distortive pressures on the field. An ethicist who feels obligated to live as she teaches will be motivated to avoid highly self-sacrificial conclusions, such as that the wealthy should give most of their money to charity or that we should eat only a restricted subset of foods. Disconnecting professional ethicists’ academic enquiries from their personal choices allows them to consider the arguments in a more even-handed way. If no one expects us to act in accord with our scholarly opinions, we are more likely to arrive at the moral truth.’

The entire article is here.

Tuesday, May 12, 2015

Answering 'Why be good?' for a Three-Year-Old

By Christian B. Miller
Big Ideas at Slate.com

Here is an excerpt:

I would also mention to my son that the question of, “Why be good?” is especially important because most of us—myself included—are simply not good, morally speaking. We do not have a virtuous or good character. Why do I say that? You might think it is obvious based on watching the nightly news. But my answer is based on hundreds of psychological studies from the last 50 years. In a famous experiment, for instance, Yale psychologist Stanley Milgram found that many people would willingly shock an innocent person, even to the point of death, if pressured from an authority figure. Less well known, but also important, are the findings by Lisa Shu of the London Business School. She and her colleagues have found that cheating on tests dramatically increases when it becomes clear to the test-takers that they will not get caught.

So there is a virtuous way to be—honest, compassionate, etc.—and then there is how we tend to actually be, which is not virtuous. Instead our characters are very much a mixed bag, with many good moral tendencies and many bad ones too. Given that most of us are not virtuous people, the question becomes: Why should we bother to try to develop a better character? Why should we care about it? Does developing better character even matter?

The entire article is here.

Sunday, May 10, 2015

How Does Reasoning (Fail to) Contribute to Moral Judgment? Dumbfounding and Disengagement

Frank Hindriks
Ethical Theory and Moral Practice
April 2015, Volume 18, Issue 2, pp 237-250

Abstract

Recent experiments in moral psychology have been taken to imply that moral reasoning only serves to reaffirm prior moral intuitions. More specifically, Jonathan Haidt concludes from his moral dumbfounding experiments, in which people condemn other people’s behavior, that moral reasoning is biased and ineffective, as it rarely makes people change their mind. I present complementary evidence pertaining to self-directed reasoning about what to do. More specifically, Albert Bandura’s experiments concerning moral disengagement reveal that moral reasoning often does contribute effectively to the formation of moral judgments. And such reasoning need not be biased. Once this evidence is taken into account, it becomes clear that both cognition and affect can play a destructive as well as a constructive role in the formation of moral judgments.

The entire paper is here.

Monday, April 6, 2015

How (Un)ethical Are You?

Mahzarin R. Banaji, Max H. Bazerman, & Dolly Chugh
Harvard Business Review
Originally published in 2003

Here is an excerpt:

Bias That Emerges from Unconscious Beliefs

Most fair-minded people strive to judge others according to their merits, but our research shows how often people instead judge according to unconscious stereotypes and attitudes, or “implicit prejudice.” What makes implicit prejudice so common and persistent is that it is rooted in the fundamental mechanics of thought. Early on, we learn to associate things that commonly go together and expect them to inevitably coexist: thunder and rain, for instance, or gray hair and old age. This skill—to perceive and learn from associations—often serves us well.

But, of course, our associations only reflect approximations of the truth; they are rarely applicable to every encounter. Rain doesn’t always accompany thunder, and the young can also go gray. Nonetheless, because we automatically make such associations to help us organize our world, we grow to trust them, and they can blind us to those instances in which the associations are not accurate—when they don’t align with our expectations.

Because implicit prejudice arises from the ordinary and unconscious tendency to make associations, it is distinct from conscious forms of prejudice, such as overt racism or sexism. This distinction explains why people who are free from conscious prejudice may still harbor biases and act accordingly.

The entire article is here.