Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Social Pressure.

Saturday, August 5, 2023

Cheap promises: Evidence from loan repayment pledges in an online experiment

Bhanot, S. P. (2017).
Journal of Economic Behavior & Organization, 142, 250-261.

Abstract

Across domains, people struggle to follow through on their commitments. This can happen for many reasons, including dishonesty, forgetfulness, or insufficient intrinsic motivation. Social scientists have explored the reasons for persistent failures to follow through, suggesting that eliciting explicit promises can be an effective way to motivate action. This paper presents a field experiment that tests the effect of explicit promises, in the form of “honor pledges,” on loan repayment rates. The experiment was conducted with LendUp, an online lender, and targeted 4,883 first-time borrowers with the firm. Individuals were randomized into four groups, with the following experimental treatments: (1) having no honor pledge to complete (control); (2) signing a given honor pledge; (3) re-typing the same honor pledge as in (2) before signing; and (4) coming up with a personal honor pledge to type and sign. I also randomized whether or not borrowers were reminded of the honor pledge they signed prior to the repayment deadline. The results suggest that the honor pledge treatments had minimal impacts on repayment, and that reminders of the pledges were similarly ineffective. This suggests that borrowers who fail to repay loans do so not because of dishonesty or behavioral biases, but because they suffer from true financial hardship and are simply unable to repay.
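For readers who want a concrete picture of the design, here is a minimal Python sketch (not the authors' code) of the 4 x 2 randomization the abstract describes. The four arm labels, the sample size, and the cross-randomized reminder come from the abstract; the seed, function name, and data structure are illustrative assumptions.

import random

PLEDGE_ARMS = [
    "control_no_pledge",       # (1) no honor pledge to complete
    "sign_given_pledge",       # (2) sign a given honor pledge
    "retype_and_sign_pledge",  # (3) re-type the same pledge before signing
    "write_own_pledge",        # (4) write and sign a personal honor pledge
]

def randomize_borrowers(n_borrowers=4883, seed=1):
    """Assign each borrower to a pledge arm and a reminder condition (sketch)."""
    rng = random.Random(seed)
    assignments = []
    for borrower_id in range(n_borrowers):
        arm = rng.choice(PLEDGE_ARMS)
        assignments.append({
            "borrower_id": borrower_id,
            "pledge_arm": arm,
            # Reminder emails only apply to borrowers who actually signed a pledge.
            "reminder": arm != "control_no_pledge" and rng.random() < 0.5,
        })
    return assignments

if __name__ == "__main__":
    print(randomize_borrowers()[0])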

Discussion

Literature in experimental economics and psychology often finds impacts of promises and explicit honor pledges on behavior, and in particular on reducing dishonest behavior. However, the results of this field experiment suggest no meaningful effects from an explicit promise (and indeed, a salient promise) on loan repayment behavior in a real-world setting, with money at stake. Furthermore, a self-written honor pledge was no more efficacious than any other, and altering the salience of the honor pledge, both at loan initiation and in reminder emails, had negligible impacts on outcomes. In other words, I find no evidence for the hypotheses that salience, reminders, or personalization strengthen the impact of a promise on behavior.  Indeed, the results of the study suggest that online loan repayment is a domain where such behavioral tools do not have an impact on decisions. This is a significant result, because it provides insights into why borrowers might fail to repay loans; most notably, it suggests that the failure to repay short-term loans may not be a question of dishonest behavior or behavioral biases, but rather an indication of true financial hardship. Simply put, when repayment is not financially possible, framing, reminders, or other interventions utilizing behavioral science are of limited use.

Thursday, July 12, 2018

Learning moral values: Another's desire to punish enhances one's own punitive behavior

FeldmanHall O, Otto AR, Phelps EA.
J Exp Psychol Gen. 2018 Jun 7. doi: 10.1037/xge0000405.

Abstract

There is little consensus about how moral values are learned. Using a novel social learning task, we examine whether vicarious learning impacts moral values, specifically fairness preferences, during decisions to restore justice. In both laboratory and Internet-based experimental settings, we employ a dyadic justice game where participants receive unfair splits of money from another player and respond resoundingly to the fairness violations by exhibiting robust nonpunitive, compensatory behavior (baseline behavior). In a subsequent learning phase, participants are tasked with responding to fairness violations on behalf of another participant (a receiver) and are given explicit trial-by-trial feedback about the receiver's fairness preferences (e.g., whether they prefer punishment as a means of restoring justice). This allows participants to update their decisions in accordance with the receiver's feedback (learning behavior). In a final test phase, participants again directly experience fairness violations. After learning about a receiver who prefers highly punitive measures, participants significantly enhance their own endorsement of punishment during the test phase compared with baseline. Computational learning models illustrate that the acquisition of these moral values is governed by a reinforcement mechanism, revealing that it takes as little as being exposed to the preferences of a single individual to shift one's own desire for punishment when responding to fairness violations. Together, this suggests that even in the absence of explicit social pressure, fairness preferences are highly labile.
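As a concrete illustration of the kind of reinforcement mechanism the abstract invokes, here is a minimal delta-rule sketch in Python. The function name, learning rate, and 0-to-1 coding of the receiver's feedback are illustrative assumptions, not the authors' fitted model.

def update_punitiveness_estimate(estimate, feedback, learning_rate=0.2):
    """Move the current estimate of the receiver's punitiveness toward the observed feedback.

    estimate: current belief (0 = purely compensatory, 1 = highly punitive)
    feedback: the receiver's stated preference on this trial (0 or 1)
    """
    prediction_error = feedback - estimate
    return estimate + learning_rate * prediction_error

# Example: ten trials of feedback from a highly punitive receiver.
belief = 0.0
for trial_feedback in [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]:
    belief = update_punitiveness_estimate(belief, trial_feedback)
print(round(belief, 2))  # the estimate drifts toward 1, i.e., toward endorsing punishment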

The research is here.

Wednesday, February 1, 2017

Why It’s So Hard to Train Someone to Make an Ethical Decision

Eugene Soltes
Harvard Business Review
Originally posted January 11, 2017

Here is an excerpt:

The second factor distinguishing training exercises from real-life decision making is that training inevitably exposes different points of view and judgments. Although many organizations outwardly express a desire for a diversity of opinions, in practice those differing viewpoints are often stifled by the desire to agree or appease others. Even at the most senior levels of the organization, independent directors struggle to dissent. For instance, Dennis Kozlowski, the former CEO of Tyco who grew the firm from obscurity into a global conglomerate but later faced criminal charges for embezzlement, recalled the challenge of board members genuinely disagreeing and pushing back on him as CEO when the firm was performing well. “When the CEO is in the room, directors — even independent directors — tend to want to try to please him,” Kozlowski explained. “The board would give me anything I wanted. Anything.”

Finally, unlike in training, when a single decision might be given an hour of careful analysis, most actual decisions are made quickly and rely on intuition rather than careful, reflective reasoning. This can be especially problematic for moral decisions, which often rely on routine and intuitions that produce mindless judgments that don’t match up with how we’d desire to respond if we considered the decision with more time.

The article is here.

Editor's note: While I agree that it can be difficult to teach someone to make an ethical decision, maybe we can develop alternative ways to teach ethical decision-making. Ethics education requires attention to how personal values blend with work responsibilities, to emotional reactions to ethical dilemmas, and to the biases and heuristics that shape decision-making in general and ethical decisions in particular. If an individual feels pressure to make a decision, there are typically ways to slow down the process. Finally, ethics education can include quality-enhancement strategies, such as redundant protections and consultation, that improve the odds of better outcomes.

Thursday, January 12, 2017

The Psychology of White-Collar Criminals

Eugene Soltes
The Atlantic
Originally posted December 14, 2016

Here is an excerpt:

Usually, a gut feeling that something will be harmful is enough of a deterrent. But when the harm is distant or abstract, this internal alarm doesn’t always go off. This absence of intuition about the harm creates a particular challenge for executives. Today, managerial decisions impact ever-greater numbers of people, and the distance between executives and the people their decisions affect continues to grow. In fact, many of the people most harmed or helped by executives’ decisions are those they will never identify or meet. In this less intimate world, age-old intuitions are not always well suited to sense the kinds of potential harms that people can cause in the business world.

Reflecting on these limits to human intuition, I came to a conclusion that I found humbling. Most people like to think that they have the right values to make it through difficult times without falling prey to the same failures as the convicted executives I got to know. But those who believe they would face the same situations with their current values and viewpoints tend to underestimate the influence of the pressures, cultures, and norms that surround executive decision making. Perhaps a little humility is in order, given that people seem to have some difficulty predicting how they’d act in that environment. “What we all think is, ‘When the big moral challenge comes, I will rise to the occasion,’ [but] there’s not actually that many of us that will actually rise to the occasion,” as one former CFO put it. “I didn’t realize I would be a felon.”

The article is here.

Monday, February 24, 2014

Would You Lie for Me?

By Vanessa K. Bohns
The New York Times Sunday Review
Originally published February 7, 2014

Here is an excerpt:

Countless studies have subsequently shown that we find it similarly difficult to resist social pressure from peers, friends and colleagues. Our decisions regarding everything from whether to turn the lights off when we leave a room to whether to call in sick to take a day off from work are affected by the actions and opinions of our neighbors and colleagues.

But what about those times when we are the ones trying to get someone to act unethically? Do we realize how much power we wield with a simple request, suggestion or dare? New research by my students and me suggests that we don’t.

The entire article is here.

The research article is here.

Abstract

We examined the psychology of “instigators,” people who surround an unethical act and influence the wrongdoer (the “actor”) without directly committing the act themselves. In four studies, we found that instigators of unethical acts underestimated their influence over actors. In Studies 1 and 2, university students enlisted other students to tell a “white lie” (Study 1) or commit a small act of vandalism (Study 2) after making predictions about how easy it would be to get their fellow students to do so. In Studies 3 and 4, online samples of participants responded to hypothetical vignettes, for example, about buying children alcohol and taking office supplies home for personal use. In all four studies, instigators failed to recognize the social pressure they levied on actors through simple unethical suggestions, that is, the discomfort actors would experience by making a decision that was inconsistent with the instigator’s suggestion.