Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Misconduct.

Monday, October 9, 2023

They Studied Dishonesty. Was Their Work a Lie?

Gideon Lewis-Kraus
The New Yorker
Originally published September 30, 2023

Here is an excerpt:

Despite a good deal of readily available evidence to the contrary, neoclassical economics took it for granted that humans were rational. Kahneman and Tversky found flaws in this assumption, and built a compendium of our cognitive biases. We rely disproportionately on information that is easily retrieved: a recent news article about a shark attack seems much more relevant than statistics about how rarely such attacks actually occur. Our desires are in flux—we might prefer pizza to hamburgers, and hamburgers to nachos, but nachos to pizza. We are easily led astray by irrelevant details. In one experiment, Kahneman and Tversky described a young woman who had studied philosophy and participated in anti-nuclear demonstrations, then asked a group of participants which inference was more probable: either “Linda is a bank teller” or “Linda is a bank teller and is active in the feminist movement.” More than eighty per cent chose the latter, even though it is a subset of the former. We weren’t Homo economicus; we were giddy and impatient, our thoughts hasty, our actions improvised. Economics tottered.
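The "Linda problem" turns on the conjunction rule of probability: a conjunction can never be more probable than either of its conjuncts, because everyone who is a feminist bank teller is also a bank teller. A minimal sketch over an invented toy population (the sample below is illustrative, not Kahneman and Tversky's data):

```python
# Each person in a hypothetical sample is a pair:
# (is_bank_teller, is_feminist).
population = [
    (True, True), (True, False), (False, True),
    (False, False), (True, True), (False, True),
]

# Fraction who are bank tellers at all.
p_teller = sum(t for t, f in population) / len(population)

# Fraction who are bank tellers AND feminists -- necessarily a subset.
p_teller_and_feminist = sum(t and f for t, f in population) / len(population)

# The conjunction rule: P(A and B) can never exceed P(A).
assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)
```

The assertion holds for any sample whatsoever, which is exactly why the eighty-per-cent answer in the experiment is a logical error rather than a difference of opinion.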

Behavioral economics emerged for public consumption a generation later, around the time of Ariely’s first book. Where Kahneman and Tversky held that we unconsciously trick ourselves into doing the wrong thing, behavioral economists argued that we might, by the same token, be tricked into doing the right thing. In 2008, Richard Thaler and Cass Sunstein published “Nudge,” which argued for what they called “libertarian paternalism”—the idea that small, benign alterations of our environment might lead to better outcomes. When employees were automatically enrolled in 401(k) programs, twice as many saved for retirement. This simple bureaucratic rearrangement improved a great many lives.

Thaler and Sunstein hoped that libertarian paternalism might offer “a real Third Way—one that can break through some of the least tractable debates in contemporary democracies.” Barack Obama, who hovered above base partisanship, found much to admire in the promise of technocratic tinkering. He restricted his outfit choices mostly to gray or navy suits, based on research into “ego depletion,” or the concept that one might exhaust a given day’s reservoir of decision-making energy. When, in the wake of the 2008 financial crisis, Obama was told that money “framed” as income was more likely to be spent than money framed as wealth, he enacted monthly tax deductions instead of sending out lump-sum stimulus checks. He eventually created a behavioral-sciences team in the White House. (Ariely had once found that our decisions in a restaurant are influenced by whoever orders first; it’s possible that Obama was driven by the fact that David Cameron, in the U.K., was already leaning on a “nudge unit.”)

The nudge, at its best, was modest—even a minor potential benefit at no cost pencilled out. In the Obama years, a pop-up on computers at the Department of Agriculture reminded employees that single-sided printing was a waste, and that advice reduced paper use by six per cent. But as these ideas began to intermingle with those in the adjacent field of social psychology, the reasonable notion that some small changes could have large effects at scale gave way to a vision of individual human beings as almost boundlessly pliable. Even Kahneman was convinced. He told me, “People invented things that shouldn’t have worked, and they were working, and I was enormously impressed by it.” Some of these interventions could be implemented from above. 


Monday, October 2, 2023

Research: How One Bad Employee Can Corrupt a Whole Team

Stephen Dimmock & William Gerken
Harvard Business Review
Originally posted March 5, 2018

Here is an excerpt:

In our research, we wanted to understand just how contagious bad behavior is. To do so, we examined peer effects in misconduct by financial advisors, focusing on mergers between financial advisory firms that each have multiple branches. In these mergers, financial advisors meet new co-workers from one of the branches of the other firm, exposing them to new ideas and behaviors.

We collected an extensive data set using the detailed regulatory filings available for financial advisors. We defined misconduct as customer complaints for which the financial advisor either paid a settlement of at least $10,000 or lost an arbitration decision. We observed when complaints occurred for each financial advisor, as well as for the advisor’s co-workers.

We found that financial advisors are 37% more likely to commit misconduct if they encounter a new co-worker with a history of misconduct. This result implies that misconduct has a social multiplier of 1.59 — meaning that, on average, each case of misconduct results in an additional 0.59 cases of misconduct through peer effects.
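The step from a 37% peer effect to a 1.59 social multiplier follows if each induced case of misconduct can itself induce further cases at the same rate, so the total settles into a geometric series. A short sketch of that arithmetic (an illustrative reading of the authors' numbers, not their estimation procedure):

```python
# Peer effect reported in the study: a 37% increase in the chance of
# misconduct after exposure to a co-worker with a misconduct history.
peer_effect = 0.37

# If induced cases compound at the same rate, the total per original case is
#   1 + 0.37 + 0.37**2 + ... = 1 / (1 - 0.37)
multiplier = 1 / (1 - peer_effect)
print(round(multiplier, 2))  # 1.59

# Of which everything beyond the original case arrives through peers.
additional_cases = multiplier - 1
print(round(additional_cases, 2))  # 0.59
```

So "a social multiplier of 1.59" and "an additional 0.59 cases through peer effects" are two statements of the same quantity: the one direct case plus its downstream imitations.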

However, observing similar behavior among co-workers does not explain why this similarity occurs. Co-workers could behave similarly because of peer effects – in which workers learn behaviors or social norms from each other — but similar behavior could arise because co-workers face the same incentives or because individuals prone to making similar choices naturally choose to work together.

In our research, we wanted to understand how peer effects contribute to the spread of misconduct. We compared financial advisors across different branches of the same firm, because this allowed us to control for the effect of the incentive structure faced by all advisors in the firm. We also focused on changes in co-workers caused by mergers, because this allowed us to remove the effect of advisors choosing their co-workers. As a result, we were able to isolate peer effects.


Here is my summary: 

The article discusses a study that found that even the most honest employees are more likely to commit misconduct if they work alongside a dishonest individual. The study, conducted by Stephen Dimmock and William Gerken, found that financial advisors were 37% more likely to commit misconduct if they encountered a new co-worker with a history of misconduct.

The researchers believe that this is because people are more likely to learn bad behavior than good behavior. When we see someone else getting away with misconduct, it can make us think that it's okay to do the same thing. Additionally, when we're surrounded by people who are behaving badly, it can create a culture of acceptance for misconduct.

Monday, September 4, 2023

Amid Uncertainty About Francesca Gino’s Research, the Many Co-Authors Project Could Provide Clarity

Evan Nesterak
Behavioral Scientist
Originally posted August 30, 2023

Here are two excerpts:

“The scientific literature must be cleansed of everything that is fraudulent, especially if it involves the work of a leading academic,” the committee wrote. “No more time and money must be wasted on replications or meta-analyses of fabricated data. Researchers’ and especially students’ too rosy view of the discipline, caused by such publications, should be corrected.”

Stapel’s modus operandi was creating fictitious datasets or tampering with existing ones that he would then “analyze” himself, or pass along to other scientists, including graduate students, as if they were real.

“When the fraud was first discovered, limiting the harm it caused for the victims was a matter of urgency,” the committee said. “This was particularly the case for Mr. Stapel’s former Ph.D. students and postdoctoral researchers, whose publications were suddenly becoming worthless.”

Why revisit the decade-old case of Stapel now? 

Because its echoes can be heard in the unfolding case of Harvard Business School Professor Francesca Gino as she faces allegations of data fraud, and her coauthors, colleagues, and the broader scientific community figure out how to respond. Listening to these echoes, especially those of the Stapel committee, helps put the Gino situation, and the efforts to remedy it, in greater perspective.

(cut)

“After a comprehensive evaluation that took 18 months from start to completion, the investigation committee—comprising three senior HBS colleagues—determined that research misconduct had occurred,” his email said. “After reviewing their detailed report carefully, I could come to no other conclusion, and I accepted their findings.”

He added: “I ultimately accepted the investigation committee’s recommended sanctions, which included immediately placing Professor Gino on administrative leave and correcting the scientific record.”

While it is unclear how the lawsuit will play out, many scientists have expressed concern about the chilling effects it might have on scientists’ willingness to come forward if they suspect research misconduct. 

“If the data are not fraudulent, you ought to be able to show that. If they are, but the fraud was done by someone else, name the person. Suing individual researchers for tens of millions of dollars is a brazen attempt to silence legitimate scientific criticism,” psychologist Yoel Inbar commented on Gino’s statement on LinkedIn.

It is this sentiment that led 13 behavioral scientists (some of whom have coauthored with Gino) to create a GoFundMe campaign on behalf of Simonsohn, Simmons, and Nelson to help raise money for their legal defense. 

Monday, August 2, 2021

Landmark research integrity survey finds questionable practices are surprisingly common

Jop De Vrieze
Science Magazine
Originally posted July 7, 2021

More than half of Dutch scientists regularly engage in questionable research practices, such as hiding flaws in their research design or selectively citing literature, according to a new study. And one in 12 admitted to committing a more serious form of research misconduct within the past 3 years: the fabrication or falsification of research results.

This rate of 8% for outright fraud was more than double that reported in previous studies. Organizers of the Dutch National Survey on Research Integrity, the largest of its kind to date, took special precautions to guarantee the anonymity of respondents for these sensitive questions, says Gowri Gopalakrishna, the survey’s leader and an epidemiologist at Amsterdam University Medical Center (AUMC). “That method increases the honesty of the answers,” she says. “So we have good reason to believe that our outcome is closer to reality than that of previous studies.” The survey team published results on 6 July in two preprint articles, which also examine factors that contribute to research misconduct, on MetaArXiv.

When the survey began last year, organizers invited more than 60,000 researchers to take part—those working across all fields of research, both science and the humanities, at some 22 Dutch universities and research centers. However, many institutions refused to cooperate for fear of negative publicity, and responses fell short of expectations: Only about 6800 completed surveys were received. Still, that’s more responses than any previous research integrity survey, and the response rate at the participating universities was 21%—in line with previous surveys.

One of the preprints focuses on the prevalence of misbehavior—cases of fraud as well as a less severe category of “questionable research practices,” such as carelessly assessing the work of colleagues, poorly mentoring junior researchers, or selectively citing scientific literature. The other article focuses on responsible behavior; this includes correcting one’s own published errors, sharing research data, and “preregistering” experiments—posting hypotheses and protocols ahead of time to reduce the bias that can arise when these are released after data collection.

Friday, December 20, 2019

Can Ethics be Taught? Evidence from Securities Exams and Investment Adviser Misconduct

Kowaleski, Z., Sutherland, A. and Vetter, F.
Available at SSRN
Posted October 10, 2019

Abstract

We study the consequences of a 2010 change in the investment adviser qualification exam that reallocated coverage from the rules and ethics section to the technical material section. Comparing advisers with the same employer in the same location and year, we find those passing the exam with more rules and ethics coverage are one-fourth less likely to commit misconduct. The exam change appears to affect advisers’ perception of acceptable conduct, and not just their awareness of specific rules or selection into the qualification. Those passing the rules and ethics-focused exam are more likely to depart employers experiencing scandals. Such departures also predict future scandals. Our paper offers the first archival evidence on how rules and ethics training affects conduct and labor market activity in the financial sector.

From the Conclusion

Overall, our results can be understood through the lens of Becker’s model of crime (1968, 1992). In this model, “many people are constrained by moral and ethical considerations, and did not commit crimes even when they were profitable and there was no danger of detection… The amount of crime is determined not only by the rationality and preferences of would-be criminals, but also by the economic and social environment created by… opportunities for employment, schooling, and training programs.” (Becker 1992, pp. 41-42). In our context, ethics training can affect an individual’s behavior by increasing the value of their reputation, as well as the psychological costs of committing misconduct. But such effects will be moderated by the employer’s culture, which affects the stigma of offenses, as well as the individual’s beliefs about appropriate conduct.

The research is here.

Tuesday, July 16, 2019

Experts Recommend SCOTUS Adopt Code of Ethics to Promote Accountability

Jerry Lambe
www.lawandcrime.com
Originally posted June 24, 2019

Here is an excerpt:

While the high court’s justices must already abide by an ethical code, many of the experts noted that the current code does not sufficiently address modern ethical standards.

“The impartiality of our judiciary should be beyond reproach, so having a basic ethics code for its members to follow is a natural outgrowth of that common value, one that should be no less rigorously applied to our nation’s highest court,” Roth testified.

He added that disclosures from the court are particularly opaque, especially when sought out by the general public.

“To the outside observer, the current protocol makes it seem as if the judiciary is hiding something. […] With members of the judiciary already filling out and filing their reports digitally, the public should obtain them the same way, without having my organization act as the middleman,” Roth said.

Frost told the subcommittee that holding a hearing on the topic was a good first step in the process.

“Part of what I care about is not just the reality of impartial and fair justice, but the public’s perception of the courts,” she said, adding, “There have been signals by the justices that the court is considering rethinking adopting a code of ethics.”

The info is here.

Tuesday, July 9, 2019

A Waste of 1,000 Research Papers

Ed Yong
The Atlantic
Originally posted May 17, 2019

In 1996, a group of European researchers found that a certain gene, called SLC6A4, might influence a person’s risk of depression.

It was a blockbuster discovery at the time. The team found that a less active version of the gene was more common among 454 people who had mood disorders than in 570 who did not. In theory, anyone who had this particular gene variant could be at higher risk for depression, and that finding, they said, might help in diagnosing such disorders, assessing suicidal behavior, or even predicting a person’s response to antidepressants.

Back then, tools for sequencing DNA weren’t as cheap or powerful as they are today. When researchers wanted to work out which genes might affect a disease or trait, they made educated guesses, and picked likely “candidate genes.” For depression, SLC6A4 seemed like a great candidate: It’s responsible for getting a chemical called serotonin into brain cells, and serotonin had already been linked to mood and depression. Over two decades, this one gene inspired at least 450 research papers.

But a new study—the biggest and most comprehensive of its kind yet—shows that this seemingly sturdy mountain of research is actually a house of cards, built on nonexistent foundations.

Richard Border of the University of Colorado at Boulder and his colleagues picked the 18 candidate genes that have been most commonly linked to depression—SLC6A4 chief among them. Using data from large groups of volunteers, ranging from 62,000 to 443,000 people, the team checked whether any versions of these genes were more common among people with depression. “We didn’t find a smidge of evidence,” says Matthew Keller, who led the project.

The info is here.

Friday, March 22, 2019

We need to talk about systematic fraud

Jennifer Byrne
Nature 566, 9 (2019)
doi: 10.1038/d41586-019-00439-9

Here is an excerpt:

Some might argue that my efforts are inconsequential, and that the publication of potentially fraudulent papers in low-impact journals doesn’t matter. In my view, we can’t afford to accept this argument. Such papers claim to uncover mechanisms behind a swathe of cancers and rare diseases. They could derail efforts to identify easily measurable biomarkers for use in predicting disease outcomes or whether a drug will work. Anyone trying to build on any aspect of this sort of work would be wasting time, specimens and grant money. Yet, when I have raised the issue, I have had comments such as “ah yes, you’re working on that fraud business”, almost as a way of closing down discussion. Occasionally, people’s reactions suggest that ferreting out problems in the literature is a frivolous activity, done for personal amusement, or that it is vindictive, pursued to bring down papers and their authors.

Why is there such enthusiasm for talking about faulty research practices, yet such reluctance to discuss deliberate deception? An analysis of the Diederik Stapel fraud case that rocked the psychology community in 2011 has given me some ideas (W. Stroebe et al. Perspect. Psychol. Sci. 7, 670–688; 2012). Fraud departs from community norms, so scientists do not want to think about it, let alone talk about it. It is even more uncomfortable to think about organized fraud that is so frequently associated with one country. This becomes a vicious cycle: because fraud is not discussed, people don’t learn about it, so they don’t consider it, or they think it’s so rare that it’s unlikely to affect them, and so papers are less likely to come under scrutiny. Thinking and talking about systematic fraud is essential to solving this problem. Raising awareness and the risk of detection may well prompt new ways to identify papers produced by systematic fraud.

Last year, China announced sweeping plans to curb research misconduct. That’s a great first step. Next should be a review of publication quotas and cash rewards, and the closure of ‘paper factories’.

The info is here.

Monday, February 18, 2019

State ethics director resigns after porn, misconduct allegations

Richard Belcher
WSB-TV2
Originally published February 8, 2019

The director of the state Ethics Commission has resigned -- with a $45,000 severance -- and it’s still unknown whether accusations against him have been substantiated.

In January, Channel 2 Action News and The Atlanta Journal-Constitution broke the story that staff members at the Ethics Commission wrote letters accusing Stefan Ritter of poor work habits and of watching pornography in the office.

Ritter was placed on leave with pay to allow time to investigate the complaints.

Ritter continued to draw his $181,000 salary while the accusations against him were investigated, but he and the commission cut a deal before the investigation was over.

The info is here.

Thursday, May 10, 2018

A Two-Factor Model of Ethical Culture

Caterina Bulgarella
ethicalsystems.org

Making Progress in the Field of Business Ethics

Over the past 15 years, behavioral science has provided practitioners with a uniquely insightful perspective on the organizational elements companies need to focus on to build an ethical culture. Pieced together, this research can be used to address the growing challenges business must tackle today.

Faced with unprecedented complexity and rapid change, more and more organizations are feeling the limitations of an old-fashioned approach to ethics. In this new landscape, the importance of a proactive ethical stance has become increasingly clear. Not only is a strong focus on business integrity likely to reduce the costs of misconduct, but it can afford companies a solid corporate reputation, genuine employee compliance, robust governance, and even increased profitability.

The need for a smarter, deeper, and more holistic approach to ethical conduct is also strengthened by the inherent complexity of human behavior. As research continues to shed light on the factors that undermine people’s ability to ‘do the right thing,’ we are reminded of how difficult it is to solve for ethics without addressing the larger challenge of organizational culture.

The components that shape the culture of an organization exercise a constant and unrelenting influence on how employees process information, make decisions, and, ultimately, respond to ethical dilemmas. This is why, in order to help business achieve a deeper and more systematic ethical focus, we must understand the ingredients that make up an ethical culture.

The information is here.

Wednesday, May 2, 2018

Institutional Research Misconduct Reports Need More Credibility

Gunsalus CK, Marcus AR, Oransky I.
JAMA. 2018;319(13):1315–1316.
doi:10.1001/jama.2018.0358

Institutions have a central role in protecting the integrity of research. They employ researchers, own the facilities where the work is conducted, receive grant funding, and teach many students about the research process. When questions arise about research misconduct associated with published articles, scientists and journal editors usually first ask the researchers’ institution to investigate the allegations and then report the outcomes, under defined circumstances, to federal oversight agencies and other entities, including journals.

Depending on institutions to investigate their own faculty presents significant challenges. Misconduct reports, the mandated product of institutional investigations for which US federal dollars have been spent, have a wide range of problems. These include lack of standardization, inherent conflicts of interest that must be addressed to directly ensure credibility, little quality control or peer review, and limited oversight. Even when institutions act, the information they release to the public is often limited and unhelpful.

As a result, like most elements of research misconduct, little is known about institutions’ responses to potential misconduct by their own members. The community that relies on the integrity of university research does not have access to information about how often such claims arise, or how they are resolved. Nonetheless, there are some indications that many internal reviews are deficient.

The article is here.

Tuesday, April 17, 2018

Building A More Ethical Workplace Culture

PYMNTS
PYMNTS.com
Originally posted March 20, 2018

Here is an excerpt:

The Worst News

Among the positive findings in the report was the fact that reporting is on the rise by a whole 19 percent, with 69 percent of employees stating they had reported misconduct in the last two years.

But that number, Harned said, comes with a bitter side note. Retaliation has also spiked during the same time period, with 44 percent reporting it – up from 22 percent two years ago.

The rate of retaliation going up faster than the rate of reporting, Harned noted, is disturbing.

“That is a very real problem for employees, and I think over the last year, we’ve seen what a huge problem it has become for employers.”

The door-to-door on retaliation for reporting is short – about three weeks on average. That is just about the time it takes for firms – even those serious about doing a good job with improving compliance – to get any investigation up and organized.

“By then, the damage is already done,” said Harned. “We are better at seeing misconduct, but we aren’t doing enough to prevent it from happening – especially because retaliation is such a big problem.”

There are not easy solutions, Harned noted, but the good news – even in the face of the worst news – is that improvement is possible, and is even being logged in some segments. Employees, she stated, mostly come in the door with a moral compass to call their own, and want to work in environments that are healthy, not vicious.

“The answer is culture is everything: Companies need to constantly communicate to employees that conduct is the expectation for all levels of the organization, and that breaking those rules will always have consequences.”

The post is here.

Wednesday, April 11, 2018

How One Bad Employee Can Corrupt a Whole Team

Stephen Dimmock and William C. Gerken
Harvard Business Review
Originally posted March 5, 2018

One bad apple, the saying goes, can ruin the bunch. So, too, with employees.

Our research on the contagiousness of employee fraud tells us that even your most honest employees become more likely to commit misconduct if they work alongside a dishonest individual. And while it would be nice to think that the honest employees would prompt the dishonest employees to better choices, that’s rarely the case.

Among co-workers, it appears easier to learn bad behavior than good.

For managers, it is important to realize that the costs of a problematic employee go beyond the direct effects of that employee’s actions — bad behaviors of one employee spill over into the behaviors of other employees through peer effects. By under-appreciating these spillover effects, a few malignant employees can infect an otherwise healthy corporate culture.

History — and current events — are littered with outbreaks of misconduct among co-workers: mortgage underwriters leading up to the financial crisis, stock brokers at boiler rooms such as Stratton Oakmont, and cross-selling by salespeople at Wells Fargo.

The information is here.

Monday, January 15, 2018

The media needs to do more to elevate a national conversation about ethics

Arthur Caplan
Poynter.com
Originally December 21, 2017

Here is an excerpt:

Obviously, unethical conduct has been around forever and will be into the foreseeable future. That said, it is important that the leaders of this nation and, more importantly, those leading our key institutions and professions reaffirm their commitment to the view that there are higher values worth pursuing in a just society. The fact that so many fail to live up to basic values does not mean that the values are meaningless, wrong or misplaced. They aren’t. It is rather that the organizations and professions where the epidemic of moral failure is burgeoning have put other values, often power and profits, ahead of morality.

There is no simple fix for hypocrisy. Egoism, the gross abuse of power and self-indulgence, is a very tough moral opponent in an individualistic society like America. Short-term reward is deceptively more attractive than slogging out the virtues in the name of the long haul. If we are to prepare our children to succeed, then attending to their moral development is as important as anything we can do. If our leaders are to truly lead then we have to reward those who do, not those who don’t, won’t or can’t. Are we?

The article is here.

Monday, January 8, 2018

Advocacy group raises concerns about psychological evaluations on hundreds of defendants

Keith L. Alexander
The Washington Post
Originally published December 14, 2017

A District employee who has conducted mental evaluations on hundreds of criminal defendants as a forensic psychologist has been removed from that role after concerns surfaced about her educational qualifications, according to city officials.

Officials with the District’s Department of Health said Reston N. Bell was not qualified to conduct the assessments without the help or review of a supervisor. The city said it had mistakenly granted Bell, who was hired in 2016, a license to practice psychology, but this month the license was downgraded to “psychology associate.”

Although Bell has a master’s degree in psychology and a doctorate in education, she does not have a PhD in psychology, which led to the downgrade.

The article is here.

Monday, April 25, 2016

The Strict Liability Standard and Clinical Supervision

Paul D. Polychronis & Steven G. Brown
Professional Psychology: Research and Practice, Vol 47(2), Apr 2016, 139-146.

Abstract

Clinical supervision is essential to both the training of new psychologists and the viability of professional psychology. It is also a high-risk endeavor for clinical supervisors because of regulations in many states that impose a strict liability standard on supervisors for supervisees’ conduct. Applied in the context of tort law, the concept of strict liability makes supervisors responsible for supervisees’ actions without having to establish that a given supervisor was negligent or careless. Consequently, in jurisdictions where the strict liability standard is used, it is virtually inevitable that clinical supervisors will be named in civil suits over a supervisee’s actions regardless of whether a supervisor has been appropriately conscientious. In cases of supervisee misconduct, regulations in 27 of 51 jurisdictions (the 50 states plus the District of Columbia) generally hold clinical supervisors fully responsible for supervisees’ actions in a professional realm regardless of the nature of the supervisees’ misbehavior. Some examples are provided of language from these state regulations. The implications of this current reality are discussed. Altering the regulatory approach to clinical supervision is explored to reduce risk to clinical supervisors that is beyond their reasonable control. Recommendations for conducting clinical supervision and corresponding risk-management practices are suggested to assist clinicians in protecting themselves if practicing in a jurisdiction that uses the strict liability standard in regulations governing clinical supervision.

The article is here.

Sunday, February 9, 2014

Strengthening the Ethical Culture of Your Organization Should Be a Priority

Barbara Richman, SPHR

According to the 2011 National Business Ethics Survey, a report published by the Ethics Resource Center, the ethical culture of the American workplace is in transition. The survey, the seventh since 1994, was conducted for the purpose of understanding how employees at all levels view ethics and compliance at work.

Its overall results send mixed signals to employers. While positive indicators are included in the findings, they are clouded by “ominous warning signs of a potentially significant ethics decline ahead.”

On the positive side, the data revealed historically low levels of misconduct in the American workplace and near record high levels of employees reporting misconduct that they observed. On the negative side, however, there was a sharp rise in retaliation against employee whistleblowers, an increase in the percentage of employees who perceived pressure to compromise standards in order to do their jobs, and near record levels of companies with weak ethical cultures.

The entire story is here.

Friday, September 6, 2013

Dangerous Doctors Allowed to Keep Practicing

By Peter Eisler and Barbara Hansen
USA Today
Originally published August 20, 2013

Here is an excerpt:

Despite years of criticism, the nation's state medical boards continue to allow thousands of physicians to keep practicing medicine after findings of serious misconduct that puts patients at risk, a USA TODAY investigation shows. Many of the doctors have been barred by hospitals or other medical facilities; hundreds have paid millions of dollars to resolve malpractice claims. Yet their medical licenses — and their ability to inflict harm — remain intact.

The problem isn't universal. Some state boards have responded to complaints and become more transparent and aggressive in policing bad doctors.

But state and federal records still paint a grim picture of a physician oversight system that often is slow to act, quick to excuse problems, and struggling to manage workloads in an era of tight state budgets.

USA TODAY reviewed records from multiple sources, including the public file of the National Practitioner Data Bank, a federal repository set up to help medical boards track physicians' license records, malpractice payments, and disciplinary actions imposed by hospitals, HMOs and other institutions that manage doctors. By law, reports must be filed with the Data Bank when any of the nation's 878,000 licensed doctors face "adverse actions" — and the reports are intended to be monitored closely by medical boards.

The entire narrative and video story is here.

Sunday, July 7, 2013

Certain Age Groups May Encounter More Ethics Risk, Says New Report from the Ethics Resource Center

Ethics Resource Center
Press Release
Originally presented on June 24, 2013

Younger workers are more susceptible to experiencing ethical dilemmas on the job, the Ethics Resource Center (ERC) said today in “Generational Differences in Workplace Ethics,” a supplemental research report to its 2011 National Business Ethics Survey®. The new report takes an in-depth look at how employees of different generational cohorts are shaping today’s workplace.

The report delves into trends among four specific generational groups: Millennials, Generation X (Gen X’ers), Boomers, and Traditionalists. Each generation, shaped by significant world events and cultural trends, exhibits distinct differences when it comes to ethics. According to the study, certain age groups are more “at risk” than others on the four key measures of ethical performance: pressure to compromise standards, misconduct, reporting, and retaliation. For instance, the report reveals that the younger the workers, the more likely they are to feel pressure, observe misconduct, and experience retaliation for reporting.

Major findings from the survey include:

  • Almost half of Millennials (49 percent) observed workplace misconduct
  • The youngest workers (29 percent) were significantly more likely to experience retaliation than Gen X’ers (21 percent) and Baby Boomers (18 percent)
  • After witnessing misconduct, over half of employees in every age group reported it to their supervisor first

“It is important for companies to realize that each generation perceives ethics and culture differently from the others,” said ERC’s President, Dr. Patricia J. Harned. “However, business leaders should know they do not have to completely redesign their ethics and compliance programs. Implementing an effective ethics and compliance program and building a strong ethics culture will continue to make a difference for all employees. The key is communicating their commitment to ethics differently for different generations.”

This study is the most recent in a series of surveys conducted by the ERC. The ERC has fielded a biennial National Business Ethics Survey (NBES®) since 1994, providing business leaders a snapshot of workplace ethics trends. Throughout the years the NBES has been expanded into a series, making it possible to focus on specific areas of interest.

“Generational Differences” allows ERC to address challenges facing a workforce spanning multiple generations, and offers suggestions for business leaders on how to reach each generation.  This newest report was made possible in part by a generous contribution from Raytheon Company.

Download the entire supplemental report here.

Friday, May 18, 2012

Court of Appeal Says Psychologist Can Be Disciplined For Misconduct as Family Law Special Master

By MetNews Staff Writer
Metropolitan News-Enterprise
Originally Published May 11, 2012

The state Board of Psychology properly disciplined a licensee for unethical conduct while serving as a special master in a contentious family law case, the Third District Court of Appeal ruled yesterday.

The justices affirmed Sacramento Superior Court Judge Patrick Marlette’s denial of Dr. Randy Rand’s petition for writ of mandate. Rand was challenging the board’s order placing him on probation for five years, based on findings of unprofessional conduct, gross negligence, violation of statutes governing the practice of psychology, and dishonesty.

The entire article is here.

Thanks to Ken Pope for this information.