Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Sunday, January 21, 2024

Doctors With Histories of Big Malpractice Settlements Now Work for Insurers

P. Rucker, D. Armstrong, & D. Burke
Propublica.org
Originally published 15 Dec 23

Here is an excerpt:

Patients and the doctors who treat them don’t get to pick which medical director reviews their case. An anesthesiologist working for an insurer can overrule a patient’s oncologist. In other cases, the medical director might be a doctor like Kasemsap who has left clinical practice after multiple accusations of negligence.

As part of a yearlong series about how health plans refuse to pay for care, ProPublica and The Capitol Forum set out to examine who insurers picked for such important jobs.

Reporters could not find any comprehensive database of doctors working for insurance companies or any public listings by the insurers who employ them. Many health plans also farm out medical reviews to other companies that employ their own doctors. ProPublica and The Capitol Forum identified medical directors through regulatory filings, LinkedIn profiles, lawsuits and interviews with insurance industry insiders. Reporters then checked those names against malpractice databases, state licensing board actions and court filings in 17 states.

Among the findings: The Capitol Forum and ProPublica identified 12 insurance company doctors with either a history of multiple malpractice payments, a single payment in excess of $1 million or a disciplinary action by a state medical board.

One medical director settled malpractice cases with 11 patients, some of whom alleged he bungled their urology surgeries and left them incontinent. Another was reprimanded by a state medical board for behavior that it found to be deceptive and dishonest. A third settled a malpractice case for $1.8 million after failing to identify cancerous cells on a pathology slide, which delayed a diagnosis for a 27-year-old mother of two, who died less than a year after her cancer was finally discovered.

None of this would have been easily visible to patients seeking approvals for care or payment from insurers who relied on these medical directors.


The ethical implications in this article are staggering.  Here are some quick points:

Conflicted Care: In a concerning trend, some U.S. insurers employ doctors with histories of malpractice settlements to decide whether patients merit coverage for recommended treatments. Do these still-licensed reviewers actually understand best practices?

Financial Bias: Critics fear that these doctors, having faced financial repercussions for past care decisions, might prioritize minimizing payouts over patient needs, potentially leading to denied claims and delayed care. In other words, do the reviewers carry an inherent bias against patients, given that their own former patients brought complaints against them?

Transparency Concerns: The lack of clear disclosure about these doctors' backgrounds raises concerns about transparency and potential conflicts of interest within the healthcare system.

In essence, this is a poor system for providing high-quality medical review.

Tuesday, December 12, 2023

Health Insurers Have Been Breaking State Laws for Years

Maya Miller and Robin Fields
ProPublica.org
Originally published 16 Nov 23

Here is an excerpt:

State insurance departments are responsible for enforcing these laws, but many are ill-equipped to do so, researchers, consumer advocates and even some regulators say. These agencies oversee all types of insurance, including plans covering cars, homes and people’s health. Yet they employed fewer people last year than they did a decade ago. Their first priority is making sure plans remain solvent; protecting consumers from unlawful denials often takes a backseat.

“They just honestly don’t have the resources to do the type of auditing that we would need,” said Sara McMenamin, an associate professor of public health at the University of California, San Diego, who has been studying the implementation of state mandates.

Agencies often don’t investigate health insurance denials unless policyholders or their families complain. But denials can arrive at the worst moments of people’s lives, when they have little energy to wrangle with bureaucracy. People with plans purchased on HealthCare.gov appealed less than 1% of the time, one study found.

ProPublica surveyed every state’s insurance agency and identified just 45 enforcement actions since 2018 involving denials that have violated coverage mandates. Regulators sometimes treat consumer complaints as one-offs, forcing an insurer to pay for that individual’s treatment without addressing whether a broader group has faced similar wrongful denials.

When regulators have decided to dig deeper, they’ve found that a single complaint is emblematic of a systemic issue impacting thousands of people.

In 2017, a woman complained to Maine’s insurance regulator, saying her carrier, Aetna, broke state law by incorrectly processing claims and overcharging her for services related to the birth of her child. After being contacted by the state, Aetna acknowledged the mistake and issued a refund.


Here's my take:

The article explores the ethical issues surrounding health insurance denials that violate state laws. The investigation reveals a pattern of insurers systematically denying coverage for medically necessary treatments, even when those denials directly contravene state laws designed to protect patients. The practice spans multiple states, indicating a systemic problem within the industry. Patients are often left in precarious situations, facing financial burdens and health risks from the denial of essential medical services, which raises questions about whether the industry prioritizes patient well-being or profit margins.

The article underscores the need for stronger regulatory scrutiny and enforcement to hold health insurance companies accountable for violating state laws and compromising patient care. It highlights insurers' ethical obligation to cover necessary medical treatments and to adhere to the legal frameworks that safeguard patient rights. By exposing the intersection of profit motives and ethical considerations within the industry, the investigation emphasizes the urgency of addressing these systemic issues so that patients receive the care they require without undue financial or health-related consequences.

Friday, November 24, 2023

UnitedHealth faces class action lawsuit over algorithmic care denials in Medicare Advantage plans

Casey Ross and Bob Herman
Statnews.com
Originally posted 14 Nov 23

A class action lawsuit was filed Tuesday against UnitedHealth Group and a subsidiary alleging that they are illegally using an algorithm to deny rehabilitation care to seriously ill patients, even though the companies know the algorithm has a high error rate.

The class action suit, filed by the California-based Clarkson Law Firm on behalf of deceased patients who had a UnitedHealthcare Medicare Advantage plan and their families, follows the publication of a STAT investigation Tuesday. The investigation, cited by the lawsuit, found UnitedHealth pressured medical employees to follow an algorithm, which predicts a patient’s length of stay, to issue payment denials to people with Medicare Advantage plans. Internal documents revealed that managers within the company set a goal for clinical employees to keep patients’ rehab stays within 1% of the days projected by the algorithm.

The lawsuit, filed in the U.S. District Court of Minnesota, accuses UnitedHealth and its subsidiary, NaviHealth, of using the computer algorithm to “systematically deny claims” of Medicare beneficiaries struggling to recover from debilitating illnesses in nursing homes. The suit also cites STAT’s previous reporting on the issue.

“The fraudulent scheme affords defendants a clear financial windfall in the form of policy premiums without having to pay for promised care,” the complaint alleges. “The elderly are prematurely kicked out of care facilities nationwide or forced to deplete family savings to continue receiving necessary care, all because an [artificial intelligence] model ‘disagrees’ with their real live doctors’ recommendations.”
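To make the reporting concrete: the internal target described in the STAT investigation was to keep actual rehab stays within 1% of the algorithm's projected length of stay. The sketch below is a hypothetical illustration of what such an adherence check amounts to arithmetically; the function name and parameters are my own, not anything from UnitedHealth's or NaviHealth's actual system.

```python
def within_target(actual_days: float, predicted_days: float,
                  tolerance: float = 0.01) -> bool:
    """Illustrative only: does the actual stay fall within `tolerance`
    (default 1%) of the algorithm's predicted length of stay?"""
    return abs(actual_days - predicted_days) <= tolerance * predicted_days

# A 100-day prediction leaves a window of just +/- 1 day:
print(within_target(100.5, 100))  # True  (0.5 days off)
print(within_target(103, 100))    # False (3 days off)
```

The point of the arithmetic is the ethical concern: a 1% window on a multi-week prediction leaves essentially no room for a patient's recovery to deviate from the model, which is why the plaintiffs argue the target effectively substitutes the algorithm's output for clinical judgment.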


Here are some of my concerns:

The use of algorithms in healthcare decision-making has raised a number of ethical concerns. Some critics argue that algorithms can be biased and discriminatory, leading to decisions that are not in patients' best interests. Others argue that algorithms can lack transparency, making it difficult for patients to understand how decisions about their care are being made.

The lawsuit against UnitedHealth raises a number of specific ethical concerns. First, the plaintiffs allege that UnitedHealth's algorithm is based on inaccurate and incomplete data. This raises the concern that the algorithm may be making decisions that are not based on sound medical evidence. Second, the plaintiffs allege that UnitedHealth has failed to adequately train its employees on how to use the algorithm. This raises the concern that employees may be making decisions that are not in the best interests of patients, either because they do not understand how the algorithm works or because they are pressured to deny claims.

The lawsuit also raises the general question of whether algorithms should be used to make healthcare decisions. Some argue that algorithms can be used to make more efficient and objective decisions than humans can. Others argue that algorithms are not capable of making complex medical decisions that require an understanding of the individual patient's circumstances.

The use of algorithms in healthcare is a complex issue with no easy answers. It is important to carefully consider the potential benefits and risks of using algorithms before implementing them in healthcare settings.