Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Confidentiality.

Saturday, June 10, 2023

Generative AI entails a credit–blame asymmetry

Porsdam Mann, S., Earp, B. et al. (2023).
Nature Machine Intelligence.

The recent releases of large-scale language models (LLMs), including OpenAI’s ChatGPT and GPT-4, Meta’s LLaMA, and Google’s Bard, have garnered substantial global attention, leading to calls for urgent community discussion of the ethical issues involved. LLMs generate text by representing and predicting statistical properties of language. Optimized for statistical patterns and linguistic form rather than for truth or reliability, these models cannot assess the quality of the information they use.

Recent work has highlighted ethical risks that are associated with LLMs, including biases that arise from training data; environmental and socioeconomic impacts; privacy and confidentiality risks; the perpetuation of stereotypes; and the potential for deliberate or accidental misuse. We focus on a distinct set of ethical questions concerning moral responsibility—specifically blame and credit—for LLM-generated
content. We argue that different responsibility standards apply to positive and negative uses (or outputs) of LLMs and offer preliminary recommendations. These include: calls for updated guidance from policymakers that reflect this asymmetry in responsibility standards; transparency norms; technology goals; and the establishment of interactive forums for participatory debate on LLMs.‌

(cut)

Credit–blame asymmetry may lead to achievement gaps

Since the Industrial Revolution, automating technologies have made workers redundant in many industries, particularly in agriculture and manufacturing. The recent assumption has been that creatives and knowledge workers would remain much less impacted by these changes in the near-to-mid-term future. Advances in LLMs challenge this premise.

How these trends will impact human workforces is a key but unresolved question. The spread of AI-based applications and tools such as LLMs will not necessarily replace human workers; it may simply
shift them to tasks that complement the functions of the AI. This may decrease opportunities for human beings to distinguish themselves or excel in workplace settings. Their future tasks may involve supervising or maintaining LLMs that produce the sorts of outputs (for example, text or recommendations) that skilled human beings were previously producing and for which they were receiving credit. Consequently, work in a world relying on LLMs might often involve ‘achievement gaps’ for human beings: good, useful outcomes will be produced, but many of them will not be achievements for which human workers and professionals can claim credit.

This may result in an odd state of affairs. If responsibility for positive and negative outcomes produced by LLMs is asymmetrical as we have suggested, humans may be justifiably held responsible for negative outcomes created, or allowed to happen, when they or their organizations make use of LLMs. At the same time, they may deserve less credit for AI-generated positive outcomes, as they may not be displaying the skills and talents needed to produce text, exerting judgment to make a recommendation, or generating other creative outputs.

Thursday, December 15, 2022

Dozens of telehealth startups sent sensitive health information to big tech companies

Katie Palmer, with Todd Feathers & Simon Fondrie-Teitler
STAT News
Originally posted 13 DEC 22

Here is an excerpt:

Health privacy experts and former regulators said sharing such sensitive medical information with the world’s largest advertising platforms threatens patient privacy and trust and could run afoul of unfair business practices laws. They also emphasized that privacy regulations like the Health Insurance Portability and Accountability Act (HIPAA) were not built for telehealth. That leaves “ethical and moral gray areas” that allow for the legal sharing of health-related data, said Andrew Mahler, a former investigator at the U.S. Department of Health and Human Services’ Office for Civil Rights.

“I thought I was at this point hard to shock,” said Ari Friedman, an emergency medicine physician at the University of Pennsylvania who researches digital health privacy. “And I find this particularly shocking.”

In October and November, STAT and The Markup signed up for accounts and completed onboarding forms on 50 telehealth sites using a fictional identity with dummy email and social media accounts. To determine what data was being shared by the telehealth sites as users completed their forms, reporters used Chrome DevTools, a tool built into Google’s Chrome browser, to examine the network traffic between the sites and third-party trackers.

On Workit’s site, for example, STAT and The Markup found that a piece of code Meta calls a pixel sent responses about self-harm, drug and alcohol use, and personal information — including first name, email address, and phone number — to Facebook.

The investigation found trackers collecting information on websites that sell everything from addiction treatments and antidepressants to pills for weight loss and migraines. Despite efforts to trace the data using the tech companies’ own transparency tools, STAT and The Markup couldn’t independently confirm how or whether Meta and the other tech companies used the data they collected.

After STAT and The Markup shared detailed findings with all 50 companies, Workit said it had changed its use of trackers. When reporters tested the website again on Dec. 7, they found no evidence of tech platform trackers during the company’s intake or checkout process.

“Workit Health takes the privacy of our members seriously,” Kali Lux, a spokesperson for the company, wrote in an email. “Out of an abundance of caution, we elected to adjust the usage of a number of pixels for now as we continue to evaluate the issue.”

Friday, November 18, 2022

When Patients Become Colleagues

Charles C. Dike
Psychiatric News
Published Online:27 Oct 2022

Dr. Jones, a psychiatrist in private practice, described to me a conundrum she was trying to resolve. A patient she has been treating for eight years with psychotherapy and medication was recently certified as a therapist. The patient intends to terminate treatment with her and set up a private practice in the same district as the psychiatrist. The new therapist is asking for a collaborative relationship with the psychiatrist in which he would refer patients to the psychiatrist for medication management. The psychiatrist is not comfortable with the proposal and worries that her deep knowledge of her ex-patient’s flaws would negatively influence her view of the patient as a therapist. Most importantly, however, she is concerned about the risks of boundary violations and a breach in confidentiality, for example, when patients ask about the relationship between the psychiatrist and their referring therapist, as often happens.

The APA Ethics Committee has received questions about similar situations. One such question involved a patient who had received psychiatric treatment at an institution for years and was now applying to work as a clinician at the same institution a decade later. In this case, the Ethics Committee affirmed the need for psychiatrists “to support the concept that treatment matters and that people can recover and live full lives by addressing the challenges of mental illness. Psychiatrists should model that seeking treatment is a healthful and positive behavior and not a stigmatized act that will forever preclude a person, once a patient, from joining a team of respected mental health professionals. A history of mental health treatment should not be used to ban employment; a history of appropriate qualifications and pursuit of necessary medical treatment should be positive indicators for employment.”

Nonetheless, every such situation requires deep reflection to avoid potential ethics breaches. In some cases, the guidance is clear. For example, it is unethical for a psychiatrist in a solo private practice to employ a former patient because the pre-existing doctor-patient relationship is likely to influence the working relationship on both sides with potential negative consequences. In Dr. Jones’s case, however, the situation has ethics considerations that need to be addressed. Here is the advice that I gave to Dr. Jones: After celebrating her patient’s success, she should schedule a private meeting to discuss the contours of their new professional relationship. She should clarify that it would be a challenge to be his psychiatrist in the future should he suffer a relapse and need care. Further, Dr. Jones should point out that a personal relationship with a former patient could be unethical, especially if intimate, and therefore, all social interactions should be avoided as much as possible. When it is not possible to avoid them, they should carefully manage their interactions, social or professional, making sure boundaries are not breached. Dr. Jones should also discuss possible circumstances that could insinuate to others that she and the therapist had a prior treatment relationship as any such acknowledgment on her part would be a breach of her patient’s confidentiality. The fact that her former patient discloses their relationship to others does not absolve the psychiatrist of this ethical injunction. Such a discussion would prevent future problems and set the stage for the next chapter of their relationship.

Friday, April 29, 2022

Navy Deputizes Psychologists to Enforce Drug Rules Even for Those Seeking Mental Health Help

Konstantin Toropin
Military.com
Originally posted 18 APR 22

In the wake of reports that a Navy psychologist played an active role in convicting a sailor of drug use after the sailor had reached out for mental health assistance, the service is standing by its policy, which does not provide patients with confidentiality and could mean that seeking help has consequences for service members.

The case highlights a set of military regulations that, in vaguely defined circumstances, requires doctors to inform commanding officers of certain medical details, including drug tests, even if those tests are conducted for legitimate medical reasons necessary for adequate care. Allowing punishment when service members are looking for help could act as a deterrent in a community where mental health is still a taboo topic among many, despite recent leadership attempts to more openly discuss getting assistance.

On April 11, Military.com reported the story of a sailor and his wife who alleged that the sailor's command, the destroyer USS Farragut, was retaliating against him for seeking mental health help.

Jatzael Alvarado Perez went to a military hospital to get help for his mental health struggles. As part of his treatment, he was given a drug test that came back positive for cannabinoids -- the family of drugs associated with marijuana. Perez denies having used any substances, but the test resulted in a referral to the ship's chief corpsman.

Perez's wife, Carli Alvarado, shared documents with Military.com that were evidence in the sailor's subsequent nonjudicial punishment, showing that the Farragut found out about the results because the psychologist emailed the ship's medical staff directly, according to a copy of the email.

"I'm not sure if you've been tracking, but OS2 Alvarado Perez popped positive for cannabis while inpatient," read the email, written to the ship's medical chief. Navy policy prohibits punishment for a positive drug test when administered as part of regular medical care.

The email goes on to describe efforts by the psychologist to assist in obtaining a second test -- one that could be used to punish Perez.

"We are working to get him a command directed urinalysis through [our command] today," it added.

Tuesday, March 8, 2022

Harvard Allegedly Obtained Title IX Complainant’s Outside Psychotherapy Records Without Her Consent

Colleen Flaherty
Inside Higher Ed
Originally published 10 FEB 22

Here are two excerpts:

Harvard provided background information about how its dispute resolution office works, saying that it doesn’t contact a party’s medical care provider except when a party has indicated that the provider has relevant information that the party wants the office to consider. In that case, the office receives information from the care provider only with the party’s consent.

Multiple legal experts said Wednesday that this is the established protocol across higher education.

Asked for more details about what happened, Kilburn’s lawyer, Carolin Guentert, said that Kilburn’s therapist is a private provider unaffiliated with Harvard, and “we understand that ODR contacted Ms. Kilburn’s therapist and obtained the psychotherapy notes from her sessions with Ms. Kilburn, without first seeking Ms. Kilburn’s written consent as required under HIPAA,” the Health Insurance Portability and Accountability Act of 1996, which governs patient privacy.

Asked if Kilburn ever signed a privacy waiver with her therapist that would have granted the university access to her records, Guentert said Kilburn “has no recollection of signing such a waiver, nor has Harvard provided one to us.”

(cut)

Even more seriously, these experts said that Harvard would have had no right to obtain Kilburn’s mental health records from a third-party provider without her consent.

Andra J. Hutchins, a Massachusetts-based attorney who specializes in education law, said that therapy records are protected by psychotherapist-patient privilege (something akin to attorney-client privilege).

“Unless the school has an agreement with and a release from the student to provide access to those records or speak to the student’s therapist—which can be the case if a student is placed on involuntary leave due to a mental health issue—there should be no reason that a school would be able to obtain a student’s psychotherapy records,” she said.

As far as investigations under Title IX (the federal law against gender-based discrimination in education) go, questions from the investigator seeking information about the student’s psychological records aren’t permitted unless the student has given written consent, Hutchins added. “Schools have to follow state and federal health-care privacy laws throughout the Title IX process. I can’t speculate as to how or why these records were released.”

Daniel Carter, president of Safety Advisors for Educational Campuses, said that “it is absolutely illegal and improper for an institution of higher education to obtain one of their students’ private therapy records from a third party. There’s no circumstance under which that is permissible without their consent.”

Thursday, July 2, 2020

Professional Psychology: Collection Agencies, Confidentiality, Records, Treatment, and Staff Supervision in New Jersey

SUPERIOR COURT OF NEW JERSEY
APPELLATE DIVISION
DOCKET NO. A-4975-17T3

In the Matter of the Suspension or Revocation of the License of L. Barry Helfmann, Psy.D.

Here are two excerpts:

The complaint included five counts. It alleged Dr. Helfmann failed to do the following: take reasonable measures to protect confidentiality of the Partnership's patients' private health information; maintain permanent records that accurately reflected patient contact for treatment purposes; maintain records of professional quality; timely release records requested by a patient; and properly instruct and supervise temporary staff concerning patient confidentiality and record maintenance. The Attorney General sought sanctions under the UEA.

(cut)

The regulation is clear. The doctor's argument to the contrary, that a psychologist could somehow confuse his collection attorney with a patient's authorized representative, is refuted by the regulation's plain language as well as consideration of its entire context. The doctor's argument is without sufficient merit to warrant further discussion. R. 2:11-3(e)(1)(E).

We find nothing arbitrary about the Board's rejection of Dr. Helfmann's argument that he violated no rule or regulation because he relied on the advice of counsel in providing the Partnership's collection attorney with patients' confidential information. His assertion is contrary to the sworn testimony of the collection attorney who was deposed, as distinguished from another collection attorney with whom the doctor spoke in the distant past. The latter attorney's purported statement that confidential information might be necessary to resolve a patient's outstanding fee does not consider, let alone resolve, the propriety of a psychologist releasing such information in the face of clear statutory and regulatory prohibitions.

The Board found that Dr. Helfmann, not his collection attorneys, was charged with the professional responsibility of preserving his patients' confidential information. Perhaps the doctor's argument that he relied on the advice of counsel would have had greater appeal had he asked for a legal opinion on providing confidential patient information to collection attorneys in view of the psychologist-patient privilege and a specific regulatory prohibition against doing so absent a statutory or traditional exception. That the Board found unpersuasive Dr. Helfmann's hearsay testimony about what attorneys told him years ago is hardly arbitrary and capricious, considering the Partnership's current collection attorney's testimony and Dr. Helfmann's statutory and regulatory obligations to preserve confidentiality.

The decision is here.

Wednesday, July 1, 2020

Unusual Legal Case: Small Social Circles, Boundaries, and Harm

This legal case shows how much our social circles interrelate and how easily boundaries can be violated.  If you ever believe that you are safe from boundary violations in our current, complex culture, you may want to rethink that position.  There is a lesson here for all of us.  I will excerpt a fascinating portion of this case below.

Roetzel & Andress
jdsupra.com
Originally posted 10 June 20

Possible Employer Vicarious Liability For Employee’s HIPAA Violation Even When Employee Engages In Unauthorized Act

Here is the excerpt:

When the plaintiff came in for her appointment, she handed the Parkview employee a filled-out patient information sheet. The employee then spent about one minute inputting that information into Parkview’s electronic health record. The employee recognized the plaintiff’s name as someone who had liked a photo of the employee’s husband on his Facebook account. Suspecting that the plaintiff might have had, or was then having, an affair with her husband, the employee sent some texts to her husband relating to the fact that the plaintiff was a Parkview patient. Her texts included information from the patient chart that the employee had created from the patient’s information sheet, such as the patient’s name, her position as a dispatcher, and the underlying reasons for the plaintiff’s visit to the OB/GYN. Even though such information was not included on the chart, the employee also texted that the plaintiff was HIV-positive and had had more than fifty sexual partners. The husband’s sister, while using his phone, saw the texts. The sister then reported the texts to Parkview. Upon receipt of the sister’s report, Parkview initiated an investigation into the employee’s conduct and ultimately terminated the employee. As part of that investigation, Parkview notified the plaintiff of the disclosure of her protected health information.

The info is here.

Monday, April 13, 2020

Lawmakers Push Again for Info on Google Collecting Patient Data

Rob Copeland
Wall Street Journal
Originally published 3 March 20

A bipartisan trio of U.S. senators pushed again for answers on Google’s controversial “Project Nightingale,” saying the search giant evaded requests for details on its far-reaching data tie-up with health giant Ascension.

The senators, in a letter Monday to St. Louis-based Ascension, said they were put off by the lack of substantive disclosure around the effort.

Project Nightingale was revealed in November in a series of Wall Street Journal articles that described Google’s then-secret engagement to collect and crunch the personal health information of millions of patients across 21 states.

Sens. Richard Blumenthal (D., Conn.), Bill Cassidy (R., La.), and Elizabeth Warren (D., Mass.) subsequently wrote to the Alphabet Inc. unit seeking basic information about the program, including the number of patients involved, the data shared, and who at Google had access.

The head of Google Health, Dr. David Feinberg, responded with a letter in December that largely stuck to generalities, according to correspondence reviewed by the Journal.

(cut)

Ascension earlier this year fired an employee who had reached out to media, lawmakers and regulators with concerns about Project Nightingale, a person familiar with the matter said. 

The employee, who described himself as a whistleblower, was told by Ascension higher-ups that he had shared information about the initiative that was intended to be secret, the person said.

Nick Ragone, a spokesman for Ascension—one of the U.S.’s largest health-care systems with 2,600 hospitals, doctors’ offices and other facilities—declined to say why the employee in question was fired. 

Thursday, February 20, 2020

Sharing Patient Data Without Exploiting Patients

McCoy MS, Joffe S, Emanuel EJ.
JAMA. Published online January 16, 2020.
doi:10.1001/jama.2019.22354

Here is an excerpt:

The Risks of Data Sharing

When health systems share patient data, the primary risk to patients is the exposure of their personal health information, which can result in a range of harms including embarrassment, stigma, and discrimination. Such exposure is most obvious when health systems fail to remove identifying information before sharing data, as is alleged in the lawsuit against Google and the University of Chicago. But even when shared data are fully deidentified in accordance with the requirements of the Health Insurance Portability and Accountability Act, reidentification is possible, especially when patient data are linked with other data sets. Indeed, even new data privacy laws such as Europe's General Data Protection Regulation and California's Consumer Privacy Act do not eliminate reidentification risk.

Companies that acquire patient data also accept risk by investing in research and development that may not result in marketable products. This risk is less ethically concerning, however, than that borne by patients. While companies usually can abandon unpromising ventures, patients’ lack of control over data-sharing arrangements makes them vulnerable to exploitation. Patients lack control, first, because they may have no option other than to seek care in a health system that plans to share their data. Second, even if patients are able to authorize sharing of their data, they are rarely given the information and opportunity to ask questions needed to give meaningful informed consent to future uses of their data.

Thus, for the foreseeable future, data sharing will entail ethically concerning risks to patients whose data are shared. But whether these exchanges are exploitative depends on how much benefit patients receive from data sharing.

The info is here.

Tuesday, September 10, 2019

Physicians Talking With Their Partners About Patients

Morris NP, & Eshel N.
JAMA. Published online August 16, 2019.
doi:10.1001/jama.2019.12293

Maintaining patient privacy is a fundamental responsibility for physicians. However, physicians often share their lives with partners or spouses. A 2018 survey of 15,069 physicians found that 85% were currently married or living with a partner, and when physicians come home from work, their partners might reasonably ask about their day. Physicians are supposed to keep patient information private in almost all circumstances, but are these realistic expectations for physicians and their partners? Might this expectation preclude potential benefits of these conversations?

In many cases, physician disclosure of clinical information to partners may violate patients’ trust. Patient privacy is so integral to the physician role that the Hippocratic oath notes, “And whatsoever I shall see or hear in the course of my profession...if it be what should not be published abroad, I will never divulge, holding such things to be holy secrets.” Whether over routine health care matters, such as blood pressure measurements, or potentially sensitive topics, such as end-of-life decisions, concerns of abuse, or substance use, patients expect their interactions with physicians to be kept in the strictest confidence. No hospital or clinic provides patients with the disclaimer, “Your private health information may be shared over the dinner table.” If a patient learned that his physician shared information about his medical encounters without permission, the patient may be far less likely to trust the physician or participate in ongoing care.

Physicians who share details with their partners about patients may not anticipate the effects of doing so. For instance, a physician’s partner could recognize the patient being discussed, whether from social connections or media coverage. After sharing patient information, physicians lose control of this information, and their partners, who may have less training about medical privacy, could unintentionally reveal sensitive patient information during future conversations.

The info is here.

Thursday, May 16, 2019

It’s Our ‘Moral Responsibility’ to Give The FBI Access to Your DNA

Jennings Brown
www.gizmodo.com
Originally published April 3, 2019

A popular DNA-testing company seems to be targeting true crime fans with a new pitch to let them share their genetic information with law enforcement so cops can catch violent criminals.

Two months ago, FamilyTreeDNA raised privacy concerns after BuzzFeed revealed the company had partnered with the FBI and given the agency access to the genealogy database. Law enforcement’s use of DNA databases has been widely known since last April when California officials revealed genealogy website information was instrumental in determining the identity of the Golden State Killer. But in that case, detectives used publicly shared raw genetic data on GEDmatch. The recent news about FamilyTreeDNA marked the first known time a home DNA test company had willingly shared private genetic information with law enforcement.

Several weeks later, FamilyTreeDNA changed their rules to allow customers to block the FBI from accessing their information. “Users now have the ability to opt out of matching with DNA relatives whose accounts are flagged as being created to identify the remains of a deceased individual or a perpetrator of a homicide or sexual assault,” the company said in a statement at the time.

But now the company seems to be embracing this partnership with law enforcement through its new campaign, “Families Want Answers.”

The info is here.

Monday, May 6, 2019

Ethical Considerations Regarding Internet Searches for Patient Information.

Charles C. Dike, Philip Candilis, Barbara Kocsis, and others
Psychiatric Services
Published Online:17 Jan 2019

Abstract

In 2010, the American Medical Association developed policies regarding professionalism in the use of social media, but it did not present specific ethical guidelines on targeted Internet searches for information about a patient or the patient’s family members. The American Psychiatric Association (APA) provided some guidance in 2016 through the Opinions of the Ethics Committee, but published opinions are limited. On behalf of the APA Ethics Committee, the authors developed a resource document describing ethical considerations regarding Internet and social media searches for patient information, from which this article has been adapted. Recommendations include the following. Except in emergencies, it is advisable to obtain a patient’s informed consent before performing such a search. The psychiatrist should be aware of his or her motivations for performing a search and should avoid doing so unless it serves the patient’s best interests. Information obtained through such searches should be handled with sensitivity regarding the patient’s privacy. The psychiatrist should consider how the search might influence the clinician-patient relationship. When interpreted with caution, Internet- and social media–based information may be appropriate to consider in forensic evaluations.

The info is here.

Monday, April 15, 2019

Death by a Thousand Clicks: Where Electronic Health Records Went Wrong

Erika Fry and Fred Schulte
Fortune.com
Originally posted on March 18, 2019

Here is an excerpt:

Damning evidence came from a whistleblower claim filed in 2011 against the company. Brendan Delaney, a British cop turned EHR expert, was hired in 2010 by New York City to work on the eCW implementation at Rikers Island, a jail complex that then had more than 100,000 inmates. But soon after he was hired, Delaney noticed scores of troubling problems with the system, which became the basis for his lawsuit. The patient medication lists weren’t reliable; prescribed drugs would not show up, while discontinued drugs would appear as current, according to the complaint. The EHR would sometimes display one patient’s medication profile accompanied by the physician’s note for a different patient, making it easy to misdiagnose or prescribe a drug to the wrong individual. Prescriptions, some 30,000 of them in 2010, lacked proper start and stop dates, introducing the opportunity for under- or overmedication. The eCW system did not reliably track lab results, concluded Delaney, who tallied 1,884 tests for which they had never gotten outcomes.

(cut)

Electronic health records were supposed to do a lot: make medicine safer, bring higher-quality care, empower patients, and yes, even save money. Boosters heralded an age when researchers could harness the big data within to reveal the most effective treatments for disease and sharply reduce medical errors. Patients, in turn, would have truly portable health records, being able to share their medical histories in a flash with doctors and hospitals anywhere in the country—essential when life-and-death decisions are being made in the ER.

But 10 years after President Barack Obama signed a law to accelerate the digitization of medical records—with the federal government, so far, sinking $36 billion into the effort—America has little to show for its investment.

The info is here.

Wednesday, April 10, 2019

FDA Chief Scott Gottlieb Calls for Tighter Regulations on Electronic Health Records

Fred Schulte and Erika Fry
Fortune.com
Originally posted March 21, 2019

Food and Drug Administration Commissioner Scott Gottlieb on Wednesday called for tighter scrutiny of electronic health records systems, which have prompted thousands of reports of patient injuries and other safety problems over the past decade.

“What we really need is a much more tailored approach, so that we have appropriate oversight of EHRs when they’re doing things that could create risk for patients,” Gottlieb said in an interview with Kaiser Health News.

Gottlieb was responding to “Botched Operation,” a report published this week by KHN and Fortune. The investigation found that the federal government has spent more than $36 billion over the past 10 years to switch doctors and hospitals from paper to digital records systems. In that time, thousands of reports of deaths, injuries, and near misses linked to EHRs have piled up in databases—including at least one run by the FDA.

The info is here.

Tuesday, April 2, 2019

Former Patient Coordinator Pleads Guilty to Wrongfully Disclosing Health Information to Cause Harm

Department of Justice
U.S. Attorney’s Office
Western District of Pennsylvania
Originally posted March 6, 2019

A resident of Butler, Pennsylvania, pleaded guilty in federal court to a charge of wrongfully disclosing the health information of another individual, United States Attorney Scott W. Brady announced today.

Linda Sue Kalina, 61, pleaded guilty to one count before United States District Judge Arthur J. Schwab.

In connection with the guilty plea, the court was advised that Linda Sue Kalina worked, from March 7, 2016 through June 23, 2017, as a Patient Information Coordinator with UPMC and its affiliate, Tri Rivers Musculoskeletal Centers (TRMC) in Mars, Pennsylvania, and that during her employment, contrary to the requirements of the Health Insurance Portability and Accountability Act (HIPAA), she improperly accessed the individual health information of 111 UPMC patients who had never been provided services at TRMC. Specifically, on August 11, 2017, Kalina unlawfully disclosed personal gynecological health information related to two such patients, with the intent to cause those individuals embarrassment and mental distress.

Judge Schwab scheduled sentencing for June 25, 2019, at 10 a.m. The law provides for a total sentence of 10 years in prison, a fine of $250,000, or both. Under the Federal Sentencing Guidelines, the actual sentence imposed is based upon the seriousness of the offense and the prior criminal history, if any, of the defendant. Kalina remains on bond pending the sentencing hearing.

Assistant United States Attorney Carolyn J. Bloch is prosecuting this case on behalf of the government.

The Federal Bureau of Investigation conducted the investigation that led to the prosecution of Kalina.

Thursday, January 31, 2019

HHS issues voluntary guidelines amid rise of cyberattacks

Samantha Liss
www.healthcaredive.com
Originally published January 2, 2019

Dive Brief:

  • To combat security threats in the health sector, HHS issued a voluminous report that details ways small, local clinics and large hospital systems alike can reduce their cybersecurity risks. The guidelines are voluntary, so providers will not be required to adopt the practices identified in the report. 
  • The four-volume report is the culmination of work by a task force, convened in May 2017, that worked to identify the five most common threats in the industry and 10 ways to prepare against those threats.  
  • The five most common threats are email phishing attacks, ransomware attacks, loss or theft of equipment or data, accidental or intentional data loss by an insider, and attacks against connected medical devices.

Thursday, January 17, 2019

Neuroethics Guiding Principles for the NIH BRAIN Initiative

Henry T. Greely, Christine Grady, Khara M. Ramos, Winston Chiong and others
Journal of Neuroscience 12 December 2018, 38 (50) 10586-10588
DOI: https://doi.org/10.1523/JNEUROSCI.2077-18.2018

Introduction

Neuroscience presents important neuroethical considerations. Human neuroscience demands focused application of the core research ethics guidelines set out in documents such as the Belmont Report. Various mechanisms, including institutional review boards (IRBs), privacy rules, and the Food and Drug Administration, regulate many aspects of neuroscience research, and many articles, books, workshops, and conferences address neuroethics (Farah, 2010). However, responsible neuroscience research requires continual dialogue among neuroscience researchers, ethicists, philosophers, lawyers, and other stakeholders to help assess its ethical, legal, and societal implications. The Neuroethics Working Group of the National Institutes of Health (NIH) Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, a group of experts providing neuroethics input to the NIH BRAIN Initiative Multi-Council Working Group, seeks to promote this dialogue by proposing the following Neuroethics Guiding Principles (Table 1).

Wednesday, January 16, 2019

What Is the Right to Privacy?

Andrei Marmor
(2015) Philosophy & Public Affairs, 43, 1, pp 3-26

The right to privacy is a curious kind of right. Most people think that we have a general right to privacy. But when you look at the kind of issues that lawyers and philosophers label as concerns about privacy, you see widely differing views about the scope of the right and the kind of cases that fall under its purview. Consequently, it has become difficult to articulate the underlying interest that the right to privacy is there to protect—so much so that some philosophers have come to doubt that there is any underlying interest protected by it. According to Judith Thomson, for example, privacy is a cluster of derivative rights, some of them derived from rights to own or use your property, others from the right to your person or your right to decide what to do with your body, and so on. Thomson’s position starts from a sound observation, and I will begin by explaining why. The conclusion I will reach, however, is very different. I will argue that there is a general right to privacy grounded in people’s interest in having a reasonable measure of control over the ways in which they can present themselves (and what is theirs) to others. I will strive to show that this underlying interest justifies the right to privacy and explains its proper scope, though the scope of the right might be narrower, and fuzzier in its boundaries, than is commonly understood.

The info is here.

Saturday, December 29, 2018

Woman who inherited fatal illness to sue doctors in groundbreaking case

Robin McKie
The Guardian
Originally published November 25, 2018

Lawyers are bringing a case against a London hospital trust that could trigger major changes to the rules governing patient confidentiality. The case involves a woman who is suing doctors because they failed to tell her about her father’s fatal hereditary disease before she had her own child.

The woman discovered – after giving birth – that her father carried the gene for Huntington’s disease, a degenerative, incurable brain condition. Later she found out she had inherited the gene and that her own daughter, now eight, has a 50% chance of having it.

The woman – who cannot be named for legal reasons – says she would have had an abortion had she known about her father’s condition, and is suing the doctors who failed to tell her about the risks she and her child faced. It is the first case in English law to deal with a relative’s claim over issues of genetic responsibility.

“This could really change the way we do medicine, because it is about the duty that doctors have to share genetic test results with relatives and whether the duty exists in law,” said Anna Middleton, head of society and ethics research at the Wellcome Genome Campus in Cambridge.

The info is here.

Friday, October 19, 2018

Risk Management Considerations When Treating Violent Patients

Kristen Lambert
Psychiatric News
Originally posted September 4, 2018

Here is an excerpt:

When a patient has a history of expressing homicidal ideation or has been violent previously, you should document, in every subsequent session, whether the patient admits or denies homicidal ideation. When the patient expresses homicidal ideation, document what he/she expressed and the steps you did or did not take in response and why. Should an incident occur, your documentation will play an important role in defending your actions.

Despite taking precautions, your patient may still commit a violent act. The following are some strategies that may minimize your risk.

  • Conduct complete timely/thorough risk assessments.
  • Document, including the reasons for taking and not taking certain actions.
  • Understand your state’s law on duty to warn. Be aware of the language in the law on whether you have a mandatory, permissive, or no duty to warn/protect.
  • Understand your state’s laws regarding civil commitment.
  • Understand your state’s laws regarding disclosure of confidential information and when you can do so.
  • Understand your state’s laws regarding discussing firearms ownership and/or possession with patients.
  • If you have questions, consult an attorney or risk management professional.