Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Showing posts with label Terms of Service.

Thursday, May 16, 2019

It’s Our ‘Moral Responsibility’ to Give The FBI Access to Your DNA

Jennings Brown
www.gizmodo.com
Originally published April 3, 2019

A popular DNA-testing company seems to be targeting true crime fans with a new pitch to let them share their genetic information with law enforcement so cops can catch violent criminals.

Two months ago, FamilyTreeDNA raised privacy concerns after BuzzFeed revealed the company had partnered with the FBI and given the agency access to its genealogy database. Law enforcement’s use of DNA databases has been widely known since April 2018, when California officials revealed genealogy website information was instrumental in determining the identity of the Golden State Killer. But in that case, detectives used publicly shared raw genetic data on GEDmatch. The recent news about FamilyTreeDNA marked the first known time a home DNA test company had willingly shared private genetic information with law enforcement.

Several weeks later, FamilyTreeDNA changed its rules to allow customers to block the FBI from accessing their information. “Users now have the ability to opt out of matching with DNA relatives whose accounts are flagged as being created to identify the remains of a deceased individual or a perpetrator of a homicide or sexual assault,” the company said in a statement at the time.

But now the company seems to be embracing its partnership with law enforcement through a new campaign called “Families Want Answers.”
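
The opt-out described above is, at bottom, a filter applied to each user's match list. Below is a minimal sketch of how such a policy could be enforced; the types and field names (Account.law_enforcement_flag, User.opted_out) are hypothetical illustrations, not FamilyTreeDNA's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Account:
    """A DNA-testing account. `law_enforcement_flag` marks accounts created to
    identify the remains of a deceased individual or a violent-crime perpetrator."""
    account_id: str
    law_enforcement_flag: bool


@dataclass
class User:
    """A customer. `opted_out` means they declined matching with flagged accounts."""
    user_id: str
    opted_out: bool


def visible_matches(user: User, candidates: list[Account]) -> list[Account]:
    """Return the DNA-relative matches the user may see under the opt-out policy."""
    if not user.opted_out:
        return candidates
    return [m for m in candidates if not m.law_enforcement_flag]
```

Symmetrically, an opted-out user would also be hidden from the flagged accounts' own match lists.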

The info is here.

Wednesday, January 23, 2019

New tech doorbells can record video, and that's an ethics problem

Molly Wood
www.marketplace.org
Originally posted January 17, 2019

Here is an excerpt:

Ring is pretty clear in its terms and conditions that people are allowing Ring employees to access videos, not live streams, but cached videos. And that's in order to train that artificial intelligence to be better at recognizing neighbors, because they're trying to roll out a feature where they use facial recognition to match with the people that are considered safe. So if I have the Ring cameras, I can say, "All these are safe people. Here's pictures of my kids, my neighbors. If it's not one of these people, consider them unsafe." So that's a new technology. They need to be able to train their algorithms to recognize who's a person, what's a car, what's a cat. Some subset of the videos that are being uploaded just for typical usage are then being shared with their research team in Ukraine.
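
To make the "safe people" feature concrete, here is a minimal sketch of threshold-based matching of a visitor's face embedding against a homeowner's trusted list. This is a generic illustration of the technique, not Ring's actual system; the embedding model, function names, and threshold value are all assumptions.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def classify_visitor(visitor: np.ndarray,
                     trusted: list[np.ndarray],
                     threshold: float = 0.8) -> str:
    """Label a visitor 'safe' if their embedding is close enough to any face
    the homeowner marked as trusted, otherwise 'unsafe'."""
    best = max((cosine_similarity(visitor, t) for t in trusted), default=-1.0)
    return "safe" if best >= threshold else "unsafe"
```

Choosing that threshold, and learning embeddings that separate people from cars and cats in the first place, requires labeled video, which is why cached customer footage ends up in front of human reviewers and training pipelines.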

The info is here.

Wednesday, September 12, 2018

How Could Commercial Terms of Use and Privacy Policies Undermine Informed Consent in the Age of Mobile Health?

Cynthia E. Schairer, Caryn Kseniya Rubanovich, and Cinnamon S. Bloss
AMA J Ethics. 2018;20(9):E864-872.
doi: 10.1001/amajethics.2018.864.

Abstract

Granular personal data generated by mobile health (mHealth) technologies coupled with the complexity of mHealth systems creates risks to privacy that are difficult to foresee, understand, and communicate, especially for purposes of informed consent. Moreover, commercial terms of use, to which users are almost always required to agree, depart significantly from standards of informed consent. As data use scandals increasingly surface in the news, the field of mHealth must advocate for user-centered privacy and informed consent practices that motivate patients’ and research participants’ trust. We review the challenges and relevance of informed consent and discuss opportunities for creating new standards for user-centered informed consent processes in the age of mHealth.

The info is here.

Friday, September 7, 2018

23andMe's Pharma Deals Have Been the Plan All Along

Megan Molteni
www.wired.com
Originally posted August 3, 2018

Here is an excerpt:

So last week’s announcement that one of the world’s biggest drugmakers, GlaxoSmithKline, is gaining exclusive rights to mine 23andMe’s customer data for drug targets should come as no surprise. (Neither should GSK’s $300 million investment in the company.) 23andMe has been sharing insights gleaned from consented customer data with GSK and at least six other pharmaceutical and biotechnology firms for the past three and a half years. And offering access to customer information in the service of science has been 23andMe’s business plan all along, as WIRED noted when it first began covering the company more than a decade ago.

But some customers were still surprised and angry, unaware of what they had already signed (and spat) away. GSK will receive the same kind of data pharma partners have generally received—summary-level statistics that 23andMe scientists gather from analyses on de-identified, aggregate customer information—though it will have four years of exclusive rights to run analyses to discover new drug targets. Supporting this kind of translational work is why some customers signed up in the first place. But it’s clear the days of blind trust in the optimistic altruism of technology companies are coming to a close.

“I think we’re just operating now in a much more untrusting environment,” says Megan Allyse, a health policy researcher at the Mayo Clinic who studies emerging genetic technologies. “It’s no longer enough for companies to promise to make people healthy through the power of big data.”
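
For a concrete sense of what "summary-level statistics" drawn from "de-identified, aggregate customer information" can mean, here is a minimal sketch in which raw genotypes never leave the function and a partner sees only a distribution. The data shapes and names are hypothetical, not 23andMe's pipeline.

```python
from collections import Counter


def summary_statistics(genotypes: dict[str, str]) -> dict[str, float]:
    """Reduce de-identified per-customer genotypes at one genetic variant to
    aggregate frequencies; individual rows are never returned."""
    counts = Counter(genotypes.values())
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}


# A partner receives only the distribution, not any individual's record.
print(summary_statistics({"u1": "AA", "u2": "AG", "u3": "AG", "u4": "GG"}))
# {'AA': 0.25, 'AG': 0.5, 'GG': 0.25}
```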

The info is here.

Friday, August 31, 2018

What you may not know about online therapy companies

Pauline Wallin
The Practice Institute
Originally posted August 19, 2018

Here is an excerpt:

In summary, while platforms such as Talkspace and BetterHelp provide you with ready access to working with clients online, they also limit your control over your relationships with your clients and over how you work with them.

Before signing on with such platforms, read the terms of service thoroughly. Search online for lawsuits against the company you're considering working with, and read reviews that are not on the company's website.

Also, talk with the risk management consultant provided by your malpractice insurer, who can alert you to legal or ethical liabilities. For maximum legal protection, hire an attorney who specializes in mental health services to review the contract that you will be signing. The contract will most likely be geared to protecting the company, not you or your license.

The info is here.

Friday, March 23, 2018

Facebook Woes: Data Breach, Securities Fraud, or Something Else?

Matt Levine
Bloomberg.com
Originally posted March 21, 2018

Here is an excerpt:

But the result is always "securities fraud," whatever the nature of the underlying input. An undisclosed data breach is securities fraud, but an undisclosed sexual-harassment problem or chicken-mispricing conspiracy will get you to the same place. There is an important practical benefit to a legal regime that works like this: It makes it easy to punish bad behavior, at least by public companies, because every sort of bad behavior is also securities fraud. You don't have to prove that the underlying chicken-mispricing conspiracy was illegal, or that the data breach was due to bad security procedures. All you have to prove is that it happened, and it wasn't disclosed, and the stock went down when it was. The evaluation of the badness is in a sense outsourced to the market: We know that the behavior was illegal, not because there was a clear law against it, but because the stock went down. Securities law is an all-purpose tool for punishing corporate badness, a one-size-fits-all approach that makes all badness commensurable using the metric of stock price. It has a certain efficiency.

On the other hand it sometimes makes me a little uneasy that so much of our law ends up working this way. "In a world of dysfunctional government and pervasive financial capitalism," I once wrote, "more and more of our politics is contested in the form of securities regulation." And: "Our government's duty to its citizens is mediated by their ownership of our public companies." When you punish bad stuff because it is bad for shareholders, you are making a certain judgment about what sort of stuff is bad and who is entitled to be protected from it.

Anyway Facebook Inc. wants to make it very clear that it did not suffer a data breach. When a researcher got data about millions of Facebook users without those users' explicit permission, and when the researcher turned that data over to Cambridge Analytica for political targeting in violation of Facebook's terms, none of that was a data breach. Facebook wasn't hacked. What happened was somewhere between a contractual violation and ... you know ... just how Facebook works? There is some splitting of hairs over this, and you can understand why -- consider that SEC guidance about when companies have to disclose data breaches -- but in another sense it just doesn't matter. You don't need to know whether the thing was a "data breach" to know how bad it was. You can just look at the stock price. The stock went down...
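
Levine's test is simple enough to write down as a predicate. The toy model below sketches the column's rhetorical point (note that the content of the underlying bad behavior never enters the function); it is not a statement of securities law, and all names are invented.

```python
from dataclasses import dataclass


@dataclass
class Incident:
    happened: bool              # the bad thing occurred
    was_disclosed: bool         # the company told shareholders about it
    stock_fell_on_reveal: bool  # the price dropped when it came out


def looks_like_securities_fraud(i: Incident) -> bool:
    """Levine's three-part test: it happened, it wasn't disclosed, and the
    stock went down when it was revealed. What the 'it' is (data breach,
    harassment problem, chicken-mispricing conspiracy) never matters."""
    return i.happened and (not i.was_disclosed) and i.stock_fell_on_reveal


print(looks_like_securities_fraud(Incident(True, False, True)))  # True
```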

The article is here.