Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Saturday, January 25, 2020

Psychologist Who Waterboarded for C.I.A. to Testify at Guantánamo

Carol Rosenberg
The New York Times
Originally posted 20 Jan 20

Here is an excerpt:

Mr. Mohammed’s co-defendants were subjected to violence, sleep deprivation, dietary manipulation and rectal abuse in the prison network from 2002, when the first of them, Ramzi bin al-Shibh, was captured, to 2006, when all five were transferred to the prison at Guantánamo Bay. They will also be present in the courtroom.

In the black sites, the defendants were kept in solitary confinement, often nude, at times confined to a cramped box in the fetal position, hung by their wrists in painful positions and slammed head first into walls. Those techniques, approved by George W. Bush administration lawyers, were part of a desperate effort to force them to divulge Al Qaeda’s secrets — like the location of Osama bin Laden and whether there were terrorist sleeper cells deployed to carry out more attacks.

A subsequent internal study by the C.I.A. found that proponents had inflated the intelligence value of those interrogations.

The psychologists were called by lawyers to testify for one of the defendants, Mr. Mohammed’s nephew, Ammar al-Baluchi. All five defense teams are expected to question them about policy and about graphic details of conditions in the clandestine overseas prisons, including one in Thailand that for a time was run by Gina Haspel, now the C.I.A. director.

Mr. al-Baluchi’s lawyer, James G. Connell III, is spearheading an effort to persuade the judge to exclude from the trial the testimony of F.B.I. agents who questioned the defendants at Guantánamo in 2007. It was just months after their transfer there from years in C.I.A. prisons, and the defense lawyers argue that, although there was no overt violence during the F.B.I. interrogations, the defendants were so thoroughly broken in the black sites that they were powerless to do anything but tell the F.B.I. agents what they wanted to hear.

By law, prosecutors can use only voluntary confessions at the military commissions at Guantánamo.

The info is here.

Friday, January 24, 2020

Psychology accused of ‘collective self-deception’ over results

Jack Grove
The Times Higher Education
Originally published 10 Dec 19

Here is an excerpt:

If psychologists are serious about doing research that could make “useful real-world predictions”, rather than conducting highly contextualised studies, they should use “much larger and more complex datasets, experimental designs and statistical models”, Dr Yarkoni advises.

He also suggests that the “sweeping claims” made by many papers bear little relation to their results, maintaining that a “huge proportion of the quantitative inferences drawn in the published psychology literature are so inductively weak as to be at best questionable and at worst utterly insensible”.

Many psychologists were indulging in a “collective self-deception” and should start “acknowledging the fundamentally qualitative nature of their work”, he says, stating that “a good deal of what currently passes for empirical psychology is already best understood as insightful qualitative analysis dressed up as shoddy quantitative science”.

That would mean no longer including “scientific-looking inferential statistics” in papers; the appearance of such statistics could be considered an “elaborate rhetorical ruse used to mathematicise people into believing claims they would otherwise find logically unsound”.

The info is here.

How One Person Can Change the Conscience of an Organization

Nicholas W. Eyrich, Robert E. Quinn, and David P. Fessell
Harvard Business Review
Originally published 27 Dec 19

Here is an excerpt:

A single person with a clarity of conscience and a willingness to speak up can make a difference. Contributing to the greater good is a deep and fundamental human need. When a leader, even a mid-level or lower level leader, skillfully brings a voice and a vision, others will follow and surprising things can happen—even culture change on a large scale. While Yamada did not set out to change a culture, his actions were catalytic and galvanized the organization. As news of the new “not for profit” focus of Tres Cantos spread, many of GSK’s top scientists volunteered to work there. Yamada’s voice spoke for many others, offering a clear path and a vision for a more positive future for all.

The info is here.

Thursday, January 23, 2020

Colleges want freshmen to use mental health apps. But are they risking students’ privacy?

Deanna Paul
The Washington Post
Originally posted 2 Jan 20

Here are two excerpts:

TAO Connect is just one of dozens of mental health apps permeating college campuses in recent years. In addition to increasing the bandwidth of college counseling centers, the apps offer information and resources on mental health issues and wellness. But as student demand for mental health services grows, and more colleges turn to digital platforms, experts say universities must begin to consider their role as stewards of sensitive student information and the consequences of encouraging or mandating these technologies.

The rise in student wellness applications arrives as mental health problems among college students have dramatically increased. Three out of 5 U.S. college students experience overwhelming anxiety, and 2 in 5 students reported debilitating depression, according to a 2018 survey from the American College Health Association.

Even so, only about 15 percent of undergraduates seek help at a university counseling center. These apps have begun to fill students’ needs by providing ongoing access to traditional mental health services without barriers such as counselor availability or stigma.

(cut)

“If someone wants help, they don’t care how they get that help,” said Lynn E. Linde, chief knowledge and learning officer for the American Counseling Association. “They aren’t looking at whether this person is adequately credentialed and are they protecting my rights. They just want help immediately.”

Yet she worried that students may be giving up more information than they realize and about the level of coercion a school can exert by requiring students to accept terms of service they otherwise wouldn’t agree to.

“Millennials understand that with the use of their apps they’re giving up privacy rights. They don’t think to question it,” Linde said.

The info is here.

You Are Already Having Sex With Robots

Emma Grey Ellis
wired.com
Originally published 23 Aug 19

Here are two excerpts:

Carnegie Mellon roboticist Hans Moravec has written about emotions as devices for channeling behavior in helpful ways—for example, sexuality prompting procreation. He concluded that artificial intelligences, in seeking to please humanity, are likely to be highly emotional. By this definition, if you encoded an artificial intelligence with the need to please humanity sexually, their urgency to follow their programming constitutes sexual feelings. Feelings as real and valid as our own. Feelings that lead to the thing that feelings, probably, evolved to lead to: sex. One gets the sense that, for some digisexual people, removing the squishiness of the in-between stuff—the jealousy and hurt and betrayal and exploitation—improves their sexual enjoyment. No complications. The robot as ultimate partner. An outcome of evolution.

So the sexbotcalypse will come. It's not scary, it's just weird, and it's being motivated by millennia-old bad habits. Laziness, yes, but also something else. “I don’t see anything that suggests we’re going to buck stereotypes,” says Charles Ess, who studies virtue ethics and social robots at the University of Oslo. “People aren’t doing this out of the goodness of their hearts. They’re doing this to make money.”

(cut)

Technologizing sexual relationships will also fill one of the last blank spots in tech’s knowledge of (ad-targetable) human habits. Brianna Rader—founder of Juicebox, progenitor of Slutbot—has spoken about how difficult it is to do market research on sex. If having sex with robots or other forms of sex tech becomes commonplace, it wouldn’t be difficult anymore. “We have an interesting relationship with privacy in the US,” Kaufman says. “We’re willing to trade a lot of our privacy and information away for pleasures less complicated than an intimate relationship.”

The info is here.

Wednesday, January 22, 2020

Association Between Physician Depressive Symptoms and Medical Errors

Pereira-Lima K, Mata DA, et al.
JAMA Netw Open. 2019; 2(11):e1916097

Abstract

Importance  Depression is highly prevalent among physicians and has been associated with increased risk of medical errors. However, questions regarding the magnitude and temporal direction of these associations remain open in recent literature.

Objective  To provide summary relative risk (RR) estimates for the associations between physician depressive symptoms and medical errors.

Conclusions and Relevance  Results of this study suggest that physicians with a positive screening for depressive symptoms are at higher risk for medical errors. Further research is needed to evaluate whether interventions to reduce physician depressive symptoms could play a role in mitigating medical errors and thus improving physician well-being and patient care.
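
A brief aside, not from the paper itself: relative risk compares the probability of the outcome (here, a medical error) between exposed and unexposed groups, in this case physicians who screen positive for depressive symptoms versus those who do not. A minimal definition in standard notation:

\[
\mathrm{RR} \;=\; \frac{P(\text{medical error} \mid \text{positive screen for depressive symptoms})}{P(\text{medical error} \mid \text{negative screen})}
\]

An RR above 1 indicates elevated risk in the exposed group; the “summary” RR reported in a meta-analysis like this one pools the study-level estimates, typically weighting each study by its precision.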

From the Discussion

Studies have recommended the addition of physician well-being to the Triple Aim of enhancing the patient experience of care, improving the health of populations, and reducing the per capita cost of health care. Results of the present study endorse the Quadruple Aim movement by demonstrating not only that medical errors are associated with physician health but also that physician depressive symptoms are associated with subsequent errors. Given that few physicians with depression seek treatment and that recent evidence has pointed to the lack of organizational interventions aimed at reducing physician depressive symptoms, our findings underscore the need for institutional policies to remove barriers to the delivery of evidence-based treatment to physicians with depression.

https://doi.org/10.1001/jamanetworkopen.2019.16097

‘The Algorithm Made Me Do It’: Artificial Intelligence Ethics Is Still On Shaky Ground

Joe McKendrick
Forbes.com
Originally published 22 Dec 19

Here is an excerpt:

Inevitably, “there will be lawsuits that require you to reveal the human decisions behind the design of your AI systems, what ethical and social concerns you took into account, the origins and methods by which you procured your training data, and how well you monitored the results of those systems for traces of bias or discrimination,” warns Mike Walsh, CEO of Tomorrow, and author of The Algorithmic Leader: How to Be Smart When Machines Are Smarter Than You, in a recent Harvard Business Review article. “At the very least, trust the algorithmic processes at the heart of your business. Simply arguing that your AI platform was a black box that no one understood is unlikely to be a successful legal defense in the 21st century. It will be about as convincing as ‘the algorithm made me do it.’”

It’s more than legal considerations that should drive new thinking about AI ethics. It’s about “maintaining trust between organizations and the people they serve, whether clients, partners, employees, or the general public,” a recent report out of Accenture maintains. The report’s authors, Ronald Sandler and John Basl, both with Northeastern University’s philosophy department, and Steven Tiell of Accenture, state that a well-organized data ethics capacity can help organizations manage risks and liabilities associated with such data misuse and negligence.

“It can also help organizations clarify and make actionable mission and organizational values, such as responsibilities to and respect for the people and communities they serve,” Sandler and his co-authors advocate. A data ethics capability also offers organizations “a path to address the transformational power of data-driven AI and machine learning decision-making in an anticipatory way, allowing for proactive responsible development and use that can help organizations shape good governance, rather than inviting strict oversight.”

The info is here.

Tuesday, January 21, 2020

How Could Commercial Terms of Use and Privacy Policies Undermine Informed Consent in the Age of Mobile Health?

AMA J Ethics. 2018;20(9):E864-872.
doi: 10.1001/amajethics.2018.864.

Abstract

Granular personal data generated by mobile health (mHealth) technologies coupled with the complexity of mHealth systems creates risks to privacy that are difficult to foresee, understand, and communicate, especially for purposes of informed consent. Moreover, commercial terms of use, to which users are almost always required to agree, depart significantly from standards of informed consent. As data use scandals increasingly surface in the news, the field of mHealth must advocate for user-centered privacy and informed consent practices that motivate patients’ and research participants’ trust. We review the challenges and relevance of informed consent and discuss opportunities for creating new standards for user-centered informed consent processes in the age of mHealth.

The info is here.

10 Years Ago, DNA Tests Were The Future Of Medicine. Now They’re A Social Network — And A Data Privacy Mess

Peter Aldhous
buzzfeednews.com
Originally posted 11 Dec 19

Here is an excerpt:

But DNA testing can reveal uncomfortable truths, too. Families have been torn apart by the discovery that the man they call “Dad” is not the biological father of his children. Home DNA tests can also be used to show that a relative is a rapist or a killer.

That possibility burst into the public consciousness in April 2018, with the arrest of Joseph James DeAngelo, alleged to be the Golden State Killer responsible for at least 13 killings and more than 50 rapes in the 1970s and 1980s. DeAngelo was finally tracked down after DNA left at the scene of a 1980 double murder was matched to people in GEDmatch who were the killer's third or fourth cousins. Through months of painstaking work, investigators working with the genealogist Barbara Rae-Venter built family trees that converged on DeAngelo.
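
As a technical aside, the investigative step described above amounts to ranking database matches by how much autosomal DNA they share with the crime-scene profile (measured in centimorgans, cM) and then building out family trees for the closest of those distant cousins. The sketch below is a toy illustration only, not GEDmatch’s actual code; the kit IDs are invented and the cM cutoffs are rough, illustrative averages.

# Toy sketch of the first step in investigative genetic genealogy:
# rank database matches by shared autosomal DNA and attach a very rough
# relationship label. Cutoffs are illustrative only; real shared-cM
# ranges for different relationships overlap heavily.
from dataclasses import dataclass

@dataclass
class Match:
    kit_id: str        # hypothetical database kit identifier
    shared_cm: float   # autosomal DNA shared with the unknown profile, in cM

# (threshold in cM, rough label): approximate values for illustration only
APPROX_BANDS = [
    (200.0, "second cousin or closer"),
    (60.0, "around third cousin"),
    (20.0, "around fourth cousin"),
]

def approximate_relationship(shared_cm: float) -> str:
    """Map shared cM to a coarse relationship label."""
    for threshold, label in APPROX_BANDS:
        if shared_cm >= threshold:
            return label
    return "probably too distant to be useful"

def rank_matches(matches: list[Match]) -> list[tuple[Match, str]]:
    """Closest matches first; these are the family trees a genealogist would build out."""
    ranked = sorted(matches, key=lambda m: m.shared_cm, reverse=True)
    return [(m, approximate_relationship(m.shared_cm)) for m in ranked]

if __name__ == "__main__":
    # Entirely invented example matches.
    candidates = [Match("A123", 95.0), Match("B456", 28.0), Match("C789", 9.0)]
    for m, label in rank_matches(candidates):
        print(f"{m.kit_id}: {m.shared_cm:.0f} cM shared -> {label}")

In the actual case, of course, the ranking was the easy part; the months of painstaking tree-building that followed did the real work.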

Genealogists had long realized that databases like GEDmatch could be used in this way, but had been wary of working with law enforcement — fearing that DNA test customers would object to the idea of cops searching their DNA profiles and rummaging around in their family trees.

But the Golden State Killer’s crimes were so heinous that the anticipated backlash initially failed to materialize. Indeed, a May 2018 survey of more than 1,500 US adults found that 80% backed police using public genealogy databases to solve violent crimes.

“I was very surprised with the Golden State Killer case how positive the reaction was across the board,” CeCe Moore, a genealogist known for her appearances on TV, told BuzzFeed News a couple of months after DeAngelo’s arrest.

The info is here.