Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Tuesday, January 7, 2020

Can Artificial Intelligence Increase Our Morality?

Matthew Hutson
psychologytoday.com
Originally posted 9 Dec 19

Here is an excerpt:

For sure, designing technologies to encourage ethical behavior raises the question of which behaviors are ethical. Vallor noted that paternalism can preclude pluralism, but just to play devil’s advocate I raised the argument for pluralism up a level and noted that some people support paternalism. Most in the room were from WEIRD cultures—Western, educated, industrialized, rich, democratic—and so China’s social credit system feels Orwellian, but many in China don’t mind it.

The biggest question in my mind after Vallor’s talk was about the balance between self-cultivation and situation-shaping. Good behavior results from both character and context. To what degree should we focus on helping people develop a moral compass and fortitude, and to what degree should we focus on nudges and social platforms that make morality easy?

The two approaches can also interact in interesting ways. Occasionally extrinsic rewards crowd out intrinsic drives: If you earn points for good deeds, you come to expect them and don’t value goodness for its own sake. Sometimes, however, good deeds perform a self-signaling function, in which you see them as a sign of character. You then perform more good deeds to remain consistent. Induced cooperation might also act as a social scaffolding for bridges of trust that can later stand on their own. It could lead to new setpoints of collective behavior, self-sustaining habits of interaction.

The info is here.

AI Is Not Similar To Human Intelligence. Thinking So Could Be Dangerous

Elizabeth Fernandez
forbes.com
Originally posted 30 Nov 19

Here is an excerpt:

No doubt, these algorithms are powerful, but to think that they “think” and “learn” in the same way as humans would be incorrect, Watson says. There are many differences, and he outlines three.

First, DNNs are easy to fool. For example, imagine you have a picture of a banana. A neural network successfully classifies it as a banana. But it’s possible to create a generative adversarial network that can fool your DNN. By adding a slight amount of noise or another image besides the banana, your DNN might now think the picture of a banana is a toaster. A human could not be fooled by such a trick. Some argue that this is because DNNs can see things humans can’t, but Watson says, “This disconnect between biological and artificial neural networks suggests that the latter lack some crucial component essential to navigating the real world.”
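To make the fragility concrete, here is a minimal sketch of how such a perturbation works. It uses a made-up linear "classifier" in place of a real DNN, and a hypothetical banana-vs.-toaster decision rule, but the mechanism is the same one behind real adversarial examples: nudge every pixel a tiny amount in the direction that most changes the model's score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a DNN: a linear scorer over a 100-"pixel" image.
# Positive score -> "banana", non-positive -> "toaster".
w = rng.normal(size=100)              # made-up model weights
x = rng.uniform(0.0, 1.0, size=100)   # the original image, pixels in [0, 1]
b = w @ x - 0.5                       # calibrated so x scores +0.5: "banana"

def classify(img):
    return "banana" if w @ img - b > 0 else "toaster"

# Adversarial nudge: move every pixel by at most eps against the gradient.
# For a linear scorer, the gradient of the score w.r.t. the input is just w.
eps = 0.01
x_adv = x - eps * np.sign(w)

print(classify(x))      # the original image: "banana"
print(classify(x_adv))  # nearly identical image: "toaster"
```

No single pixel moves by more than 1% of its range, yet the many tiny nudges add up and flip the decision. A human looking at the two images would see no difference at all.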

Secondly, DNNs need an enormous amount of data to learn. An image classification DNN might need to “see” thousands of pictures of zebras to identify a zebra in an image. Give the same test to a toddler, and chances are s/he could identify a zebra, even one that’s partially obscured, by only seeing a picture of a zebra a few times. Humans are great “one-shot learners,” says Watson. Teaching a neural network, on the other hand, might be very difficult, especially in instances where data is hard to come by.

Thirdly, neural nets are “myopic”. They can see the trees, so to speak, but not the forest. For example, a DNN could successfully label a picture of Kim Kardashian as a woman, an entertainer, and a starlet. However, switching the position of her mouth and one of her eyes actually improved the confidence of the DNN’s prediction. The DNN didn’t see anything wrong with that image. Obviously, something is wrong here. Another example: a human can say “that cloud looks like a dog”, whereas a DNN would say that the cloud is a dog.

The info is here.

Monday, January 6, 2020

The Majority Does Not Determine Morality

Michael Brown
Townhall.com
Originally posted 9 Dec 19

Here is an excerpt:

During the time period from 2003 to 2017, support for polygamy in America rose from 7 percent to 17 percent, an even more dramatic shift from a statistical point of view. And it’s up to 18 percent in 2019.

Gallup noted that this “may simply be the result of the broader leftward shift on moral issues Americans have exhibited in recent years. Or, as conservative columnist Ross Douthat notes in his New York Times blog, ‘Polygamy is bobbing forward in social liberalism's wake ...’ To Douthat and other social conservatives, warming attitudes toward polygamy is a logical consequence of changing social norms -- that values underpinning social liberalism offer ‘no compelling grounds for limiting the number of people who might wish to marry.’”

Gallup also observed that, “It is certainly true that moral perceptions have significantly, fundamentally changed on a number of social issues or behaviors since 2001 -- most notably, gay/lesbian relations, having a baby outside of wedlock, sex between unmarried men and women, and divorce.”

Interestingly, Gallup also noted that there were social reasons that help to explain some of this larger leftward shift (including the rise in divorce and changes in laws; another obvious reason is that people have friends and family members who identify as gay or lesbian).

The info is here.

Pa. prison psychologist loses license after 3 ‘preventable and foreseeable’ suicides

Samantha Melamed
inquirer.com
Originally posted 4 Dec 19

Nearly a decade after a 1½-year stretch during which three prisoners at State Correctional Institution Cresson died by suicide and 17 others attempted it, the Pennsylvania Board of Psychology has revoked the license of the psychologist then in charge at the now-shuttered prison in Cambria County and imposed $17,233 in investigation costs.

An order filed Tuesday said the suicides were foreseeable and preventable and castigated the psychologist, James Harrington, for abdicating his ethical responsibility to intervene when mentally ill prisoners were kept in inhumane conditions — including solitary confinement — and were prevented from leaving their cells for treatment.

Harrington still holds an administrative position with the Department of Corrections, with an annual salary of $107,052.

The info is here.

Sunday, January 5, 2020

The Big Change Coming to Just About Every Website on New Year’s Day

Aaron Mak
Slate.com
Originally published 30 Dec 19

Starting New Year’s Day, you may notice a small but momentous change to the websites you visit: a button or link, probably at the bottom of the page, reading “Do Not Sell My Personal Information.”

The change is one of many going into effect Jan. 1, 2020, thanks to a sweeping new data privacy law known as the California Consumer Privacy Act. The California law essentially empowers consumers to access the personal data that companies have collected on them, to demand that it be deleted, and to prevent it from being sold to third parties. Since it’s a lot more work to create a separate infrastructure just for California residents to opt out of the data collection industry, these requirements will transform the internet for everyone.

Ahead of the January deadline, tech companies are scrambling to update their privacy policies and figure out how to comply with the complex requirements. The CCPA will only apply to businesses that earn more than $25 million in gross revenue, that collect data on more than 50,000 people, or for which selling consumer data accounts for more than 50 percent of revenue. The companies that meet these qualifications are expected to collectively spend a total of $55 billion upfront to meet the new standards, in addition to $16 billion over the next decade.

Major tech firms have already added a number of user features over the past few months in preparation. In early December, Twitter rolled out a privacy center where users can learn more about the company’s approach to the CCPA and navigate to a dashboard for customizing the types of info that the platform is allowed to use for ad targeting. Google has also created a protocol that blocks websites from transmitting data to the company, which users can take advantage of by downloading an opt-out add-on. Facebook, meanwhile, is arguing that it does not need to change anything because it does not technically “sell” personal information. Companies must at least set up a webpage and a toll-free phone number for fielding data requests.

The info is here.

Saturday, January 4, 2020

Robots in Finance Could Wipe Out Some of Its Highest-Paying Jobs

Lananh Nguyen
Bloomberg.com
Originally posted 6 Dec 19

Robots have replaced thousands of routine jobs on Wall Street. Now, they’re coming for higher-ups.

That’s the contention of Marcos Lopez de Prado, a Cornell University professor and the former head of machine learning at AQR Capital Management LLC, who testified in Washington on Friday about the impact of artificial intelligence on capital markets and jobs. The use of algorithms in electronic markets has automated the jobs of tens of thousands of execution traders worldwide, and it’s also displaced people who model prices and risk or build investment portfolios, he said.

“Financial machine learning creates a number of challenges for the 6.14 million people employed in the finance and insurance industry, many of whom will lose their jobs -- not necessarily because they are replaced by machines, but because they are not trained to work alongside algorithms,” Lopez de Prado told the U.S. House Committee on Financial Services.

During the almost two-hour hearing, lawmakers asked experts about racial and gender bias in AI, competition for highly skilled technology workers, and the challenges of regulating increasingly complex, data-driven financial markets.

The info is here.

Friday, January 3, 2020

Robotics researchers have a duty to prevent autonomous weapons

Christoffer Heckman
theconversation.com
Originally posted 4 Dec 19

Here is an excerpt:

As with all technology, the range of future uses for our research is difficult to imagine. It’s even more challenging to forecast given how quickly this field is changing. Take, for example, the ability for a computer to identify objects in an image: in 2010, the state of the art was successful only about half of the time, and it was stuck there for years. Today, though, the best published algorithms reach 86% accuracy. That advance alone allows autonomous robots to understand what they are seeing through the camera lenses. It also shows the rapid pace of progress over the past decade due to developments in AI.

This kind of improvement is a true milestone from a technical perspective. Whereas in the past manually reviewing troves of video footage would require an incredible number of hours, now such data can be rapidly and accurately parsed by a computer program.

But it also gives rise to an ethical dilemma. In removing humans from the process, the assumptions that underpin the decisions related to privacy and security have been fundamentally altered. For example, the use of cameras in public streets may have raised privacy concerns 15 or 20 years ago, but adding accurate facial recognition technology dramatically alters those privacy implications.

The info is here.

Thursday, January 2, 2020

The Tricky Ethics of Google's Project Nightingale Effort

Cason Schmit
nextgov.com
Originally posted 3 Dec 19

The nation’s second-largest health system, Ascension, has agreed to allow the software behemoth Google access to tens of millions of patient records. The partnership, called Project Nightingale, aims to improve how information is used for patient care. Specifically, Ascension and Google are trying to build tools, including artificial intelligence and machine learning, “to make health records more useful, more accessible and more searchable” for doctors.

Ascension did not announce the partnership: The Wall Street Journal first reported it.

Patients and doctors have raised privacy concerns about the plan. Lack of notice to doctors and consent from patients are the primary concerns.

As a public health lawyer, I study the legal and ethical basis for using data to promote public health. Information can be used to identify health threats, understand how diseases spread and decide how to spend resources. But it’s more complicated than that.

The law deals with what can be done with data; this piece focuses on ethics, which asks what should be done.

Beyond Hippocrates

Big-data projects like this one should always be ethically scrutinized. However, data ethics debates are often narrowly focused on consent issues.

In fact, ethical determinations require balancing different, and sometimes competing, ethical principles. Sometimes it might be ethical to collect and use highly sensitive information without getting an individual’s consent.

The info is here.

Wednesday, January 1, 2020

Companies Are Judged More Harshly For Their Ethical Failures If The CEO Is A Woman

Emily Reynolds
British Psychological Society
Originally published 19 Nov 19

Gender inequality in the business world has been much discussed over the last few years, with a host of mentoring schemes, grants, business books and political activity all aimed at getting women into leadership positions.

But what happens when this goal is achieved? According to new research, unequal gender dynamics still prevail even at the very top. Nicole Votolato Montgomery and Amanda P. Cowen from the University of Virginia found that women CEOs are judged far more harshly than their male counterparts when a business fails ethically. However, when a failure is down to incompetence, they find, women receive less negative backlash.

(cut)

The team suggests that highlighting such traits in female leaders can “reduce the penalties for female-led organisations”. But others argue that women leaders shouldn’t give in to the pressure of adopting typically “male” traits, and that being helpful and community-focused are actually positive things to bring to the board room. Leaning into stereotypes may not be the best way, long-term, to break them — but either way, it’s clear there’s still a way to go for women in business.

The info is here.