Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Friday, March 3, 2023

The Petrov dilemma: Moral responsibility in the age of ChatGPT

Julian Savulescu
NUS News
Originally published 13 FEB 23

Here is an excerpt:

3. Act ethically

What it is to act ethically has been debated for thousands of years. But at its heart is a simple answer: add value to the lives of other people. For Lt-Col Petrov, doing nothing was the hardest choice: “All I had to do was to reach for the phone; to raise the direct line to our top commanders – but I couldn’t move. I felt like I was sitting on a hot frying pan.”

ChatGPT is an extremely powerful technology with enormous potential for human benefit. It can distill human knowledge and wisdom, quickly create factual and normative content, enable communication between speakers of different languages, and ultimately enhance or even surpass human creativity.

As with any powerful technology, there are significant risks. At present, there are numerous mistakes and false claims, but these presumably will be ironed out. Longer-term risks include threats to privacy; large-scale perpetuation of biases; plagiarism; displacement of human workers and creativity; and deception and malicious use.

Acting ethically requires realising the value of technology while ensuring the risks are reasonable. Institutional or structural changes may be necessary, for example labelling AI-generated output and creating watermarks to prevent plagiarism.

To be human is to make your own ethical decisions

One ethical question that confronts regulators and industry is: should a driverless car swerve and sacrifice an innocent bystander (pedestrian or cyclist) to save a greater number of lives in the vehicle? I asked ChatGPT whether it is right to kill one to save five. Here is the answer:

“Whether or not it is right to kill one person to save five people is a complex ethical question. Some people may argue that taking one life to save five is the more logical option, and is therefore justified. However, it is also important to consider the ethical principle of respect for autonomy.

“This principle states that all individuals have an inherent right to make autonomous and informed decisions about their lives, including their own death. Taking this into consideration, it may not be morally right to force someone to accept the sacrifice of their life in order to save the lives of five other people. Ultimately, the answer to this question is subjective and depends on one’s own ethical and moral beliefs.”

But as driverless cars are rolled out, we won’t have the luxury of remaining on the fence, or of relying on drivers’ reflexes to get us out of a decision. We will have to make moral choices, give reasons for them, and apply them consistently in programming the vehicles. This requires deep – and human – ethical thinking and expertise.
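To make that last point concrete, here is a minimal, purely illustrative sketch in Python of what it could mean to state a moral choice explicitly and apply it consistently in a vehicle’s software. The policy names, the Situation fields, and the decision rules are hypothetical assumptions for illustration only; they are not drawn from the article or from any real autonomous-driving system.

```python
from dataclasses import dataclass


@dataclass
class Situation:
    """A toy description of an unavoidable-harm scenario (hypothetical fields)."""
    occupants_at_risk: int   # people in the vehicle harmed if it does not swerve
    bystanders_at_risk: int  # pedestrians or cyclists harmed if it swerves


def swerve_decision(situation: Situation, policy: str = "minimise_total_harm") -> bool:
    """Return True if the vehicle should swerve, under a fixed, explicit policy.

    The point of the sketch is not the particular rule, but that whichever
    rule regulators and manufacturers settle on must be stated explicitly
    and applied consistently, rather than left to a driver's reflexes.
    """
    if policy == "minimise_total_harm":
        # Utilitarian-style rule: swerve only if doing so reduces the number harmed.
        return situation.bystanders_at_risk < situation.occupants_at_risk
    if policy == "never_sacrifice_bystanders":
        # Constraint-style rule: never redirect harm onto uninvolved bystanders.
        return False
    raise ValueError(f"Unknown policy: {policy}")


if __name__ == "__main__":
    s = Situation(occupants_at_risk=5, bystanders_at_risk=1)
    print(swerve_decision(s, "minimise_total_harm"))         # True
    print(swerve_decision(s, "never_sacrifice_bystanders"))  # False
```

Whichever branch one codes, the article’s point stands: choosing between rules like these is a human ethical decision that has to be made, justified, and applied consistently before the software is deployed.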