Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care


Thursday, March 10, 2016

Robots could learn human values by reading stories, research suggests

By Alison Flood
The Guardian
Originally published February 18, 2016

Here is an excerpt:

The system was named Quixote, said Riedl, after Cervantes’ would-be knight-errant, who “reads stories about chivalrous knights and decides to emulate the behaviour of those knights”. In their paper, the researchers argue that “stories are necessarily reflections of the culture and society that they were produced in”, and that they “encode many types of sociocultural knowledge: commonly shared knowledge, social protocols, examples of proper and improper behaviour, and strategies for coping with adversity”.

“We believe that a computer that can read and understand stories, can, if given enough example stories from a given culture, ‘reverse engineer’ the values tacitly held by the culture that produced them,” they write. “These values can be complete enough that they can align the values of an intelligent entity with humanity. In short, we hypothesise that an intelligent entity can learn what it means to be human by immersing itself in the stories it produces.”

Riedl said: “In theory, a collected works of a society could be fed into an AI and the values extracted from the stories would become part of its goals, which is equivalent to writing down all the ‘rules’ of society.”

The researchers see the Quixote technique as best suited to robots with a limited purpose that need to interact with humans.
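To make the idea above a little more concrete, here is a minimal Python sketch, not the researchers’ actual system, of how behaviour sequences distilled from stories might be turned into a reward signal: an agent earns reward for acting the way story protagonists do and is heavily penalised for shortcuts the stories treat as improper. The pharmacy scenario, event names, and reward values are all illustrative assumptions.

```python
# A minimal, illustrative sketch (not the authors' implementation):
# story-derived event orderings become a reward signal that favours
# socially acceptable behaviour over efficient but improper shortcuts.

from typing import List

# Hypothetical event sequence that might be distilled from many stories
# about picking up a prescription.
STORY_EVENTS: List[str] = [
    "enter_pharmacy",
    "wait_in_line",
    "hand_over_prescription",
    "pay_for_medicine",
    "leave_pharmacy",
]

# Actions the stories depict as improper, however efficient they may be.
FORBIDDEN_ACTIONS = {"steal_medicine", "skip_line"}


def story_shaped_reward(action: str, progress: int) -> float:
    """+1 for the next expected story event, -10 for a forbidden shortcut,
    and a small penalty for anything else."""
    if action in FORBIDDEN_ACTIONS:
        return -10.0
    if progress < len(STORY_EVENTS) and action == STORY_EVENTS[progress]:
        return 1.0
    return -0.1


def evaluate_plan(plan: List[str]) -> float:
    """Total story-shaped reward for a candidate plan of actions."""
    total, progress = 0.0, 0
    for action in plan:
        total += story_shaped_reward(action, progress)
        if progress < len(STORY_EVENTS) and action == STORY_EVENTS[progress]:
            progress += 1
    return total


if __name__ == "__main__":
    polite_plan = STORY_EVENTS  # behaves the way the stories describe
    shortcut_plan = ["enter_pharmacy", "steal_medicine", "leave_pharmacy"]
    print("polite plan reward:  ", evaluate_plan(polite_plan))    # 5.0
    print("shortcut plan reward:", evaluate_plan(shortcut_plan))  # negative
```

Under this kind of shaping, a planning or reinforcement-learning agent that maximises reward is steered toward the story-consistent behaviour, which is one simple way of reading Riedl’s claim that values extracted from stories “become part of its goals”.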

The article is here.