"Living a fully ethical life involves doing the most good we can." - Peter Singer
"Common sense is not so common." - Voltaire
"There are two ways to be fooled. One is to believe what isn't true; the other is to refuse to believe what is true." - Søren Kierkegaard

Thursday, December 3, 2015

We have greater moral obligations to robots than to humans

By Eric Schwitzgebel
Aeon - Opinion
Originally posted November 12, 2015

Here is an excerpt:

I think that, if we someday create robots with human-like cognitive and emotional capacities, we owe them more moral consideration than we would normally owe to otherwise similar human beings.

Here’s why: we will have been their creators and designers. We are thus directly responsible both for their existence and for their happy or unhappy state. If a robot needlessly suffers or fails to reach its developmental potential, it will be in substantial part because of our failure – a failure in our creation, design or nurturance of it. Our moral relation to robots will more closely resemble the relation that parents have to their children, or that gods have to the beings they create, than the relationship between human strangers.
