Originally posted June 16, 2017
Here is an excerpt:
The big problem AI faces is not the intelligence part, really. It's the autonomy part. At the end of the day, even the smartest computers are tools, our tools, and their intentions are our intentions. Or, to the extent that we can speak of their intentions at all, as when a self-driving car "intends" to avoid an obstacle, we have in mind something it was designed to do.
Even the most primitive organism, in contrast, at least seems to have a kind of autonomy. It really has its own interests. Light. Food. Survival. Life.
The danger of our growing dependence on technologies is not really that we are losing our natural autonomy in quite this sense. Our needs are still our needs. But it is a loss of autonomy nonetheless. Even auto mechanics these days rely on diagnostic computers, and in the era of self-driving cars, will any of us still know how to drive? Think what would happen if we lost electricity, or if the grid were really and truly hacked. We'd be thrown back into the 19th century, as Dennett says. But in many ways things would be worse. We'd be thrown back, but without the knowledge and know-how that made it possible for our ancestors to thrive in the olden days.
I don't think this fear is unrealistic. But we need to put it in context.
The article is here.