Peter Montagnon
www.ft.com
Originally posted June 3, 2019
Here is an excerpt:
Ethics are particularly important when technology enters the governance agenda. Machines may be capable of complex calculation but they are so far unable to make qualitative or moral judgments.
Also, the use and manipulation of massive amounts of data creates an information asymmetry. This confers power on those who control the data at the potential expense of those who are its subject.
Ultimately there must always be human accountability for the decisions that machines originate.
In the corporate world, the board is where accountability resides. No one can escape this. To exercise their responsibilities, directors do not need to be as expert as their tech teams. They do, however, need to be familiar with the scope of the technology their companies use, what it can and cannot do, and where the risks and opportunities lie.
For that they may need trustworthy advice from either the chief technology officer or external experts, but the decisions will generally be about what is acceptable and what is not.
The risks may well be of a human rather than a technological kind. In the motor industry, for example, one risk of semi-automated vehicles is that owners will believe they can do more on autopilot than they actually can. Most of us are bad at reading instructions and will need clear warnings, perhaps to the point where the car may even seem disappointing.
The info is here.