Brent Mittelstadt
Oxford Internet Institute
https://ssrn.com/abstract=3391293
Abstract
AI Ethics is now a global topic of discussion in academic and policy circles. At least 63 public-private initiatives have produced statements describing high-level principles, values, and other tenets to guide the ethical development, deployment, and governance of AI. According to recent meta-analyses, AI Ethics has seemingly converged on a set of principles that closely resemble the four classic principles of medical ethics. Despite the initial credibility granted to a principled approach to AI Ethics by the connection to principles in medical ethics, there are reasons to be concerned about its future impact on AI development and governance. Significant differences exist between medicine and AI development that suggest a principled approach in the latter may not enjoy success comparable to the former. Compared to medicine, AI development lacks (1) common aims and fiduciary duties, (2) professional history and norms, (3) proven methods to translate principles into practice, and (4) robust legal and professional accountability mechanisms. These differences suggest we should not yet celebrate consensus around high-level principles that hide deep political and normative disagreement.
The full paper is available at the SSRN link above.
Shift from professional ethics to business ethics
The outputs of many AI Ethics initiatives resemble professional codes of ethics that address design requirements and the behaviours and values of individual professionals. The legitimacy of particular applications, and of the business interests underlying them, remains largely unquestioned. This approach conveniently steers debate towards the transgressions of unethical individuals and away from the collective failure of unethical businesses and business models. Yet developers will always be constrained by the institutions that employ them. To be truly effective, AI Ethics cannot conceptualise its challenges solely as individual failures. Going forward, AI Ethics must become an ethics of AI businesses as well.