Abstract
In “The rise of the robots and the crisis of moral patiency” (2019a), John Danaher argues that the rise of AI and robots will dramatically suppress our moral agency and encourage the expression of moral passivity. This discussion note argues that Danaher needs to strengthen his argument by supporting two key assumptions: (a) that AI will otherwise be friendly or neutral (rather than simply destroying humans), and (b) that humans will largely succumb to the temptation to over-rely on AI for motivation and decision-making in their personal lives.
Notes
Danaher’s earlier Automation and Utopia (2019b), in which he also writes about the crisis of human passivity, may contain theoretical resources that could be used to argue that humans will by and large succumb, despite my objections. If so, it would have been helpful for Danaher to deploy those resources in his article and to spell out clearly his argument that humans will succumb to the temptation.
References
Bostrom N (2014) Superintelligence: paths, dangers, strategies. Oxford University Press, Oxford
Boyles RJM, Joaquin JJ (2019) Why friendly AIs won’t be that friendly: a friendly reply to Muehlhauser and Bostrom. AI Soc. https://doi.org/10.1007/s00146-019-00903-0
Danaher J (2019a) The rise of the robots and the crisis of moral patiency. AI Soc 34(1):129–136
Danaher J (2019b) Automation and utopia: human flourishing in a world without work. Harvard University Press, Cambridge, MA
Muehlhauser L, Bostrom N (2014) Why we need friendly AI. Think 13(36):41–47
Nozick R (1974) Anarchy, state, and utopia. Basic Books, New York
Cite this article
Chan, B. The rise of artificial intelligence and the crisis of moral passivity. AI & Soc 35, 991–993 (2020). https://doi.org/10.1007/s00146-020-00953-9