Luke Goode


The singularity is a hypothesis about the future based on the idea that artificial intelligence and related technologies develop at an exponential rather than linear rate, and that we are fast approaching an inevitable tipping point where machine intelligence will outpace human thought, with radical consequences for civilization. This chapter outlines the basis for such claims and shows that this apparently futuristic idea builds on some rather ancient ways of thinking.



Copyright information

© The Author(s) 2019

Authors and Affiliations

  1. University of Auckland, Auckland, New Zealand
