Introduction to the Technological Singularity

The Technological Singularity

Part of the book series: The Frontiers Collection ((FRONTCOLL))

Summary

This chapter introduces the term technological singularity and analyses the varying and ambiguous ways in which it is used. It examines the difficulty of predicting what would happen with “human-comparable” artificial intelligences (AIs), and what can nevertheless be said about such an occurrence. The track record of expert predictions in AI is poor, for solid theoretical reasons backed by empirical evidence. There are nevertheless strong arguments that such an AI could become extremely powerful, through one or another of several plausible routes, without necessarily requiring the AI to be “superintelligent”. The chapter then argues that such an AI has a non-negligible chance of being dangerous for humanity as a whole. The difficulty of reasoning about this subject and the uncertainty surrounding it are no excuse for inaction; indeed, the position that AI would be safe is one of great overconfidence, far beyond what the evidence can warrant.

Author information

Correspondence to Stuart Armstrong.

Copyright information

© 2017 Springer-Verlag GmbH Germany

Cite this chapter

Armstrong, S. (2017). Introduction to the Technological Singularity. In: Callaghan, V., Miller, J., Yampolskiy, R., Armstrong, S. (eds) The Technological Singularity. The Frontiers Collection. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-54033-6_1
