Singularity Blog Insights

Chapter
Part of The Frontiers Collection book series (FRONTCOLL)

Summary

This chapter presents four articles from the blogosphere. In the first, Eliezer Yudkowsky describes the three most common meanings of the Singularity: Accelerating Change, in which the Singularity arises from exponential improvements in technology; the Event Horizon, in which technology improves to the point where a human-level intelligence can no longer predict what will happen; and the Intelligence Explosion, in which a self-improving artificial intelligence rapidly brings about a singularity. In the next essay, Stuart Armstrong, one of the editors of this book, analyzes many past predictions of AI development. In the third article, Scott Siskind discusses reasons why we should not wait to research AI safety. In the final entry, Scott Aaronson explains why he does not think a singularity is near.

Keywords

Event Horizon · Turing Machine · Eternal Life · Core Claim · Reward Center
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Department of Economics, Smith College, Northampton, USA