How We Might See the End of the Information Age

  • James W. Cortada


Let’s head straight to the core issue—artificial intelligence, computing that performs human-like thinking and action, makes decisions, and ultimately is supposed to be smarter than we mortals. AI, as it is called in computer circles, has been hyped for over 70 years, and for the longest time it was the most underperforming, disappointing information technology, the one that never met expectations. Today it seems every book publisher has a book out on AI and every computing journal is discussing it. Yet it had—and still has—among its defenders some of the smartest human beings, and not all of them crowded into buildings at such universities as the Massachusetts Institute of Technology (MIT), Caltech, and Carnegie Mellon; they were (and are) also nested in other universities and inside computer manufacturing firms around the world. AI’s high priests occupied a universe wedged somewhere between serious scientific disciplines and science fiction, weaving back and forth between the two depending on how windy the AI hype and discourse was in any particular decade. Through all those years AI loomed in the background, poised almost ready to spring on society. Serious nonfiction books and movies kept the topic close at hand for millions of people. But it never pounced. Or has it? Hint: today all manner of firms and government agencies are hiring AI experts; the demand for such skills is borderline insatiable on both sides of the Atlantic, in Russia, and in China and Japan. So we have to understand what it is, and then its implications for people.

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • James W. Cortada
  1. Charles Babbage Institute, University of Minnesota, Minneapolis, USA