My 1999 book The Age of Spiritual Machines generated several lines of criticism: that Moore’s law will come to an end; that hardware capability may be expanding exponentially but software is stuck in the mud; that the brain is too complicated; that there are capabilities of the brain that inherently cannot be replicated in software; and several others. I wrote The Singularity is Near specifically to respond to those critiques.
Many of the critics of The Singularity is Near fail to respond to the actual arguments I make in the book, but instead choose to mischaracterize my thesis and then attack the mischaracterization. In his essay “Why the Singularity Cannot Happen,” Theodore Modis takes this one step further by borrowing ideas from my book to criticize the straw man thesis that he incorrectly attributes to it.
Modis’ argument is summarized in his first few sentences. He writes “One reason [that the concept of a Singularity as described in Ray Kurzweil’s book cannot happen] is that all natural growth processes that follow exponential patterns eventually reveal themselves to be following S-curves, thus excluding runaway situations. The remaining growth potential from Kurzweil’s ‘knee’, which could be approximated as the moment when an S-curve begins deviating from the corresponding exponential, is a factor of only one order of magnitude greater than the growth already achieved. A second reason is that there is already evidence of a slowdown in some important trends”.
Modis essentially ignores that I make the exact same point about S-curves in almost the same language in my book.
He then goes on to cite the U.S. GDP, Moore’s law, the Microsoft Windows operating system, the number of users of the Internet, car accidents, AIDS cases, population growth, and nuclear power as examples of exponential growth patterns that did not go on forever (as if I claim that every exponential inherently goes on indefinitely).
Let me summarize what it is that I am saying in The Singularity is Near because none of the examples that Modis gives have any relevance to my thesis.
A primary definition of the Singularity is a future time when we will substantially enhance our own intellectual capabilities by merging with the intelligent technology we have created. The Singularity is the result of the Law of Accelerating Returns. The LOAR is the primary thesis of the book, and it states that fundamental measures of price-performance and capacity of information technologies grow exponentially and do so by spanning multiple paradigms. A particular paradigm, such as Moore’s law, will follow an S-curve, but the basic measures of an information technology transcend each specific S-curve by spanning multiple paradigms.
The LOAR certainly does not state that every exponential trend goes on indefinitely. Almost all of the cases that Modis goes on to discuss in great detail have nothing to do with the LOAR or the Singularity. GDP, car accidents, AIDS cases, population and nuclear power are not information technologies and have nothing to do with my thesis.
Modis is implying that my thesis is the following: computers have shown exponential growth; every example of exponential growth must go on forever; and therefore computer capability will continue growing indefinitely. That represents a basic misrepresentation.
Even those cases which have some relevance to information technology are misstated by Modis. The number of users of the Internet is not a basic measure of communications power. The units of a basic measure would be in bits or bits per constant unit of currency, not in numbers of people. Obviously the number of people doing anything is going to saturate.
Nor is the number of transistors on a chip a basic measure. That represents part of one paradigm; the basic measure of the price-performance of computing is calculations per second per constant dollar. This measure has not shown any slowdown. The trend has continued unabated for well over a century and is in fact speeding up, not slowing down. Here is the logarithmic-scale graph updated through 2008. Note that a straight line on a logarithmic scale represents exponential growth, and that the trend has been and continues to be better than exponential.
[Figure: price-performance of computing (calculations per second per constant dollar), logarithmic scale, updated through 2008]
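The point that a straight line on a logarithmic scale indicates exponential growth can be illustrated with a small numerical sketch. The data here are invented purely for illustration (a quantity assumed to double every two years), not the actual chart data:

```python
import math

# Hypothetical illustration (not the actual data behind the graph):
# a quantity that doubles every 2 years.
years = list(range(0, 21, 2))                    # t = 0, 2, ..., 20
values = [2 ** (t / 2.0) for t in years]         # doubles every 2 years

# On a logarithmic scale, exponential growth is a straight line,
# so log10(value) vs. time has a constant slope.
logs = [math.log10(v) for v in values]
slopes = [(logs[i + 1] - logs[i]) / (years[i + 1] - years[i])
          for i in range(len(logs) - 1)]

print(all(abs(s - slopes[0]) < 1e-9 for s in slopes))  # -> True
```

A trend that is "better than exponential" would show up on the same plot as a line that curves upward rather than staying straight.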
The paradigm of Moore’s law is only the vertical region at the right. Engineers were shrinking vacuum tubes in the 1950s to continue the law of accelerating returns (as it pertains to the price-performance of computing). Indeed that paradigm ran out of steam which led to transistors and that led to integrated circuits (and the paradigm of Moore’s law). In the book, which came out in 2005, I describe that the sixth paradigm will be three-dimensional computing and that is indeed now underway. Today, many chips for applications in MEMs, image sensing, and memory utilize three-dimensional stacking technologies, and other approaches are being perfected.
I discuss the relationship of the LOAR and the S-curves of individual paradigms in five different sections of The Singularity is Near. Here are brief excerpts from these sections:
A specific paradigm (a method or approach to solving a problem; for example, shrinking transistors on an integrated circuit as a way to make more powerful computers) generates exponential growth until its potential is exhausted. When this happens, a paradigm shift occurs, which enables exponential growth to continue.
The life cycle of a paradigm. Each paradigm develops in three stages:
1. Slow growth (the early phase of exponential growth)
2. Rapid growth (the late, explosive phase of exponential growth), as seen in the S-curve figure below
3. A leveling off as the particular paradigm matures
The progression of these three stages looks like the letter “S,” stretched to the right. The S-curve illustration shows how an ongoing exponential trend can be composed of a cascade of S-curves…
[Figure: the life cycle of a paradigm as an S-curve]
[Figure: an ongoing exponential trend composed of a cascade of S-curves]
S-curves are typical of biological growth: replication of a system of relatively fixed complexity (such as an organism of a particular species), operating in a competitive niche and struggling for finite local resources. This often occurs, for example, when a species happens upon a new hospitable environment. Its numbers will grow exponentially for a while before leveling off. The overall exponential growth of an evolutionary process (whether molecular, biological, cultural, or technological) supersedes the limits to growth seen in any particular paradigm (a specific S-curve) as a result of the increasing power and efficiency developed in each successive paradigm. The exponential growth of an evolutionary process, therefore, spans multiple S-curves. The most important contemporary example of this phenomenon is the five paradigms of computation…
It is important to distinguish between the S-curve that is characteristic of any specific technological paradigm and the continuing exponential growth that is characteristic of the ongoing evolutionary process within a broad area of technology, such as computation. Specific paradigms, such as Moore’s Law, do ultimately reach levels at which exponential growth is no longer feasible. But the growth of computation supersedes any of its underlying paradigms and is for present purposes an ongoing exponential.
In accordance with the law of accelerating returns, paradigm shift (also called innovation) turns the S-curve of any specific paradigm into a continuing exponential. A new paradigm, such as three-dimensional circuits, takes over when the old paradigm approaches its natural limit, which has already happened at least four times in the history of computation.
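The cascade of S-curves described in these excerpts can be sketched numerically. This is a minimal illustration with invented ceilings, timings, and rates, not actual technology data: each successive "paradigm" is a logistic S-curve that takes over as the previous one saturates, with a tenfold higher ceiling, and the sum keeps growing even though every individual curve levels off.

```python
import math

def s_curve(t, ceiling, midpoint, rate=1.0):
    """Logistic S-curve: slow start, rapid growth, leveling off at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def capability(t):
    # Hypothetical cascade of five paradigms: each new S-curve begins as
    # the previous one saturates, with a 10x higher ceiling (illustrative
    # numbers only).
    return sum(s_curve(t, ceiling=10 ** k, midpoint=5 + 10 * k)
               for k in range(5))

# Every individual paradigm saturates, yet the composite keeps climbing:
print(capability(45) / capability(5) > 1000)  # -> True
```

The envelope of such a cascade only saturates when paradigm shifts stop arriving, which is the empirical question at issue, not something the fractal structure of the curves settles by itself.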
Modis acknowledges that I mention S-curves but he states this in a confusing and misleading way. Modis writes, “Kurzweil acknowledges that there are smaller S-curves that saturate early, but argues that they are replaced by other small S-curves thus cascading indefinitely. He does not seem to be aware of the fact that there is a fractal aspect to such cascades and constituent S-curves are bound by envelope S-curves, which themselves saturate with time”.
Modis’ discussion of the “fractal aspect” of paradigms is not relevant to the discussion and does not contradict my conclusions. The reality is that there has already been a cascade of paradigms in computation and in other examples of basic measures of information technology. As a result these measures have followed smooth exponential trajectories for lengthy periods of time. Modis goes on to completely confuse individual paradigms with the ongoing exponential that spans paradigms. He uses Moore’s law as synonymous with the LOAR and in any event misstates the measures of Moore’s law. His other examples of exponential growth slowing down such as car accidents and AIDS cases are completely irrelevant to the discussion.
I discuss in the book the ultimate limits to the ongoing exponential growth of the price-performance of computation, based on the physics of computation (that is, the amount of matter and energy required to compute, remember, or transmit a bit of information). Based on this analysis, we have a trillions-fold improvement yet to go. This is not unprecedented: we have already achieved a trillions-fold improvement since the advent of computing, including a several-billion-fold improvement just since I was an undergraduate.
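The scale of a trillions-fold improvement can be put in terms of doublings. This is illustrative arithmetic only, and the one-year doubling period used below is an assumption for the sake of the example, not a figure from the book:

```python
import math

# How many doublings make a trillion-fold improvement?
doublings_for_trillion = math.log2(1e12)
print(round(doublings_for_trillion, 1))  # -> 39.9

# At a hypothetical doubling time of one year, a further trillion-fold
# gain would span roughly four decades.
assumed_doubling_years = 1.0
print(round(doublings_for_trillion * assumed_doubling_years))  # -> 40
```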
There is one other misstatement in Modis’ essay that I will comment on. He writes “Intelligence according to the singularitarians is measured by the speed of calculation.” This ignores about a hundred pages of the book in which I discuss improvements in software independently of improvements in hardware. I clearly present hardware speed and memory capacity as necessary but not sufficient conditions for achieving human-level intelligence (and beyond) in a machine.
In The Singularity is Near, I address this issue at length, citing different methods of measuring complexity and capability in software that demonstrate a similar exponential growth. One recent study (“Report to the President and Congress, Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology” by the President’s Council of Advisors on Science and Technology) states the following: “Even more remarkable—and even less widely understood—is that in many areas, performance gains due to improvements in algorithms have vastly exceeded even the dramatic performance gains due to increased processor speed. The algorithms that we use today for speech recognition, for natural language translation, for chess playing, for logistics planning, have evolved remarkably in the past decade… Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin. Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later—in 2003—this same model could be solved in roughly 1 min, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008”. I cite many other examples like this in the book.
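The arithmetic in the quoted Grötschel example can be checked directly: 82 years reduced to about one minute is indeed a speedup of roughly 43 million, which factors into the quoted ~1,000x from hardware and ~43,000x from algorithms:

```python
# Sanity-checking the figures quoted from the PCAST report.
minutes_in_82_years = 82 * 365.25 * 24 * 60
print(round(minutes_in_82_years / 1e6, 1))  # -> 43.1 (million minutes)

# Hardware factor times algorithm factor recovers the overall speedup.
print(1_000 * 43_000)                       # -> 43000000
```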
My primary thesis, which I discuss in one of the major chapters of the book, is that we are making exponential gains in reverse-engineering the methods of the human brain and using these as biologically inspired paradigms to create intelligent machines. Modis makes no mention of these arguments. Instead, he misrepresents my position as holding that computational speed alone is sufficient to achieve human-level intelligence. I update the discussion of our progress in understanding human intelligence in a book to be published by Viking in October 2012, How to Create a Mind: The Secret of Human Thought Revealed.