
Cybernetic Tensions: Anatomy of a Collapse

Chapter in The Nature of the Machine and the Collapse of Cybernetics

Abstract

This chapter shows how cybernetic inquiries into the nature of a machine proved fatal to the very fabric of the project, owing to unsolvable theoretical tensions. It ties both Ashby’s and von Neumann’s contributions to the erosion of the main pillars of cybernetics, as presented in Chapter 5. The chapter then recollects and wraps up the philosophical insights that emerged from studying the core cybernetic question: an inquiry into the nature of a machine. In particular, it fleshes out the ontological consequences of asserting the possible immateriality of a machine, as well as the epistemological consequences of establishing an isomorphism between machines and highly complex entities. These consequences amounted to the end of cybernetics.


Notes

  1. The Ratio Club (Chapter 3, Section “The Ratio Club”).

  2. Ashby 1940.

  3. Wiener 1948a.

  4. Ashby 1952a.

  5. Maxwell 1868.

  6. Chapter 3, Section “Norbert Wiener’s Cybernetics”.

  7. Ashby 1956, p. 2.

  8. Ibid.

  9. Ashby 1962, p. 260.

  10. Ashby 1956, p. 1. Also, see the introductory remarks to Chapter 6.

  11. Ashby 1962, p. 261. This might be, in the view of this writer, the only rigorous definition of a machine ever advanced. The history of science does not seem to show any precedent. Alan Turing and Heinrich Hertz might have come the closest, but they still fell short of providing a full-blown definition (see Chapter 4, Section “A Machinal Understanding of an Algorithm and the Material Liberation of the Machine”; and Chapter 6, Section “William Ross Ashby’s Nature-Machine Equalization”, respectively).

  12. The quote continues: “Also to be excluded as irrelevant is any reference to energy, for any calculating machine shows that what matters is the regularity of the behavior—whether energy is gained or lost, or even created, is simply irrelevant” (Ashby 1962, p. 260).

  13. For Vico’s philosophy, see Chapter 9, Section “Viconian Constructability as a Criterion of Truth”.

  14. Chapter 2, Section “The Macy Conferences”, regarding the first Macy conference.

  15. Wiener 1948a, p. 28.

  16. Edwards 1996, p. 197.

  17. Chapter 4, Section “A Machinal Understanding of an Algorithm and the Material Liberation of the Machine”.

  18. Chapter 6, Section “The Backlash: Unforeseen Consequences of a Behavior-based Ontology”.

  19. Ibid.

  20. Chapter 5, Section “Machines Can Be Teleological”.

  21. In the words of Ashby, “a machine can be at the same time (a) strictly determinate in its actions, and (b) yet demonstrate a self-induced change of organization” (Ashby 1947b).

  22. Chapter 5, Section “Machines Can Be Teleological”.

  23. Ibid.

  24. Chapter 5, Section “Machines Can Be Teleological”.

  25. Chapter 6, Section “Un-cybernetic DAMS: From Behavior into Structure”.

  26. Recalling the relevant quote by Ashby:

    The moment we see that “adaptiveness” implies a circuit and that a circuit implies an equilibrium, we can see at once that this equilibrium must be of the stable type, for any unstable variable destroys itself. And it is precisely the main feature of adaptive behavior that it enables the animal to continue to exist…I must say unambiguously what I mean by “alive”. I assume that if the organism is to stay alive, a comparatively small number of essential variables must be kept between physiologic limits. Each of these variables can be represented by a pointer in a dial…This, of course, is in no way peculiar to living organisms (Ashby 1953, p. 73).

  27. The “post hoc ergo propter hoc” fallacy was seemingly not flagged, possibly because the context demanded an exclusively mechanical explanation, which severely reduced the possibility of entertaining alternative explanations.

  28. Chapter 6, Section “Un-cybernetic DAMS: From Behavior into Structure”.

  29. Chapter 3, Section “The Decline”.

  30. Jeffress 1951.

  31. To recall, McCulloch said that

    Neurons are cheap and plentiful. If it cost a million dollars to beget a man, one neuron would not cost a mill. They operate with comparatively little energy. The heat generated raises the blood in passage about half a degree, and the flow is half a liter per minute, only a quarter of a kilogram calorie per minute for 10^10, that is, 10 billion neurons. Von Neumann would be happy to have their like for the same cost in his robots. His vacuum tubes can work a thousand times as fast as neurons, so he could match a human brain with 10 million tubes; but it would take Niagara Falls to supply the current and the Niagara River to carry away the heat (McCulloch 1951, p. 54).

  32. Recalling the relevant quote by von Neumann:

    I think that it is quite likely that one may give a purely descriptive account of the outwardly visible functions of the central nervous system in a humanly possible time. This may be 10 or 20 years—which is long, but not prohibitively long. Then, on the basis of the results of McCulloch and Pitts, one could draw within plausible time limitations a fictitious “nervous network” that can carry out all these functions. I suspect, however, that it will turn out to be much larger than the one that we actually possess. It is possible that it will prove to be too large to fit into the physical universe. What then? Haven’t we lost the true problem in the process? (Von Neumann 1951, p. 34).

  33. Von Neumann 2005.

  34. Von Neumann 2005, p. 280.

  35. Von Neumann 2005, p. 280.

  36. To recall, von Neumann candidly confessed that

    After these devastatingly general and positive results [from the McCulloch and Pitts networks] one is therefore thrown back on microwork and cytology—where one might have remained in the first place…Yet, when we are in that field, the complexity of the subject is overawing (Von Neumann 2005, p. 278).

Bibliography

  • Ashby, W. R. (1940). Adaptiveness and equilibrium. Journal of Mental Science, 86, 478–483.

  • Ashby, W. R. (1947b). Principles of the self-organizing dynamic system. Journal of General Psychology, 37(2), 125–128.

  • Ashby, W. R. (1952a). Design for a Brain. New York, NY: John Wiley & Sons.

  • Ashby, W. R. (1953). Homeostasis. In H. Von Foerster, M. Mead, & H. L. Teuber (Eds.), Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems: Transactions of the Ninth Conference, New York, March 20–21, 1952 (pp. 73–108). New York, NY: Josiah Macy, Jr. Foundation.

  • Ashby, W. R. (1956). An Introduction to Cybernetics. London, UK: Chapman & Hall; New York, NY: John Wiley & Sons.

  • Ashby, W. R. (1962). Principles of the self-organizing system. In H. Von Foerster & G. W. Zopf Jr. (Eds.), Principles of Self-Organization: Transactions of the University of Illinois Symposium (pp. 255–278). London, UK: Pergamon Press.

  • Edwards, P. N. (1996). The machine in the middle: Cybernetic psychology and World War II. In P. N. Edwards (Ed.), The Closed World: Computers and the Politics of Discourse in Cold War America (pp. 174–207). Cambridge, MA: MIT Press.

  • Jeffress, L. A. (Ed.). (1951). Cerebral Mechanisms in Behavior: The Hixon Symposium. New York, NY: John Wiley & Sons.

  • Maxwell, J. C. (1868). On governors. Proceedings of the Royal Society of London, 16, 270–283.

  • McCulloch, W. S. (1951). Why the mind is in the head [and discussion]. In L. A. Jeffress (Ed.), Cerebral Mechanisms in Behavior: The Hixon Symposium (pp. 42–74). New York, NY: John Wiley & Sons.

  • Von Neumann, J. (1951). The general and logical theory of automata. In L. A. Jeffress (Ed.), Cerebral Mechanisms in Behavior: The Hixon Symposium (pp. 1–41). New York, NY: John Wiley & Sons.

  • Von Neumann, J. (2005). Letter to Wiener. In M. Rédei (Ed.), John von Neumann: Selected Letters (pp. 277–282). Providence, RI: American Mathematical Society.

  • Wiener, N. (1948a). Cybernetics: Or Control and Communication in the Animal and the Machine. Paris, France: Hermann et Cie; Cambridge, MA: MIT Press. (2nd rev. ed. 1961).



Copyright information

© 2017 The Author(s)

About this chapter

Cite this chapter

Malapi-Nelson, A. (2017). Cybernetic Tensions: Anatomy of a Collapse. In: The Nature of the Machine and the Collapse of Cybernetics. Palgrave Studies in the Future of Humanity and its Successors. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-54517-2_8
