Abstract
Beyond the digital neural networks of Chap. 16, the more radical mapping of brain-like structures and processes into VLSI substrates was pioneered by Carver Mead more than 30 years ago [1]. The basic idea was to exploit the massive parallelism of such circuits and to create low-power, fault-tolerant information-processing systems. Neuromorphic engineering has recently seen a revival with the availability of deep-submicron CMOS technology, which allows the construction of very-large-scale mixed-signal systems combining local analog processing in neuronal cells with binary signalling via action potentials. Modern implementations can reach the complexity scale of large functional units of the human brain, and they can learn through plasticity mechanisms found in neuroscience. Combined with high-performance programmable logic and elaborate software tools, such systems are currently evolving into user-configurable non-von-Neumann computing systems, which can be used to implement and test novel computational paradigms. The chapter introduces basic properties of biological brains with up to 200 billion neurons and their 10¹⁴ synapses, where an action on a synapse takes ∼10 ms and involves an energy of ∼10 fJ. We outline 10x programs on neuromorphic electronic systems in Europe and the USA, which are intended to integrate 10⁸ neurons and 10¹² synapses, the level of a cat's brain, in a volume of 1 L and with a power dissipation <1 kW. For a balanced view on intelligence, we reference Hawkins' view: first understand the task, then design an intelligent technical response.
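The abstract's figures invite a back-of-envelope check: at ∼10 fJ per synaptic action, how much of the <1 kW budget for a 10¹²-synapse system would the synaptic events themselves consume? The sketch below assumes, hypothetically, a mean event rate of 10 Hz per synapse (one event per ∼10 ms action duration); the constants are taken from the abstract, the rate is an illustrative assumption only.

```python
# Back-of-envelope power estimate for synaptic events in a neuromorphic system.
# Energy per event and synapse count are from the chapter abstract; the mean
# event rate per synapse is an assumed, illustrative figure (~1 / 10 ms).
E_SYNAPSE_J = 10e-15   # ~10 fJ per synaptic action
N_SYNAPSES = 1e12      # target scale: 10^12 synapses (cat-brain level)
RATE_HZ = 10.0         # assumed mean event rate per synapse

power_w = E_SYNAPSE_J * N_SYNAPSES * RATE_HZ
print(f"Synaptic-event power: {power_w:.2f} W")  # prints 0.10 W
```

Under these assumptions the raw synaptic events dissipate only ∼0.1 W, a tiny fraction of the <1 kW engineering target, suggesting that communication, leakage, and support circuitry, not the events themselves, dominate the power budget of such systems.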
References
Mead, C., Ismail, M.: Analog VLSI Implementation of Neural Systems. Springer (1989). ISBN 978-0-7923-9040-4
Chudler, E.H.: Neuroscience for kids, http://faculty.washington.edu/chudler/synapse.html (2009)
Stufflebeam, R.: Neurons, synapses, action potentials, and neurotransmission. The Mind Project, www.mind.ilstu.edu/curriculum/neurons_intro (2008)
Martini, F.H., Nath, J.L.: Fundamentals of Anatomy and Physiology, Chapter 12: Neural Tissue, Prentice-Hall (2008)
Sengupta, B., Stemmler, M., Laughlin, S.B., Niven, J.E.: Action potential energy efficiency varies among neuron types in vertebrates and invertebrates. PLoS Comput. Biol. (2010). DOI: 10.1371/journal.pcbi.1000840
FACETS and BrainScaleS project sites: http://facets.kip.uni-heidelberg.de, http://brainscales.kip.uni-heidelberg.de
Hawkins, J., Blakeslee, S.: On Intelligence. Times Books, Henry Holt, New York (2005)
© 2011 Springer-Verlag Berlin Heidelberg
Hoefflinger, B. (2011). Silicon Brains. In: Hoefflinger, B. (ed.) Chips 2020. The Frontiers Collection. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23096-7_18
Print ISBN: 978-3-642-22399-0
Online ISBN: 978-3-642-23096-7
eBook Packages: Physics and Astronomy