Introduction

Part of the book series: The Springer International Series in Engineering and Computer Science ((SECS,volume 127))

Abstract

Recent advances in the neural sciences and in microelectronic technology have provided an excellent means of boosting the computational capability and efficiency of complicated engineering tasks by several orders of magnitude [1]. Because its structure is optimized for artificial intelligence applications, a neural computer is considered the most promising sixth-generation computing machine. Neural network studies are highly interdisciplinary, spanning many science and engineering fields including neuroscience, cognitive science, psychology, computer science, physics, mathematics, electrical engineering, and biomedical engineering. Many digital neural coprocessors, which usually interface to personal computers and engineering workstations, have been commercialized to accelerate neuro-computation. Typical products include ANZA from Hecht-Nielsen Co., Sigma from SAIC, Odyssey from Texas Instruments Inc., and Mark III/IV from TRW Inc. [2]. However, a general-purpose digital neural coprocessor is usually much slower than special-purpose analog neural hardware, which implements the neural network in an optimal fashion.
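The computation being accelerated in both cases is, at its core, a large matrix-vector multiply followed by a nonlinear activation, as in the Hopfield networks of [9, 20]. The sketch below is illustrative only and is not taken from this chapter: the function name, the 4-neuron size, and the weight values are arbitrary assumptions chosen to show the O(N²) multiply-accumulate loop that a digital coprocessor must execute serially, while analog VLSI evaluates the same sums in parallel through its synapse array.

```c
/* Minimal sketch (not from the chapter): one synchronous update pass of a
 * discrete Hopfield-style network, v_i <- sign( sum_j T_ij * v_j ).
 * The N*N inner loop is the serial bottleneck on a digital coprocessor. */
#include <stdio.h>

#define N 4  /* number of neurons (illustrative) */

static void hopfield_update(const double T[N][N], int v[N])
{
    int next[N];
    for (int i = 0; i < N; i++) {
        double net = 0.0;
        for (int j = 0; j < N; j++)       /* N*N multiply-accumulates */
            net += T[i][j] * v[j];
        next[i] = (net >= 0.0) ? 1 : -1;  /* hard-limiting activation */
    }
    for (int i = 0; i < N; i++)
        v[i] = next[i];
}

int main(void)
{
    /* Symmetric weight matrix storing the pattern (+1, -1, +1, -1) */
    const double T[N][N] = {
        { 0, -1,  1, -1},
        {-1,  0, -1,  1},
        { 1, -1,  0, -1},
        {-1,  1, -1,  0}
    };
    int v[N] = {1, 1, 1, -1};             /* noisy initial state */

    for (int pass = 0; pass < 3; pass++)
        hopfield_update(T, v);

    for (int i = 0; i < N; i++)
        printf("%d ", v[i]);              /* settles to the stored pattern */
    printf("\n");
    return 0;
}
```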

References

  1. R. P. Lippmann, “An introduction to computing with neural nets,” IEEE Acoustics, Speech, and Signal Processing Magazine, pp. 4–22, Apr. 1987.

  2. R. Hecht-Nielsen, “Neurocomputing: picking the human brain,” IEEE Spectrum, vol. 25, no. 3, pp. 36–41, Mar. 1988.

  3. J. A. Anderson and E. Rosenfeld, Neurocomputing: Foundations of Research, Cambridge, MA: The MIT Press, 1988.

  4. S. Grossberg, Neural Networks and Natural Intelligence, Cambridge, MA: The MIT Press, 1988.

  5. W. S. McCulloch and W. Pitts, “A logical calculus of the ideas immanent in nervous activity,” Bulletin of Mathematical Biophysics, vol. 5, pp. 115–133, 1943.

  6. F. Rosenblatt, Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, Washington, D.C.: Spartan Books, 1961.

  7. B. Widrow and M. E. Hoff, “Adaptive switching circuits,” 1960 IRE WESCON Convention Record, Part 4, pp. 96–104, Aug. 23–26, 1960.

  8. J. A. Anderson, “A simple neural network generating an interactive memory,” Mathematical Biosciences, vol. 14, pp. 197–220, 1972.

  9. J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proc. Natl. Acad. Sci. USA, vol. 79, pp. 2554–2558, Apr. 1982.

  10. B. Kosko, “Adaptive bidirectional associative memories,” Applied Optics, vol. 26, pp. 4947–4960, Dec. 1987.

  11. D. Hebb, The Organization of Behavior, New York: Wiley, 1949.

  12. T. Kohonen, Self-Organization and Associative Memory, 2nd Ed., New York: Springer-Verlag, 1987.

  13. R. Hecht-Nielsen, “Counter-propagation networks,” Proc. of IEEE First Inter. Conf. on Neural Networks, vol. II, pp. 19–32, San Diego, CA, 1987.

  14. A. H. Klopf, “A drive-reinforcement model of single neuron function: an alternative to the Hebbian neural model,” Proc. of Conf. on Neural Networks for Computing, pp. 265–270, Snowbird, UT, Apr. 1986.

  15. D. H. Ackley, G. E. Hinton, and T. J. Sejnowski, “A learning algorithm for Boltzmann machines,” Cognitive Science, vol. 9, pp. 147–169, 1985.

  16. Y. P. Tsividis, “Analog MOS integrated circuits — certain new ideas, trends, and obstacles,” IEEE Jour. of Solid-State Circuits, vol. SC-22, no. 3, pp. 317–321, June 1987.

  17. H. P. Graf and P. de Vegvar, “A CMOS implementation of a neural network model,” Proc. of the Stanford Advanced Research in VLSI Conference, pp. 351–362, Cambridge, MA: The MIT Press, 1987.

  18. B. W. Lee and B. J. Sheu, “Design of a neural-based A/D converter using modified Hopfield network,” IEEE Jour. of Solid-State Circuits, vol. SC-24, no. 4, pp. 1129–1135, Aug. 1989.

  19. Dahlquist, Björck, and Anderson, Numerical Methods, pp. 269–273, Englewood Cliffs, NJ: Prentice-Hall, 1974.

  20. J. J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” Proc. Natl. Acad. Sci. USA, vol. 81, pp. 3088–3092, May 1984.

  21. P. Treleaven, M. Pacheco, and M. Vellasco, “VLSI architectures for neural networks,” IEEE Micro Magazine, vol. 9, no. 6, pp. 8–27, Dec. 1989.

  22. C. A. Mead, Analog VLSI and Neural Systems, Reading, MA: Addison-Wesley, 1989.

  23. R. E. Howard, D. B. Schwartz, J. S. Denker, R. W. Epworth, H. P. Graf, W. E. Hubbard, L. D. Jackel, B. L. Straughn, and D. M. Tennant, “An associative memory based on an electronic neural network architecture,” IEEE Trans. on Electron Devices, vol. ED-34, no. 7, pp. 1553–1556, July 1987.

  24. P. Mueller, J. V. D. Spiegel, D. Blackman, T. Chiu, T. Clare, C. Donham, T. P. Hsieh, and M. Loinaz, “Design and fabrication of VLSI components for a general purpose analog neural computer,” in Analog VLSI Implementation of Neural Systems, Editors: C. Mead and M. Ismail, Boston, MA: Kluwer Academic, pp. 135–169, 1989.

  25. M. Holler, S. Tam, H. Castro, and R. Benson, “An electrically trainable artificial neural network (ETANN) with 10240 ‘floating gate’ synapses,” Proc. of IEEE/INNS Inter. Joint Conf. on Neural Networks, vol. II, pp. 191–196, Washington, D.C., June 1989.

  26. T. Morishita, Y. Tamura, and T. Otsuki, “A BiCMOS analog neural network with dynamically updated weights,” Tech. Digest of IEEE Inter. Solid-State Circuits Conf., pp. 142–143, San Francisco, CA, Feb. 1990.

  27. A. F. Murray, “Pulse arithmetic in VLSI neural networks,” IEEE Micro Magazine, vol. 9, no. 6, pp. 64–74, Dec. 1989.

  28. D. E. Van den Bout and T. K. Miller III, “A digital architecture employing stochasticism for the simulation of Hopfield neural nets,” IEEE Trans. on Circuits and Systems, vol. 36, no. 5, pp. 732–746, May 1989.

  29. M. S. Tomlinson Jr., D. J. Walker, and M. A. Sivilotti, “A digital neural network architecture for VLSI,” Proc. of IEEE/INNS Inter. Joint Conf. on Neural Networks, vol. II, pp. 545–550, San Diego, CA, June 1990.

Copyright information

© 1991 Springer Science+Business Media New York

About this chapter

Cite this chapter

Lee, B.W., Sheu, B.J. (1991). Introduction. In: Hardware Annealing in Analog VLSI Neurocomputing. The Springer International Series in Engineering and Computer Science, vol 127. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-3984-1_1

  • DOI: https://doi.org/10.1007/978-1-4615-3984-1_1

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-6780-2

  • Online ISBN: 978-1-4615-3984-1
