Using quadratic perceptrons to reduce interconnection density in multilayer neural networks

  • Dirk Röckmann
  • Claudio Moraga
Neural Network Theories, Neural Models
Part of the Lecture Notes in Computer Science book series (LNCS, volume 540)


Multilayer Perceptron Nets are among the best-known architectures for Artificial Neural Networks. The high density of interconnections among neurons, however, makes their VLSI realization extremely difficult. In this paper we introduce quadratic perceptrons and show that they may lead to a substantial reduction in the number of required interconnections, thus improving suitability for integration. A Multilayer Perceptron Net with quadratic perceptrons in the first layer may be trained using backpropagation.
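The excerpt does not give the paper's exact formulation of a quadratic perceptron. A minimal sketch, assuming a unit that thresholds a quadratic form of its inputs, net = xᵀQx + wᵀx + θ (the names `quadratic_perceptron`, `w`, `Q`, and `theta` are illustrative, not from the paper), shows how the cross term lets a single such unit compute XOR, a function no single linear perceptron can represent:

```python
import numpy as np

def quadratic_perceptron(x, w, Q, theta):
    """Hard-threshold unit on a quadratic form of the inputs:
    net = x^T Q x + w^T x + theta, output = step(net)."""
    net = x @ Q @ x + w @ x + theta
    return 1 if net > 0 else 0

# XOR with one quadratic unit: net = x1 + x2 - 2*x1*x2 - 0.5
w = np.array([1.0, 1.0])
Q = np.array([[0.0, -1.0],
              [-1.0, 0.0]])   # x^T Q x contributes -2*x1*x2
theta = -0.5

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, quadratic_perceptron(np.array(x, dtype=float), w, Q, theta))
```

Collapsing such a decision into one unit is one way quadratic perceptrons can trade extra per-neuron parameters for fewer inter-layer wires; for gradient training, a smooth activation (e.g. a sigmoid) would replace the hard threshold.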





Copyright information

© Springer-Verlag Berlin Heidelberg 1991

Authors and Affiliations

  • Dirk Röckmann (1)
  • Claudio Moraga (1)
  1. Dept. of Computer Science, University of Dortmund, Germany
