The Vector Decomposition Method

  • Anne-Johan Annema
Part of The Springer International Series in Engineering and Computer Science book series (SECS, volume 314)

Abstract

When implementing neural networks in either software or hardware, one must have specifications for the building blocks (either software functions or hardware circuits) in order to arrive at a sufficiently good implementation. In this context, sufficiently good means that the minimum requirements for proper operation are satisfied while, at the same time, the various building blocks are not over-specified, since over-specification is usually expensive (economically or computationally). Specifications for the various building blocks of neural networks can be derived in a number of ways. We list three possible approaches and discuss each of them briefly.
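As an illustration of the basic operation behind the chapter's title (a sketch only; the function name and setup are illustrative assumptions, not the chapter's actual derivation), an input vector can be decomposed into a component parallel to a weight vector and a perpendicular remainder:

```python
import numpy as np

def decompose(x, w):
    """Illustrative sketch: split input vector x into its component
    parallel to weight vector w and the perpendicular remainder,
    so that x = parallel + perpendicular and perpendicular . w = 0."""
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    # Orthogonal projection of x onto the line spanned by w.
    parallel = (x @ w) / (w @ w) * w
    perpendicular = x - parallel
    return parallel, perpendicular

# Example: decompose an input relative to a weight vector.
par, perp = decompose([3.0, 4.0], [1.0, 0.0])
```

Such a decomposition separates the part of an input that directly influences a neuron's weighted sum from the part that does not, which is one way to reason about how accurately each vector component must be represented.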

Keywords

Neural Network, Weight Vector, Input Vector, Input Space, Vector Component


Copyright information

© Springer Science+Business Media New York 1995

Authors and Affiliations

  • Anne-Johan Annema
  1. MESA Research Institute, University of Twente, Netherlands
