Neural Networks with Feedback and Self-organization

  • Mikhail Z. Zgurovsky
  • Yuriy P. Zaychenko
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 652)

Abstract

In this chapter, classes of neural networks other than feed-forward NNs are considered: networks with feedback and networks with self-organization. In Sect. 2.1 the Hopfield recurrent neural network is considered; its structure and properties are described. The method for calculating the Hopfield network weights is presented, and its properties are analyzed. The results of experimental investigations applying the Hopfield network to letter recognition under high noise levels are described and discussed. In Sect. 2.2 the Hamming neural network is presented: its structure and properties are considered, and the algorithm for weight adjustment is described. Experimental investigations of the Hopfield and Hamming networks on the problem of character recognition under different noise levels are presented. In Sect. 2.3 so-called self-organizing networks are considered. First, the Hebb learning law for neural networks is described and the essence of competitive learning is discussed. Self-organizing networks of Kohonen are then described: Kohonen's basic competitive algorithm is considered and its properties are analyzed, modifications of the basic Kohonen algorithm are described and analyzed, and a modified competitive algorithm with a neighborhood function is presented. In Sect. 2.4 applications of Kohonen neural networks are considered: the neural gas algorithm and self-organizing feature maps (SOMs), together with algorithms for their construction and their applications.
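The weight-calculation method for the Hopfield network (Sect. 2.1) is commonly the Hebbian outer-product rule; the following is a minimal illustrative sketch, assuming bipolar (±1) patterns and synchronous sign-function updates, not the chapter's exact formulation:

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian outer-product rule: W = sum over stored patterns of x x^T,
    with the diagonal zeroed so no neuron feeds back onto itself."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Synchronous update x <- sign(W x) until a fixed point is reached."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x
```

With a single stored pattern, flipping one bit of the input still converges back to the stored pattern in one update, which is the associative-recall behavior exploited for noisy letter recognition.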
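The Hamming network (Sect. 2.2) scores an input against stored prototypes by the number of matching components; a hedged sketch, in which the iterative MAXNET competition stage is replaced by a plain argmax for brevity:

```python
import numpy as np

def hamming_classify(prototypes, x):
    """First (feed-forward) layer of a Hamming network for bipolar vectors.
    For +/-1 vectors, p . x = matches - mismatches, so the number of
    matching bits is (p . x + n) / 2; the winner is the closest prototype."""
    n = prototypes.shape[1]
    scores = (prototypes @ x + n) / 2  # matches per prototype
    return int(np.argmax(scores))
```

The winning index identifies the stored character whose Hamming distance to the noisy input is smallest.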
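The competitive (winner-take-all) update of Kohonen's basic algorithm and its modification with a neighborhood function (Sect. 2.3) can be sketched as follows; the learning rate and the Gaussian neighborhood width are illustrative assumptions, not the chapter's exact parameters:

```python
import numpy as np

def kohonen_step(weights, x, lr=0.5):
    """Basic competitive rule: only the winning (closest) weight vector
    moves toward the input."""
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    weights[winner] += lr * (x - weights[winner])
    return winner

def som_step(weights, grid, x, lr=0.5, sigma=1.0):
    """Modified rule with a Gaussian neighborhood function: every unit
    moves toward the input, weighted by its grid distance to the winner."""
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    d2 = np.linalg.norm(grid - grid[winner], axis=1) ** 2
    h = np.exp(-d2 / (2 * sigma ** 2))
    weights += lr * h[:, None] * (x - weights)
    return winner
```

Repeating `som_step` over a data set while shrinking `lr` and `sigma` yields the self-organizing feature maps discussed in Sect. 2.4.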

Keywords

Neural Network · Weight Vector · Input Vector · Input Space · Learning Pattern


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. National Technical University of Ukraine, Kiev, Ukraine
  2. Institute for Applied System Analysis, National Technical University of Ukraine “Kiev Polytechnic Institute”, Kiev, Ukraine
