
Self-organizing Flexible Neural Network

  • Mohammad Teshnehlab
  • Keigo Watanabe
Chapter
Part of the International Series on Microprocessor-Based and Intelligent Systems Engineering book series (ISCA, volume 19)

Abstract

In 1949, Hebb [1] proposed ANN structures that could build and organize themselves in accordance with output data, in an unsupervised manner, during the learning process. Later, in 1981, Kohonen [2] demonstrated that ANN systems could be built so as to organize input data without being supervised or taught during the learning process. Such a system was able to map an external signal space into the system’s internal representational space without human intervention. Kohonen called this process self-organizing and showed how it could be performed by an NN; the network output is the weighted sum of the Kohonen layer outputs. Meanwhile, Grossberg [3] demonstrated instar/outstar functions similar to the Kohonen system. Later, Carpenter and Grossberg [4] proposed a pattern recognition method based on a self-organizing NN. There are some differences between the Hebbian and Grossberg learning rules: one is that in the Hebbian rule the output data is involved in the learning process, whereas in the Grossberg rule the input data is involved; the other is that in the Hebbian approach any kind of SF can be used, whereas in the Grossberg approach only monotonic (continuous) increasing functions can be used. These methods are based upon unsupervised learning, or self-organizing, techniques. Generally, in supervised learning techniques [7]–[17] the error back-propagated through the NN adjusts the connection weights until the error falls to an acceptably low level. The existing unsupervised learning methods describe how the connection weights evolve in time using locally available information. Locality allows the connection weights to be updated in real time. The dependence on local information is important in two ways. First, the learning method grows from local interconnections between neighboring cells rather than from some intrinsically complicated phenomenon operating remotely.
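The contrast drawn above between the two rules can be sketched in code. This is a minimal illustration under assumed parameters (learning rate `eta`, a toy 2-unit network), not the authors' formulation: the Hebbian update changes a weight in proportion to the product of output and input activity, while the Grossberg (instar) update moves an active unit's incoming weights toward the current input pattern.

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    # Hebbian rule: weight change is proportional to the product of
    # post-synaptic output y and pre-synaptic input x (output-driven).
    return w + eta * np.outer(y, x)

def instar_update(w, x, y, eta=0.1):
    # Grossberg instar rule: active units (y > 0) move their incoming
    # weight vectors toward the current input pattern x (input-driven).
    return w + eta * y[:, None] * (x[None, :] - w)

x = np.array([1.0, 0.0, 1.0])   # input pattern
y = np.array([1.0, 0.0])        # only output unit 0 is active
w = np.zeros((2, 3))            # 2 output units, 3 inputs

w_hebb = hebbian_update(w, x, y)   # unit 0's weights grow without bound
w_inst = w.copy()
for _ in range(100):               # unit 0's weights converge to x
    w_inst = instar_update(w_inst, x, y)
```

Note the qualitative difference: the Hebbian weights keep growing as long as input and output are co-active, while the instar weights settle on the input pattern itself, which is what lets a self-organizing layer store prototypes.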
In biological neurons, there are external events going on in the immediate vicinity, such as the action of glial cells, biochemical changes, and background physical effects such as mechanical vibrations, temperature changes, and electromagnetic flux; Kosko [5] lumped all of these together as noise. Second, in biological nervous systems the dependence on only local information implicitly increases redundancy and fault tolerance, because there is no central unit to oversee or correct the operation of the network. Therefore, no explicitly vital element exists as a network controller.
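The locality point can be illustrated with a single Kohonen-style update step. This is a generic SOM sketch under assumed parameters (a 1-D map, Gaussian neighborhood, learning rate `eta`, width `sigma`), not the specific network developed in this chapter: each unit's weight change depends only on the shared input, its own weights, and its map distance to the winning unit — no central error signal or network controller is involved.

```python
import numpy as np

def som_step(w, x, eta=0.5, sigma=1.0):
    """One self-organizing update on a 1-D map; rows of w are unit weights."""
    # Winner (best-matching unit): the unit closest to the input.
    bmu = int(np.argmin(np.linalg.norm(w - x, axis=1)))
    # Gaussian neighborhood over map distance to the winner: nearby
    # units learn strongly, distant units are barely affected.
    d = np.abs(np.arange(w.shape[0]) - bmu).astype(float)
    h = np.exp(-d ** 2 / (2.0 * sigma ** 2))
    # Purely local rule: every unit moves toward x, scaled by h.
    return w + eta * h[:, None] * (x - w)

rng = np.random.default_rng(0)
w = rng.uniform(size=(5, 2))    # 5 map units with 2-D weight vectors
x = np.array([0.9, 0.1])
for _ in range(50):
    w = som_step(w, x)          # the winner's weights home in on x
</```

Because no step above consults a global error or a supervising unit, removing any single unit degrades the map only locally — the fault-tolerance property the text attributes to purely local learning.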

Keywords

Learning Rule, Connection Weight, Unsupervised Learning, Uniform Random Number, Computed Torque
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


References

  [1] D. O. Hebb, The Organization of Behavior: A Neuropsychological Theory, New York: John Wiley, 1949.
  [2] T. Kohonen, “The self-organizing map,” Proceedings of the IEEE, Vol. 78, No. 9, pp. 1464–1480, 1990.
  [3] S. Grossberg, “Classical and instrumental learning by neural networks,” Progress in Theoretical Biology, Vol. 3, pp. 51–141, New York: Academic Press, 1974.
  [4] G. A. Carpenter and S. Grossberg, “The ART of adaptive pattern recognition by a self-organizing neural network,” IEEE Computer, Vol. 21, No. 3, pp. 77–88, March 1988.
  [5] B. Kosko, “Unsupervised learning in noise,” Proceedings of the 1989 International Joint Conference on Neural Networks, Vol. I, pp. 7–17, June 1989.
  [6] F. Rosenblatt, “The perceptron: A probabilistic model for information storage and organization in the brain,” Psychological Review, Vol. 65, pp. 386–408, 1958.
  [7] K. Fukushima, S. Miyake and T. Ito, “Neocognitron: A neural network model for a mechanism of visual pattern recognition,” IEEE Trans. on Systems, Man, and Cybernetics, Vol. SMC-13, pp. 826–834, 1983.
  [8] D. E. Rumelhart, G. E. Hinton and R. J. Williams, “Learning internal representations by error propagation,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, D. E. Rumelhart, J. L. McClelland and the PDP Research Group, Eds., Cambridge, MA: MIT Press, pp. 318–362, 1986.
  [9] D. E. Rumelhart, G. E. Hinton and J. L. McClelland, “A general framework for parallel distributed processing,” in Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, D. E. Rumelhart, J. L. McClelland and the PDP Research Group, Eds., Cambridge, MA: MIT Press, pp. 45–76, 1986.
  [10] M. Kawato, Y. Uno, M. Isobe and Suzuki, “Hierarchical neural network model for voluntary movement with application to robotics,” IEEE Control Systems Magazine, pp. 8–15, May 1988.
  [11] D. Psaltis, A. Sideris and Yamamura, “A multilayered neural network controller,” IEEE Control Systems Magazine, pp. 17–20, April 1988.
  [12] T. J. Sejnowski and C. R. Rosenberg, “NETtalk: A parallel network that learns to read aloud,” The Johns Hopkins University Electrical Engineering and Computer Science Technical Report JHU/EECS-86/01, 1986.
  [13] J. A. Feldman and D. H. Ballard, “Connectionist models and their properties,” Cognitive Science, Vol. 6, pp. 205–254, 1982.
  [14] J. Yuh, “A neural net controller for robotic vehicles,” IEEE Trans. on Ocean Engng., Vol. 15, pp. 161–166, 1990.
  [15] M. Jamshidi, B. Horne and N. Vadiee, “A neural network-based controller for a two-link robot,” Proc. 29th Conf. Decision and Control, pp. 3256–3257, 1990.
  [16] P. J. Werbos, “Neural networks for control and system identification,” Proc. 28th Conf. Decision and Control, pp. 260–265, 1989.
  [17] R. M. Sanner and D. L. Akin, “Neuromorphic pitch attitude regulation of an underwater telerobot,” IEEE Control Systems Magazine, pp. 62–68, 1990.
  [18] M. Teshnehlab and K. Watanabe, “Intelligent control based on flexible neural networks,” Report of the Faculty of Science and Engineering, Saga University, Vol. 23, No. 2, pp. 1–23, 1995.

Copyright information

© Springer Science+Business Media Dordrecht 1999

Authors and Affiliations

  • Mohammad Teshnehlab (1)
  • Keigo Watanabe (2)
  1. Faculty of Electrical Engineering, K.N. Toosi University, Tehran, Iran
  2. Department of Mechanical Engineering, Saga University, Japan
