
Learning Symbols by Neural Network

  • Yoshitsugu Kakemoto
  • Shinichi Nakasuka
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 529)

Abstract

VSF-Network is a neural network model that learns dynamical patterns. It is a hybrid model combining a chaos neural network with a hierarchical neural network: the hierarchical network learns patterns, while the chaos network monitors the behavior of the hierarchical network's neurons. In this paper, two theoretical foundations of VSF-Network are introduced. The first is an incremental learning framework based on chaos neural networks. The monitoring by the chaos neural network relies on clusters generated by synchronous vibration; from the monitoring results, redundant neurons in the hierarchical network are identified and used to learn new patterns. The second foundation concerns pattern recognition by combining learned patterns, explained through the code-word representation used in multi-level discrimination. An experiment demonstrates both the incremental learning capability and the combined pattern recognition.
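The selective-update idea can be illustrated with a short sketch. The Python fragment below is not the authors' implementation; the network sizes, the gradient rule, and the way the redundant-unit mask is obtained are all illustrative assumptions. It shows only the core mechanism: weights attached to neurons flagged as redundant are the only ones updated, so previously learned patterns are preserved.

```python
import numpy as np

# Minimal sketch (not the authors' implementation) of incremental learning
# by selective weight updating. Hidden units flagged as redundant by the
# monitoring step are the only ones whose weights move when a new pattern
# is trained. The mask below is an illustrative assumption; in VSF-Network
# it would come from cluster analysis of synchronous behaviour in the
# chaos neural network.

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # hidden -> output weights

# Hypothetical monitor output: the last six hidden units are redundant.
redundant = np.zeros(n_hid, dtype=bool)
redundant[10:] = True

def forward(x):
    h = np.tanh(W1 @ x)
    return h, np.tanh(W2 @ h)

def train_step(x, t, lr=0.1):
    """One squared-error gradient step restricted to redundant units."""
    global W1, W2
    h, y = forward(x)
    delta_out = (y - t) * (1.0 - y**2)       # output-layer error signal
    dW2 = np.outer(delta_out, h)
    delta_hid = (W2.T @ delta_out) * (1.0 - h**2)
    dW1 = np.outer(delta_hid, x)
    dW1[~redundant, :] = 0.0                 # freeze non-redundant units,
    dW2[:, ~redundant] = 0.0                 # preserving learned patterns
    W1 -= lr * dW1
    W2 -= lr * dW2

train_step(rng.normal(size=n_in), np.ones(n_out))
```

The code-word view of combined pattern recognition can likewise be sketched in the spirit of error-correcting output codes (Dietterich and Bakiri). Each class is assigned a binary code word over a set of learned-pattern detectors, and an input is assigned to the class whose code word is nearest in Hamming distance. The code words below are invented for illustration.

```python
import numpy as np

# Sketch of code-word discrimination in the spirit of error-correcting
# output codes. The code words are invented for illustration; in
# VSF-Network the bits would correspond to responses of previously
# learned pattern detectors.
CODE = {
    "A": np.array([1, 0, 1, 0, 1]),
    "B": np.array([0, 1, 1, 0, 0]),
    "C": np.array([1, 1, 0, 1, 0]),
}

def decode(bits):
    """Assign the class whose code word is nearest in Hamming distance."""
    return min(CODE, key=lambda c: int(np.sum(CODE[c] != bits)))

print(decode(np.array([1, 0, 1, 1, 1])))  # "A": one bit away from A's code
```

Because the combined detector response is decoded against all code words, a pattern composed of already-learned parts can be recognized even if it was never trained as a whole, which mirrors the combination effect described above.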

Keywords

Middle layer · Associative memory · Connection weight · Code word · Incremental learning


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. The JSOL, Ltd., Tokyo, Japan
  2. The University of Tokyo, Tokyo, Japan
