Implementation of Hopfield Associative Memory with Evolutionary Algorithm and MC-Adaptation Rule for Pattern Storage

Conference paper
Part of the Advances in Intelligent and Soft Computing book series (AISC, volume 131)

Abstract

This paper describes a strategy for implementing the Hopfield neural network as an associative memory using a genetic algorithm together with the Monte Carlo (MC) adaptation rule for pattern storage. In Hopfield-type associative-memory networks, an appropriate arrangement of the synaptic weights provides the network with its associative-memory capability. The fixed-point stable states of this model represent the stored input patterns. The aim is to obtain an optimal weight matrix for efficient recall of any prototype input pattern. The performance of the Hopfield neural network, in particular its storage capacity and recall quality, can be greatly improved by combining a genetic algorithm with the MC-adaptation rule. The experiments consider a neural network trained on multiple patterns using the Hebbian learning rule. In most cases, pattern recall using the genetic algorithm with the MC-adaptation rule gives better results than the conventional Hebbian rule, the MC-adaptation rule alone, or a simple genetic algorithm.
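The Hebbian baseline that the abstract compares against can be illustrated with a minimal sketch: store bipolar patterns in a symmetric weight matrix via the Hebbian outer-product rule, then recall by asynchronous sign updates until a fixed point is reached. The network size, pattern count, and noise level below are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian outer-product rule for bipolar patterns of shape (P, N)."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, probe, max_sweeps=100):
    """Asynchronous updates until a fixed-point stable state is reached."""
    state = probe.copy()
    N = len(state)
    for _ in range(max_sweeps):
        changed = False
        for i in range(N):  # sequential asynchronous update
            s = np.sign(W[i] @ state)
            if s != 0 and s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break  # fixed point: the recalled pattern
    return state

# Illustrative run: store 2 random patterns in a 64-neuron network,
# then recall the first one from a probe with 6 flipped bits.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
W = hebbian_weights(patterns)
probe = patterns[0].copy()
probe[rng.choice(64, size=6, replace=False)] *= -1
out = recall(W, probe)
overlap = (out == patterns[0]).mean()
```

At this low load (2 patterns, 64 neurons) the stored patterns are fixed points and the noisy probe is pulled back into the correct basin of attraction; the paper's GA and MC-adaptation techniques aim to improve exactly this capacity and recall quality beyond what the plain Hebbian matrix achieves.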

Keywords

Hopfield neural network · Genetic algorithm · Monte Carlo adaptation rule · Pattern association



Copyright information

© Springer India Pvt. Ltd. 2012

Authors and Affiliations

  • Somesh Kumar (1)
  • Rajkumar Goel (1)
  • Manu Pratap Singh (2)
  1. Noida Institute of Engineering and Technology, Greater Noida, India
  2. Dr. B. R. Ambedkar University, Agra, India
