M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval

  • Conference paper
  • Artificial Neural Networks and Machine Learning – ICANN 2021 (ICANN 2021)

Abstract

In this paper, we examine a Hopfield network composed of multi-state neurons for storing sequence data as limit cycles of the network. Earlier, we presented uni-modal data storage and retrieval, particularly of text, speech and audio data, in a bipolar Hopfield-based associative memory architecture. We then extended this to multi-modal data and demonstrated that the Hopfield network can indeed work as a content-addressable memory for multi-modal data. This paper is a step towards realising a wider definition of multi-modality. We present an M-ary Hopfield associative memory model for storing limit-cycle data. The proposed system uses a dual-weight learning mechanism to exhibit limit-cycle behavior in which sequence data can be stored and retrieved. We deal in particular with (a) sequences of images and (b) movie-clip data as instances of limit-cycle data. We also propose and use a two-stage firing mechanism to retrieve the stored sequence data from the limit cycles. We present a trade-off between the number of cycles and the length of cycles the network can store, and we demonstrate that the network capacity remains of the order of the network size, i.e., O(N), for limit-cycle data. This represents a first-of-its-kind attempt at sequence storage and retrieval in a Hopfield network as limit cycles, particularly with image-sequence and movie-content data of real-world scale.
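The core idea of storing a sequence as a limit cycle can be illustrated with a minimal sketch. Note the assumptions: the paper's actual model uses M-ary neurons, a dual-weight learning mechanism, and a two-stage firing scheme, none of which are specified here; this sketch instead uses the classical simplification of bipolar neurons with a single asymmetric Hebbian weight matrix that maps each pattern in the cycle to its successor, so that synchronous updates walk the network around the stored cycle.

```python
import numpy as np

# Hedged sketch, NOT the paper's method: a bipolar Hopfield-style network
# storing a pattern sequence x^0 -> x^1 -> ... -> x^{L-1} -> x^0 as a limit
# cycle, using the asymmetric Hebbian rule W = (1/N) * sum_mu x^{mu+1} (x^mu)^T.
rng = np.random.default_rng(0)
N = 200  # network size (number of neurons)
L = 4    # cycle length; reliable recall needs L << N

# Random bipolar patterns forming the cycle to be stored.
patterns = rng.choice([-1.0, 1.0], size=(L, N))

# Asymmetric heteroassociative weights: each pattern points to its successor.
W = sum(np.outer(patterns[(mu + 1) % L], patterns[mu]) for mu in range(L)) / N

def step(x):
    """One synchronous update; ties broken toward +1."""
    return np.where(W @ x >= 0, 1.0, -1.0)

# Start at the first stored pattern and follow the dynamics for one full cycle.
x = patterns[0].copy()
trajectory = [x]
for _ in range(L):
    x = step(x)
    trajectory.append(x)

# After L steps the state should return to the starting pattern,
# i.e. the stored sequence is a limit cycle of the dynamics.
print(np.array_equal(trajectory[L], patterns[0]))
```

For small L relative to N the crosstalk between random patterns is negligible and the network cycles through the stored sequence exactly; as the number and length of cycles grow toward the O(N) capacity regime discussed in the paper, retrieval degrades, which is the trade-off the abstract refers to.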



Author information

Correspondence to V. Ramasubramanian.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Ladwani, V.M., Ramasubramanian, V. (2021). M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. ICANN 2021. Lecture Notes in Computer Science, vol 12894. Springer, Cham. https://doi.org/10.1007/978-3-030-86380-7_34


  • DOI: https://doi.org/10.1007/978-3-030-86380-7_34

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86379-1

  • Online ISBN: 978-3-030-86380-7

  • eBook Packages: Computer Science, Computer Science (R0)
