Backbone Structure of Hairy Memory

  • Conference paper
Artificial Neural Networks – ICANN 2006 (ICANN 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4131)

Abstract

This paper presents a new memory for the Hopfield model that fixes many of the model's drawbacks, such as limited loading capacity, limit cycles, and poor error tolerance. The memory is derived from the hairy model [15]. The paper also constructs a training process that further balances the vulnerable parts of the memory and improves it.
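
For readers unfamiliar with the baseline, the sketch below shows a standard Hopfield associative memory with Hebbian (outer-product) storage and asynchronous recall; it only illustrates the drawbacks named above (limited loading capacity, spurious attractors and limit cycles, weak error tolerance) and is not the hairy-memory construction of [15]. The network size, pattern count, and noise level are illustrative choices, not values from the paper.

# Minimal sketch of a standard Hopfield associative memory (Hebbian storage,
# asynchronous recall). Baseline illustration only; NOT the paper's method.
import numpy as np

def train_hebbian(patterns):
    """Store bipolar (+1/-1) patterns with the outer-product (Hebbian) rule."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n          # sum of outer products, scaled by n
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

def recall(W, probe, max_sweeps=50, rng=None):
    """Asynchronous recall: update one unit at a time until a fixed point."""
    rng = np.random.default_rng() if rng is None else rng
    s = probe.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                     # reached a fixed point (attractor)
            break
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 10                          # illustrative: 10 patterns, 100 units
    patterns = rng.choice([-1, 1], size=(p, n))
    W = train_hebbian(patterns)
    # Corrupt 10% of the first pattern's bits and try to recover it.
    probe = patterns[0].copy()
    flip = rng.choice(n, size=n // 10, replace=False)
    probe[flip] *= -1
    recovered = recall(W, probe, rng=rng)
    print("overlap with stored pattern:", (recovered @ patterns[0]) / n)

With the outer-product rule, reliable recall typically degrades once the number of stored patterns approaches roughly 0.14 times the number of units; this capacity limit, together with convergence to spurious states, is the kind of drawback the proposed memory targets.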

References

  1. Ackley, D.H., Hinton, G.E., Sejnowski, T.J.: A learning algorithm for Boltzmann machines. Cognitive Science 9, 147–169 (1985)

  2. Amari, S.I., Maginu, K.: Statistical neurodynamics of associative memory. Neural Networks 1(1), 63–73 (1988)

  3. Amit, D.J.: Modeling brain function: The world of attractor neural networks. Cambridge University Press, Cambridge (1989)

  4. Gardner, E., Derrida, B.: Optimal storage properties of neural network models. Journal of Physics A 21, 271–284 (1988)

  5. Gardner, E.: Optimal basins of attraction in randomly sparse neural network models. Journal of Physics A 22(12), 1969–1974 (1989)

  6. Hartwell, L.H., Hopfield, J.J., Leibler, S., Murray, A.W.: From molecular to modular cell biology. Nature 402 (Suppl.), C47–C52 (1999)

  7. Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the United States of America 79, 2554–2558 (1982)

  8. Kanter, I., Sompolinsky, H.: Associative recall of memory without errors. Physical Review A 35(1), 380–392 (1987)

  9. Kauffman, S.A.: Antichaos and adaptation. Scientific American, pp. 64–70 (August 1991)

  10. Li, J., Michel, A.N., Porod, W.: Analysis and synthesis of a class of neural networks: linear systems operating on a closed hypercube. IEEE Transactions on Circuits and Systems 36(11), 1405–1422 (1989)

  11. Liou, C.-Y., Lin, S.-L.: The other variant Boltzmann machine. In: Proceedings of the International Joint Conference on Neural Networks, Washington, DC, pp. 449–454 (1989)

  12. Liou, C.-Y., Wu, J.-M.: Self-organization using Potts models. Neural Networks 9(4), 671–684 (1996)

  13. Liou, C.-Y., Yuan, S.-K.: Error tolerant associative memory. Biological Cybernetics 81, 331–342 (1999)

  14. Liou, C.-Y., Yang, H.-C.: Selective feature-to-feature adhesion for recognition of cursive handprinted characters. IEEE Transactions on Pattern Analysis and Machine Intelligence 21(2), 184–191 (1999)

  15. Liou, C.-Y., Lin, S.-L.: Finite memory loading in hairy neurons. Natural Computing 5(1), 15–42 (2006)

  16. Little, W.A.: The existence of persistent states in the brain. Mathematical Biosciences 19, 101–120 (1974)

  17. Personnaz, L., Guyon, I., Dreyfus, G.: Information storage and retrieval in spin-glass like neural networks. Journal de Physique Lettres 46, 359–365 (1985)

  18. Weisbuch, G., Fogelman-Soulie, F.: Scaling laws for the attractors of Hopfield networks. Journal de Physique Lettres 46, 623–630 (1985)

  19. Widrow, B., Hoff Jr., M.E.: Adaptive switching circuits. IRE WESCON Convention Record, pp. 96–104 (1960)

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liou, CY. (2006). Backbone Structure of Hairy Memory. In: Kollias, S.D., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840817_72

  • DOI: https://doi.org/10.1007/11840817_72

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38625-4

  • Online ISBN: 978-3-540-38627-8

  • eBook Packages: Computer Science; Computer Science (R0)
