Natural Computing, Volume 5, Issue 1, pp. 15–42

Finite Memory Loading in Hairy Neurons

  • Cheng-Yuan Liou
  • Shiao-Lin Lin


This paper presents a method for expanding the basins of attraction of stable patterns in associative memory. It examines the fully connected associative memory geometrically and translates the learning process into an algebraic optimization procedure. It finds that locating all the patterns at certain stable corners of the neurons' hypercube, as far from the decision hyperplanes as possible, produces excellent error tolerance. It then devises a method based on this finding to develop the hyperplanes. The paper further shows that this method leads to the hairy model, the deterministic analogue of the Gibbs free energy model. Simulations show that the method gives better error tolerance than both the Hopfield model and the error-correction rule, in both synchronous and asynchronous modes.
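As context for the comparison above, the Hopfield baseline that the paper measures against can be sketched in a few lines. This is a minimal illustration only, not the paper's hairy model: the Hebbian outer-product weights, the bipolar toy patterns, and the one-bit noise test are all assumptions chosen for the sketch, and the synchronous update shown here corresponds to one of the two recall modes the paper evaluates.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hopfield outer-product (Hebbian) rule with the diagonal zeroed
    so that no neuron feeds back onto itself."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=20):
    """Synchronous recall: update all neurons at once until a fixed
    point is reached (or the step limit runs out)."""
    for _ in range(steps):
        x_new = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Two orthogonal bipolar patterns in {-1, +1}^8 (toy data, not from the paper).
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebbian_weights(np.stack([p1, p2]))

# Flip one bit of p1; recall should fall back into p1's basin of attraction.
noisy = p1.copy()
noisy[0] = -noisy[0]
print(np.array_equal(recall(W, noisy), p1))  # prints: True
```

The size of the perturbation a pattern can absorb before recall fails is exactly the basin width the paper seeks to maximize by pushing stored patterns away from the decision hyperplanes.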


Keywords: associative memory · error-correction rule · Gibbs free energy · hairy model · Hopfield network · Little model · music perception · neural network · spin glass model

associative memory · expanded associative memory · error-correction rule · Little model · Runge-Kutta method




Copyright information

© Springer 2006

Authors and Affiliations

  1. Department of Computer Science and Information Engineering, National Taiwan University, Taipei, ROC
  2. M.D., Health Center, National Taiwan Normal University, Taipei, ROC
