Increasing the capacity of a Hopfield network without sacrificing functionality

  • Amos Storkey
Part III: Learning: Theory and Algorithms
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1327)


Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons. This capacity can be increased to n by using the pseudo-inverse rule. However, capacity is not the only consideration. It is important for rules to be local (the weight of a synapse depends only on information available to the two neurons it connects), incremental (learning a new pattern can be done knowing only the old weight matrix and not the actual patterns stored) and immediate (the learning process is not a limit process). The Hebbian rule is all of these, but the pseudo-inverse is never incremental, and local only if not immediate. The question addressed by this paper is, ‘Can the capacity of the Hebbian rule be increased without losing locality, incrementality or immediacy?’
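The Hebb rule and the three properties above can be sketched as follows; the network size, pattern count and random seed are illustrative choices, not values from the paper.

```python
import numpy as np

def hebb_train(patterns):
    """Hebb (outer-product) rule: w_ij = (1/n) * sum_mu x_i^mu x_j^mu.
    Local: the update to w_ij uses only the activities of neurons i and j.
    Incremental: each pattern adds its own term to the existing weights.
    Immediate: a single pass, no limit process."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x) / n      # one local, immediate update per pattern
    np.fill_diagonal(W, 0.0)         # no self-connections
    return W

def recall(W, x, steps=20):
    """Synchronous Hopfield dynamics: x <- sign(W x)."""
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x

rng = np.random.default_rng(0)
n, m = 100, 3                        # load well below n/(2 ln n) ~ 10.9
patterns = rng.choice([-1, 1], size=(m, n))
W = hebb_train(patterns)
# at this low loading each stored pattern should be a fixed point
print(all(np.array_equal(recall(W, p), p) for p in patterns))
```

At loadings approaching n/(2 ln n) the crosstalk between patterns grows and stored patterns stop being fixed points, which is the capacity limit the abstract refers to.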

Here a new algorithm is proposed. This algorithm is local, immediate and incremental. In addition it has an absolute capacity significantly higher than that of the Hebbian method: n/√(2 ln n).
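The abstract does not restate the rule itself; the following sketch implements the update now commonly known as the Storkey rule, which is an assumption about the paper's contents rather than a quotation from it.

```python
import numpy as np

def storkey_train(patterns):
    """Storkey-style rule (assumption: the local update commonly
    attributed to this paper). For each new pattern x, using only the
    current weight matrix and the pattern itself:
        h_ij  = sum_{k != i, j} w_ik x_k        (local field at i, excluding j)
        w_ij <- w_ij + (x_i x_j - x_i h_ji - h_ij x_j) / n
    Like the Hebb rule it is local, incremental and immediate."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        h = W @ x                          # h_i = sum_k w_ik x_k (w_ii = 0)
        H = h[:, None] - W * x[None, :]    # H[i, j] = h_ij
        W += (np.outer(x, x) - x[:, None] * H.T - H * x[None, :]) / n
        np.fill_diagonal(W, 0.0)           # keep self-connections at zero
    return W

rng = np.random.default_rng(1)
n, m = 100, 3                              # illustrative, low-load example
patterns = rng.choice([-1, 1], size=(m, n))
W = storkey_train(patterns)
# each stored pattern should be a fixed point of x <- sign(W x)
print(all(np.array_equal(np.where(W @ p >= 0, 1, -1), p) for p in patterns))
```

The extra field terms subtract an estimate of the crosstalk that the plain Hebbian outer product would accumulate, while touching only quantities available at the two neurons a synapse connects.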

In this paper the new learning rule is introduced, and a heuristic calculation of the absolute capacity of the learning algorithm is given. Simulations show that this calculation does indeed provide a good measure of the capacity for finite network sizes. Comparisons are made between the Hebb rule and this new learning rule.
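A minimal simulation in the spirit of that comparison can be sketched as follows; the parameters are illustrative choices, not those of the paper, and the second rule is again the update commonly attributed to this work.

```python
import numpy as np

def hebb(P):
    """Hebbian outer-product rule."""
    n = P.shape[1]
    W = sum(np.outer(x, x) for x in P) / n
    np.fill_diagonal(W, 0.0)
    return W

def storkey(P):
    """Storkey-style local rule (assumed form, see text above)."""
    n = P.shape[1]
    W = np.zeros((n, n))
    for x in P:
        h = W @ x
        H = h[:, None] - W * x[None, :]          # H[i, j] = h_ij
        W += (np.outer(x, x) - x[:, None] * H.T - H * x[None, :]) / n
        np.fill_diagonal(W, 0.0)
    return W

def stable_count(W, P):
    """Number of stored patterns that are fixed points of sign(W x)."""
    return sum(np.array_equal(np.where(W @ x >= 0, 1, -1), x) for x in P)

rng = np.random.default_rng(0)
n, m = 200, 40            # load above Hebb's n/(2 ln n) ~ 19 for n = 200
P = rng.choice([-1, 1], size=(m, n))
print(stable_count(hebb(P), P), stable_count(storkey(P), P))
```

At this loading the Hebbian network loses most patterns to crosstalk, while the local rule with the higher n/√(2 ln n) capacity retains far more of them as fixed points.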





Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Amos Storkey
    Neural Systems Group, Imperial College, London
