Information and pattern capacities in neural associative memories with feedback for sparse memory patterns

  • Günther Palm
  • Friedrich T. Sommer
Part of the Perspectives in Neural Computing book series (PERSPECT.NEURAL)

Abstract

How should the performance of associative memories in applications be judged? Using information theory, we examine the static structure of memory states and spurious states of a recurrent associative memory after learning. Within this framework we consider the critical pattern capacity often used in the literature and introduce the information capacity as a more relevant performance measure for pattern completion. For two types of local learning rule, the Hebb rule and the clipped Hebb rule, our method yields new asymptotic estimates of the information capacity.
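To make the two learning rules concrete, the sketch below (our own illustration, not the authors' code) stores sparse binary patterns with both the additive Hebb rule and the clipped (binary) Hebb rule, and performs one feedback retrieval step for pattern completion by thresholding. The network size `n`, pattern activity `k`, pattern count `M`, and the threshold choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, M = 256, 8, 40          # neurons, active units per pattern, stored patterns

# Sparse binary memory patterns: each pattern has exactly k ones.
patterns = np.zeros((M, n), dtype=int)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Additive Hebb rule: synaptic weight counts pattern coincidences.
W_hebb = patterns.T @ patterns
# Clipped Hebb rule: a synapse is 1 as soon as any pattern pair activated it.
W_clip = (W_hebb > 0).astype(int)

def recall(W, cue, theta):
    """One feedback step: threshold the dendritic sums of all neurons."""
    return (W @ cue >= theta).astype(int)

# Pattern completion: present half of a stored pattern's active units as cue.
x = patterns[0]
cue = x.copy()
cue[np.flatnonzero(x)[: k // 2]] = 0       # delete half of the ones
y = recall(W_clip, cue, theta=cue.sum())   # threshold = number of cue ones
```

With the threshold set to the cue activity, every active unit of the stored pattern necessarily crosses threshold, so the retrieval error consists only of spurious extra ones; at low memory load these are rare, which is the regime in which the sparse clipped-Hebb network stores a high amount of information per synapse.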


References

  1. G. Palm, Computing with neural networks, Science 235, 1227–1228 (1987)
  2. J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. USA 79, 2554–2558 (1982)
  3. G. Palm, On the asymptotic storage capacity of neural networks, in: Neural Computers, Eds. R. Eckmiller, Chr. v.d. Malsburg, Springer (1988)
  4. J.-P. Nadal, G. Toulouse, Information storage in sparsely coded memory nets, Network 1, 61–74 (1990)
  5. J.-P. Nadal, Associative memory: on the (puzzling) sparse coding limit, J. Phys. A 24, 1093–1101 (1991)
  6. M. V. Tsodyks, M. V. Feigelman, The enhanced storage capacity in neural networks with low activity level, Europhys. Lett. 6 (2), 101–105 (1988)
  7. E. Gardner, The space of interactions in neural network models, J. Phys. A 21, 257–270 (1988)
  8. H. Gutfreund, Y. Stein, Capacity of neural networks with discrete synaptic couplings, J. Phys. A 23, 2613–2630 (1990)
  9. G. Palm, Local learning rules and sparse coding in neural networks, in: Advanced Neural Computers, Ed. R. Eckmiller, Elsevier Science Publishers B.V. (North Holland) (1990)
  10. D. J. Willshaw, O. P. Buneman, H. C. Longuet-Higgins, Non-holographic associative memory, Nature (London) 222, 960–962 (1969)
  11. G. Palm, On associative memory, Biol. Cybern. 36, 19–31 (1980)
  12. S. I. Amari, Statistical neurodynamics of associative memory, Neural Networks 1, 63–73 (1989)
  13. D. J. Amit, H. Gutfreund, H. Sompolinsky, Statistical mechanics of neural networks near saturation, Ann. Phys. 173, 30–67 (1987)
  14. E. Gardner, H. Gutfreund, I. Yekutieli, The phase space of interactions in neural networks with definite symmetry, J. Phys. A 22, 1995–2008 (1989)
  15. H. Horner, Neural networks with low levels of activity: Ising vs. McCulloch-Pitts neurons, Z. Phys. B 75, 133–136 (1989)
  16. G. Palm, Information capacities of simple storage and retrieval procedures in neural networks, submitted to J. Stat. Phys. (1991)
  17. G. Palm, On the information storage capacity of local learning rules, to appear in Neural Computation (1991)
  18. F. Schwenker, F. Sommer, G. Palm, Simulations of recurrent networks of threshold neurons with sparse activity, in preparation

Copyright information

© Springer-Verlag London Limited 1992

Authors and Affiliations

  • Günther Palm (1)
  • Friedrich T. Sommer (1)

  1. C. u. O. Vogt Institut für Hirnforschung, University of Düsseldorf, Düsseldorf 1, Germany