Information and pattern capacities in neural associative memories with feedback for sparse memory patterns
How should the performance of associative memories be judged in applications? Using information theory, we examine the static structure of memory states and spurious states of a recurrent associative memory after learning. In this framework we consider the critical pattern capacity often used in the literature and introduce the information capacity as a more relevant performance measure for pattern completion. For two types of local learning rule, the Hebb rule and the clipped Hebb rule, our method yields new asymptotic estimates of the information capacity.
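To make the setting concrete, the following is a minimal sketch of pattern completion in a binary associative memory trained with the clipped Hebb rule (a Willshaw-type network). The parameters n, m, k and the one-step threshold retrieval are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions): n neurons, m stored patterns,
# k active units per pattern (sparse coding)
n, m, k = 100, 20, 5

# Generate m sparse binary patterns, each with exactly k ones
patterns = np.zeros((m, n), dtype=int)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Clipped Hebb rule: a synapse is 1 iff its pre- and postsynaptic units
# were simultaneously active in at least one stored pattern
W = np.clip(patterns.T @ patterns, 0, 1)

# Pattern completion: present a partial cue with one active unit deleted
target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[0]] = 0

# One retrieval step; the firing threshold equals the number of ones
# in the cue, so every unit of the stored pattern is reactivated
dendritic = cue @ W
recalled = (dendritic >= cue.sum()).astype(int)

print("overlap with target:", int(recalled @ target))
```

With sparse patterns the clipped weight matrix stays sparse as well, which is why this storage scheme reaches a high information capacity; dense patterns would quickly saturate the binary weights and produce spurious recall.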