Journal of Computational Neuroscience, Volume 30, Issue 3, pp 699–720

Capacity analysis in multi-state synaptic models: a retrieval probability perspective


DOI: 10.1007/s10827-010-0287-7

Cite this article as:
Huang, Y. & Amit, Y. J Comput Neurosci (2011) 30: 699. doi:10.1007/s10827-010-0287-7


We define the memory capacity of networks of binary neurons with finite-state synapses in terms of retrieval probabilities of learned patterns under standard asynchronous dynamics with a predetermined threshold. The threshold is set to control the proportion of non-selective neurons that fire. An optimal inhibition level is chosen to stabilize network behavior. For any local learning rule we provide a computationally efficient and highly accurate approximation to the retrieval probability of a pattern as a function of its age. The method is applied to the sequential models (Fusi and Abbott, Nat Neurosci 10:485–493, 2007) and meta-plasticity models (Fusi et al., Neuron 45(4):599–611, 2005; Leibold and Kempter, Cereb Cortex 18:67–77, 2008). We show that as the number of synaptic states increases, the capacity, as defined here, either plateaus or decreases. In the few cases where multi-state models exceed the capacity of binary synapse models the improvement is small.
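To illustrate the core intuition behind the abstract — that in finite-state synapse models the trace of a stored pattern decays as subsequent memories overwrite it — the following is a minimal sketch, not the paper's actual approximation method. It models a sequential (chain) synapse as a birth-death Markov chain and computes the expected-efficacy difference between synapses potentiated versus depressed at encoding, as a function of the pattern's age. All parameter values (`n_states`, `q`, `f`) are hypothetical.

```python
import numpy as np

def step_matrix(n, p_up, p_dn):
    """Birth-death transition matrix on states 0..n-1:
    step up w.p. p_up, step down w.p. p_dn, otherwise stay."""
    M = np.zeros((n, n))
    for s in range(n):
        up = p_up if s < n - 1 else 0.0
        dn = p_dn if s > 0 else 0.0
        if up:
            M[s, s + 1] = up
        if dn:
            M[s, s - 1] = dn
        M[s, s] = 1.0 - up - dn
    return M

def memory_signal(n_states=4, q=0.5, f=0.5, max_age=50):
    """Expected-efficacy difference between synapses that were
    potentiated vs. depressed when a pattern was stored, after
    `age` later (random) memories. Hypothetical toy parameters."""
    # ongoing plasticity: later memories potentiate w.p. f, depress w.p. 1-f
    M = step_matrix(n_states, f * q, (1 - f) * q)
    P = step_matrix(n_states, q, 0.0)   # encoding event: potentiation
    D = step_matrix(n_states, 0.0, q)   # encoding event: depression
    # stationary distribution: left eigenvector of M for eigenvalue 1
    w, v = np.linalg.eig(M.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()
    efficacy = np.linspace(0.0, 1.0, n_states)  # state -> synaptic weight
    diff = pi @ P - pi @ D                      # imprint of the tracked pattern
    return np.array([diff @ np.linalg.matrix_power(M, a) @ efficacy
                     for a in range(max_age + 1)])
```

Running `memory_signal()` yields a positive signal at age 0 that relaxes toward zero as more memories are stored — the forgetting curve whose crossing with the retrieval threshold sets the capacity in analyses of this kind.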


Keywords: Hebbian learning · Network dynamics · Inhibition · Sparse coding

Supplementary material

10827_2010_287_MOESM1_ESM.pdf (PDF 132 kb)

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Department of Statistics, University of Chicago, Chicago, USA
  2. Departments of Statistics and Computer Science, University of Chicago, Chicago, USA
