Journal of Statistical Physics, Volume 1, Issue 2, pp 319–350

On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks

  • Stephen Grossberg

Abstract

Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells 𝒜 can sample a pattern playing on any finite number of cells ℬ without causing irrevocable sampling bias if 𝒜 = ℬ or 𝒜 ∩ ℬ = ∅. Total energy transfer from inputs of 𝒜 to outputs of ℬ depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory even if 𝒜 = ℬ by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.
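
The abstract refers to a class of nonlinear functional-differential systems without reproducing them on this page. As a hedged illustration only (the decay rates α, γ, the coupling constants β, δ, the signal threshold Γ, and the transmission delay τ are assumptions made for this sketch, not the paper's exact equations), a nonrecurrent signed network in which the cells of 𝒜 sample a pattern playing on the cells of ℬ can be written in the form

\[
\dot{x}_i(t) = -\alpha\, x_i(t) + \beta \sum_{j \in \mathcal{A}} \bigl[x_j(t-\tau) - \Gamma\bigr]^{+}\, z_{ji}(t) + I_i(t), \qquad i \in \mathcal{B},
\]
\[
\dot{z}_{ji}(t) = -\gamma\, z_{ji}(t) + \delta\, \bigl[x_j(t-\tau) - \Gamma\bigr]^{+}\, x_i(t), \qquad j \in \mathcal{A},\ i \in \mathcal{B},
\]

where x_i(t) is the membrane potential of cell i, z_{ji}(t) is the memory trace of the synapse from cell j to cell i, [ξ]⁺ = max(ξ, 0) converts a suprathreshold potential into a spiking frequency, and I_i(t) is the input pattern delivered to cell i. In such a sketch the threshold Γ determines how much of a sampling cell's potential is transmitted as a signal; the abstract's pattern-completion statement concerns choosing this threshold sufficiently large.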

Key words

learning; stimulus sampling; nonlinear difference-differential equations; global limits and oscillations; flows on signed networks; functional-differential systems; energy-entropy dependence; pattern completion; recurrent and nonrecurrent anatomy; sensory transduction rules; ratio limit theorems



Copyright information

© Plenum Publishing Corporation 1969

Authors and Affiliations

  • Stephen Grossberg
  1. Massachusetts Institute of Technology, Cambridge
