Pattern Learning by Functional-Differential Neural Networks with Arbitrary Path Weights
This paper proves the universal theorem on associative learning that culminates my 1967–1972 articles on this subject. The theorem is universal in the following sense. It says that if my associative learning laws were invented at a prescribed time during the evolutionary process, then they could be used to guarantee unbiased associative learning in essentially any later evolutionary specialization. That is, the laws are capable of learning arbitrary spatial patterns in arbitrarily many, simultaneously active sampling channels that are activated by arbitrary continuous data preprocessing in an essentially arbitrary anatomy. The learning of arbitrary space-time patterns is also guaranteed given modest requirements on the temporal regularity of stimulus sampling, as in avalanches and generalizations thereof.
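The kind of unbiased spatial-pattern learning the theorem guarantees can be illustrated with a minimal numerical sketch. The code below is an assumption on my part, not the paper's construction: it simulates one sampling (source) cell with adaptive path weights z_i obeying a simplified Grossberg-style learning law, dz_i/dt = x0(−γ z_i + δ x_i), while the sampled cells hold a fixed spatial pattern θ_i. The claim illustrated is that the relative path weights z_i / Σ z_j converge to the pattern proportions θ_i regardless of the arbitrary initial weights.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact equations): a single
# sampling cell x0 with path weights z_i to cells whose activities
# x_i are clamped to a spatial pattern theta_i while x0 is active.
# Assumed learning law: dz_i/dt = x0 * (-gamma * z_i + delta * x_i).

theta = np.array([0.5, 0.3, 0.2])   # spatial pattern (proportions)
z = np.array([3.0, 1.0, 0.5])       # arbitrary initial path weights
gamma, delta, dt = 0.1, 0.1, 0.01   # decay rate, gain, Euler step

for _ in range(20000):              # simulate 200 time units
    x0 = 1.0                        # sampling channel is active
    x = theta                       # sampled cells carry the pattern
    z += dt * x0 * (-gamma * z + delta * x)

ratios = z / z.sum()
# relative weights have learned the pattern: ratios is close to theta
```

At equilibrium each weight satisfies z_i = (δ/γ)θ_i, so the normalized weights reproduce the pattern proportions exactly, independently of the starting weights; this is a toy instance of the unbiased learning property the abstract describes.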
Keywords: Conditioned Stimulus · Associative Learning · Pattern Learning · Pattern Discrimination · Path Weight