A layered recurrent neural network for feature grouping
We describe a recurrent network, the Competitive Layer Model (CLM), for feature grouping. The model uses a combination of cooperative and competitive interactions to partition a set of input features into salient groups, whose number is restricted only by the number of available layers. We give analytic results on the convergence and attractor states of the model, and present simulation results showing grouping by proximity as well as grouping by symmetry and good continuation.
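The interplay of cooperative and competitive interactions described in the abstract can be illustrated with a small simulation. The sketch below is not the paper's exact formulation: it assumes linear-threshold rate dynamics in which features compete across layers for a target column activity while a lateral interaction matrix provides within-layer support; all parameter names and values (`J`, `h`, the cluster sizes, the bias used for a reproducible outcome) are illustrative choices, not taken from the paper.

```python
import numpy as np

def clm_step(x, f, J=3.0, h=1.0, dt=0.05):
    """One Euler step of simplified CLM-style linear-threshold dynamics.

    x: (N, L) non-negative activities (feature r, layer alpha)
    f: (N, N) lateral interaction (cooperative > 0, competitive < 0)
    J: strength of the columnar (across-layer) competition
    h: target total activity of each feature column
    """
    col = x.sum(axis=1, keepdims=True)             # total activity per feature across layers
    drive = J * (h - col) + f @ x                  # column competition + within-layer support
    return x + dt * (-x + np.maximum(0.0, drive))  # leaky rate dynamics with threshold

# Demo: 6 features forming two proximity clusters {0,1,2} and {3,4,5}.
# Cooperative interaction inside each cluster, mild competition between clusters.
N, L = 6, 2
f = -0.5 * np.ones((N, N))
f[:3, :3] = 1.0
f[3:, 3:] = 1.0
np.fill_diagonal(f, 0.0)       # no self-excitation

# Small deterministic bias so the outcome is reproducible:
# the first cluster starts slightly stronger in layer 0, the second in layer 1.
x = 0.1 * np.ones((N, L))
x[:3, 0] += 0.05
x[3:, 1] += 0.05

for _ in range(3000):
    x = clm_step(x, f)

winners = x.argmax(axis=1)     # layer assignment of each feature
```

After convergence each feature is active in exactly one layer, features of the same cluster share a layer, and the two clusters occupy different layers, which is the grouping behaviour the abstract describes: the number of recoverable groups is bounded by the number of layers `L`.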
- In: Artificial Neural Networks — ICANN'97. 7th International Conference, Lausanne, Switzerland, October 8–10, 1997, Proceedings. Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp 439–444.