Dynamic learning — An approach to forgetting in ART2 neural networks

  • Anatoly Nachev
  • Niall Griffith
  • Alexander Gerov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1480)


In machine learning, "forgetting" little-used or redundant information can be seen as a sensible strategy for managing specific and limited computational resources. This paper describes new learning rules for the ART2 neural network model of category learning that facilitate forgetting without additional node features or subsystems, while preserving the main characteristics of the classic ART2 model. We consider this approach straightforward and arguably biologically plausible. The new learning rules drop the requirement of the classic ART2 model that learning occur only at the winning node; the classic ART2 learning rules emerge as a particular case of the new rules. The model increases system adaptability to continually changing or complex input domains. This allows the system to maintain information in a manner consistent with its use, and allows system resources to be dynamically allocated in a way consistent with observations of biological learning.
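The core idea of the abstract — learning at every category node rather than only at the winner, with the classic winner-take-all rule as a special case — can be illustrated with a minimal sketch. This is an illustrative update under assumed names (`update_weights`, `base_rate`), not the paper's actual ART2 equations: each node's prototype moves toward the input at a rate scaled by its activation, so rarely activated categories change little, and a one-hot activation vector recovers learning at the winner alone.

```python
import numpy as np

def update_weights(W, x, activations, base_rate=0.1):
    """Illustrative distributed ART2-style update (hypothetical form).

    W           : (n_nodes, n_features) prototype weight matrix
    x           : input pattern
    activations : per-node activation in [0, 1]; a one-hot vector
                  reduces this to the classic winner-only ART2 rule.
    """
    W = W.copy()
    for j, a in enumerate(activations):
        rate = base_rate * a            # little-used nodes barely move
        W[j] += rate * (x - W[j])       # move prototype toward the input
    return W

# With a one-hot activation vector, only the winning node's
# prototype changes, mirroring the classic ART2 restriction.
W = np.array([[1.0, 0.0],
              [0.0, 1.0]])
x = np.array([1.0, 1.0])
W_new = update_weights(W, x, activations=[1.0, 0.0])
```

Under graded (non-one-hot) activations, every prototype drifts toward recently seen inputs in proportion to its use, which is one way to realize resource reallocation without extra node features or subsystems.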





Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • Anatoly Nachev, Shoumen University, Shoumen, Bulgaria
  • Niall Griffith, University of Limerick, Limerick, Ireland
  • Alexander Gerov, IMI, BAS, Sofia, Bulgaria
