
A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems

Published in: Cognitive Computation

Abstract

We present a biologically inspired architecture for incremental learning that remains resource-efficient even at the very high data dimensionalities (>1000) typically associated with perceptual problems. In particular, we investigate how a new perceptual (object) class can be added to a trained architecture without retraining, while avoiding the well-known catastrophic forgetting effects typically associated with such scenarios. At the heart of the presented architecture lies a generative description of the perceptual space by a self-organized approach that simultaneously approximates the neighborhood relations of this space on a two-dimensional plane. This approximation, which closely imitates the topographic organization of the visual cortex, admits an efficient local update rule for incremental learning even at very high dimensionalities, as we demonstrate on the well-known MNIST benchmark. We complement the model with a biologically plausible short-term memory system, allowing it to retain excellent classification accuracy even while incremental learning is in progress. The short-term memory is additionally used to reinforce new data statistics by replaying previously stored samples during dedicated "sleep" phases.
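The two core ingredients named in the abstract — a topographically organized map whose prototypes are adapted by a neighborhood-limited (local) update, and a short-term memory buffer replayed during "sleep" phases to protect old classes — can be sketched as follows. This is an illustrative reconstruction under the assumption of a Kohonen-style self-organizing map, not the authors' actual implementation; all class, function, and parameter names here are hypothetical.

```python
import numpy as np

class SOM:
    """Minimal self-organizing map with a neighborhood-limited local update.

    Illustrative sketch only: the paper's architecture adds a generative
    read-out and a short-term memory on top of such a map.
    """

    def __init__(self, grid=(10, 10), dim=784, sigma=1.5, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.random((grid[0] * grid[1], dim))
        # 2-D grid coordinate of each unit, used for neighborhood distances
        self.coords = np.array(
            [(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float
        )
        self.sigma, self.lr = sigma, lr

    def bmu(self, x):
        # best-matching unit: nearest prototype in the input space
        return int(np.argmin(np.sum((self.weights - x) ** 2, axis=1)))

    def update(self, x):
        # local update: only units near the BMU on the 2-D grid move
        # appreciably, so the per-sample cost stays low even when the
        # input dimensionality is very high
        b = self.bmu(x)
        d2 = np.sum((self.coords - self.coords[b]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * self.sigma ** 2))  # Gaussian neighborhood
        self.weights += self.lr * h[:, None] * (x - self.weights)
        return b

def sleep_phase(som, buffer, new_samples, epochs=1):
    """'Sleep' phase: interleave replayed samples from the short-term buffer
    with samples of the new class, so the map absorbs the new statistics
    without catastrophically forgetting the old ones."""
    for _ in range(epochs):
        mixed = list(buffer) + list(new_samples)
        np.random.shuffle(mixed)
        for x in mixed:
            som.update(x)
```

Because the neighborhood function decays with grid distance from the best-matching unit, each sample effectively touches only a small patch of the map — this is the locality property the abstract credits for resource efficiency at dimensionalities above 1000.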



Author information

Corresponding author

Correspondence to Alexander Gepperth.

Ethics declarations

Conflict of Interest

Alexander Gepperth and Cem Karaoguz declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5). Additional informed consent was obtained from all patients for which identifying information is included in this article.

Human and Animal Rights

This article does not contain any studies with human participants or animals performed by any of the authors.


About this article


Cite this article

Gepperth, A., Karaoguz, C. A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems. Cogn Comput 8, 924–934 (2016). https://doi.org/10.1007/s12559-016-9389-5

