Cognitive Computation

Volume 8, Issue 5, pp 924–934

A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems


Abstract

We present a biologically inspired architecture for incremental learning that remains resource-efficient even in the face of the very high data dimensionalities (>1000) typically associated with perceptual problems. In particular, we investigate how a new perceptual (object) class can be added to a trained architecture without retraining, while avoiding the well-known catastrophic forgetting effects typically associated with such scenarios. At the heart of the presented architecture lies a generative description of the perceptual space by a self-organized approach which at the same time approximates the neighborhood relations of this space on a two-dimensional plane. This approximation, which closely imitates the topographic organization of the visual cortex, allows an efficient local update rule for incremental learning even in the face of very high dimensionalities, which we demonstrate by tests on the well-known MNIST benchmark. We complement the model with a biologically plausible short-term memory system, allowing it to retain excellent classification accuracy even while incremental learning is in progress. The short-term memory is additionally used to reinforce new data statistics by replaying previously stored samples during dedicated “sleep” phases.
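The abstract refers to an efficient local update rule on a two-dimensional map but does not spell it out. As an illustrative sketch only (the function name, map size, and all parameters below are assumptions, not the authors' implementation), a Kohonen-style self-organizing map update with a truncated Gaussian neighborhood modifies only units near the best-matching unit, which keeps the per-sample cost local even for MNIST-sized (784-dimensional) inputs:

```python
import numpy as np

def som_local_update(weights, x, lr=0.1, sigma=1.0):
    """One Kohonen-style SOM update step with a truncated Gaussian
    neighborhood: only units within 3*sigma (grid distance) of the
    best-matching unit (BMU) are modified."""
    H, W, D = weights.shape
    # Best-matching unit: minimal Euclidean distance to the input x.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), (H, W))
    # Gaussian neighborhood on the 2D grid, truncated at 3*sigma.
    ys, xs = np.mgrid[0:H, 0:W]
    d2 = (ys - bmu[0]) ** 2 + (xs - bmu[1]) ** 2
    h = np.exp(-d2 / (2.0 * sigma ** 2))
    h[d2 > (3.0 * sigma) ** 2] = 0.0  # far-away units stay untouched
    # Move the weights of nearby units toward the input.
    weights += lr * h[..., None] * (x - weights)
    return bmu

# Illustrative usage: an 8x8 map over 784-dimensional (MNIST-sized) data.
rng = np.random.default_rng(0)
w = rng.random((8, 8, 784))
w0 = w.copy()                 # snapshot to show locality of the update
x = rng.random(784)
bmu = som_local_update(w, x)  # updates only units near the BMU
```

The truncation is what makes the rule "local": the cost of one update scales with the neighborhood size and the input dimension, independent of how many map units exist outside the neighborhood.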

Keywords

Perceptual learning · Self-organization · Incremental learning · Biological modeling

Notes

Compliance with Ethical Standards

Conflict of Interest

Alexander Gepperth and Cem Karaoguz declare that they have no conflict of interest.

Informed Consent

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5). Additional informed consent was obtained from all patients for whom identifying information is included in this article.

Human and Animal Rights

This article does not contain any studies with human participants or animals performed by any of the authors.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. ENSTA ParisTech, U2IS Lab, University of Paris-Saclay, Palaiseau, France
  2. INRIA FLOWERS, Talence, France
