Abstract
We present a biologically inspired architecture for incremental learning that remains resource-efficient even in the face of very high data dimensionalities (>1000) that are typically associated with perceptual problems. In particular, we investigate how a new perceptual (object) class can be added to a trained architecture without retraining, while avoiding the well-known catastrophic forgetting effects that usually accompany such scenarios. At the heart of the presented architecture lies a generative, self-organized description of the perceptual space that at the same time approximates the neighborhood relations of this space on a two-dimensional plane. This approximation, which closely imitates the topographic organization of the visual cortex, allows an efficient local update rule for incremental learning even in the face of very high dimensionalities, which we demonstrate by tests on the well-known MNIST benchmark. We complement the model with a biologically plausible short-term memory system, allowing it to retain excellent classification accuracy even while incremental learning is in progress. The short-term memory is additionally used to reinforce new data statistics by replaying previously stored samples during dedicated “sleep” phases.
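The local update rule the abstract refers to can be sketched as a standard self-organizing map (SOM) step in the style of Kohonen: each input is matched against prototype vectors arranged on a 2D grid, and only prototypes near the best-matching unit on the grid are adapted. The sketch below is a minimal illustration under assumed choices (function name, learning rate, Gaussian neighborhood, cutoff threshold, map size), not the authors' exact implementation:

```python
import numpy as np

def som_local_update(weights, grid, x, lr=0.1, sigma=1.0):
    """One incremental SOM step: find the best-matching unit (BMU)
    and adapt only units in its 2D grid neighborhood.

    weights: (N, D) prototype vectors; grid: (N, 2) unit coordinates
    on the 2D map; x: (D,) input sample.
    """
    # BMU = prototype closest to the input in the data space
    bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    # Gaussian neighborhood measured on the 2D map,
    # not in the D-dimensional data space
    d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2.0 * sigma ** 2))
    # Restrict the update to a local window: distant units stay
    # untouched, which keeps the step cheap even for large D
    mask = h > 1e-3
    weights[mask] += lr * h[mask, None] * (x - weights[mask])
    return bmu

# Small 5x5 map for 784-dimensional (MNIST-sized) inputs
rng = np.random.default_rng(0)
side, dim = 5, 784
grid = np.array([[i, j] for i in range(side) for j in range(side)], float)
weights = rng.standard_normal((side * side, dim)) * 0.1
x = rng.standard_normal(dim)
bmu = som_local_update(weights, grid, x)
```

Because the neighborhood is defined on the 2D map rather than in the input space, the cost of one update is linear in the input dimensionality and touches only a constant number of units, which is what makes the rule attractive for incremental learning on high-dimensional perceptual data.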
References
Bordes A, Bottou L. The Huller: a simple and efficient online SVM. In: Proceedings of the 16th European conference on machine learning (ECML). 2005.
Syed A, Liu H, Sung KK. Incremental learning with support vector machines. 1999.
Kulkarni P, Ade R. Incremental learning from unbalanced data with concept class, concept drift and missing features: a review. Int J Data Min Knowl Manag Process. 2014;4(6):15–29.
Goodfellow I, Mirza M, Xiao D, Courville A, Bengio Y. An empirical investigation of catastrophic forgetting in gradient-based neural networks. In: ICLR 2014. 2014.
Vijayakumar S, Schaal S. Locally weighted projection regression: an O(n) algorithm for incremental real time learning in high-dimensional spaces. In: International conference on machine learning. 2000.
Nguyen-Tuong D, Peters J. Local Gaussian processes regression for real-time model-based robot control. In: IEEE/RSJ international conference on intelligent robot systems. 2008.
Sigaud O, Salaun C, Padois V. On-line regression algorithms for learning mechanical models of robots: a survey. Robot Auton Syst. 2011;59(12):1115–29.
Butz M, Goldberg D, Lanzi P. Computational complexity of the XCS classifier system. Found Learn Classif Syst. 2005;51:91–125.
Cederborg T, Li M, Baranes A, Oudeyer P-Y. Incremental local online Gaussian mixture regression for imitation learning of multiple tasks. In: IEEE/RSJ international conference on intelligent robots and systems. 2010.
Tanaka K. Inferotemporal cortex and object vision. Annu Rev Neurosci. 1996;19(1):109–39.
Leopold DA, Bondar IV, Giese MA. Norm-based face encoding by single neurons in the monkey inferotemporal cortex. Nature. 2006;442(7102):572–5.
Ross DA, Deroche M, Palmeri TJ. Not just the norm: exemplar-based models also predict face aftereffects. Psychon Bull Rev. 2014;21(1):47–70.
Erickson CA, Jagadeesh B, Desimone R. Clustering of perirhinal neurons with similar properties following visual experience in adult monkeys. Nat Neurosci. 2000;3(11):1143–8.
Polley DB, Steinberg EE, Merzenich MM. Perceptual learning directs auditory cortical map reorganization through top-down influences. J Neurosci. 2006;26(18):4970–82.
Weinberger NM. The nucleus basalis and memory codes: auditory cortical plasticity and the induction of specific, associative behavioral memory. Neurobiol Learn Mem. 2003;80(3):268–84 (special issue: Acetylcholine: Cognitive and Brain Functions).
Hasselmo ME. The role of acetylcholine in learning and memory. Curr Opin Neurobiol. 2006;16(6):710–5.
Rolls ET, Baylis GC, Hasselmo ME, Nalwa V. The effect of learning on the face selective responses of neurons in the cortex in the superior temporal sulcus of the monkey. Exp Brain Res. 1989;76(1):153–64.
Bishop CM. Pattern recognition and machine learning. New York: Springer; 2006.
O’Reilly RC. The division of labor between the neocortex and hippocampus. In: Connectionist models in cognitive psychology. 2004. p. 143–174.
McClelland JL, McNaughton BL, O’Reilly RC. Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory. Psychol Rev. 1995;102:419–57.
Kohonen T. Self-organized formation of topologically correct feature maps. Biol Cybernet. 1982;43:59–69.
Shen B, McNaughton BL. Modeling the spontaneous reactivation of experience-specific hippocampal cell assemblies during sleep. Hippocampus. 1996;6(6):685–92.
LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. In: Intelligent signal processing, IEEE Press; 2001. p. 306–351.
Gepperth A, Lefort M. Biologically inspired incremental learning for high-dimensional spaces. In: IEEE international conference on development and learning (ICDL). 2015.
Vijayakumar S, Klanke S, Schaal S. A library for locally weighted projection regression. J Mach Learn Res (JMLR). 2008;9:623–6.
Ethics declarations
Conflict of Interest
Alexander Gepperth and Cem Karaoguz declare that they have no conflict of interest.
Informed Consent
All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2008 (5). Additional informed consent was obtained from all patients for which identifying information is included in this article.
Human and Animal Rights
This article does not contain any studies with human participants or animals performed by any of the authors.
Cite this article
Gepperth, A., Karaoguz, C. A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems. Cogn Comput 8, 924–934 (2016). https://doi.org/10.1007/s12559-016-9389-5