A Bio-Inspired Incremental Learning Architecture for Applied Perceptual Problems
We present a biologically inspired architecture for incremental learning that remains resource-efficient even at the very high data dimensionalities (>1000) typical of perceptual problems. In particular, we investigate how a new perceptual (object) class can be added to a trained architecture without retraining, while avoiding the well-known catastrophic forgetting effects typically associated with such scenarios. At the heart of the presented architecture lies a generative description of the perceptual space by a self-organized approach which at the same time approximates the neighborhood relations of this space on a two-dimensional plane. This approximation, which closely imitates the topographic organization of the visual cortex, allows an efficient local update rule for incremental learning even at very high dimensionalities, as we demonstrate on the well-known MNIST benchmark. We complement the model with a biologically plausible short-term memory system, allowing it to retain excellent classification accuracy even while incremental learning is in progress. The short-term memory is additionally used to reinforce new data statistics by replaying previously stored samples during dedicated “sleep” phases.
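The two mechanisms described above, a topographically organized map with a local update rule and a short-term memory that is replayed during “sleep” phases, can be sketched as follows. This is a minimal illustration under our own assumptions (plain self-organizing-map dynamics and a bounded FIFO sample buffer), not the authors' implementation; all class, method, and parameter names are hypothetical.

```python
import random
from collections import deque

import numpy as np


class IncrementalSOM:
    """Minimal sketch of a 2-D self-organizing map with a *local* update
    rule: only units in a small Gaussian neighborhood of the best-matching
    unit are touched, keeping per-sample cost low even for high-dimensional
    inputs. Names and defaults are illustrative, not taken from the paper."""

    def __init__(self, grid=(10, 10), dim=784, sigma=1.5, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.random((grid[0] * grid[1], dim))  # prototype vectors
        # 2-D coordinates of each unit on the map lattice
        self.coords = np.array(
            [(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float
        )
        self.sigma, self.lr = sigma, lr

    def bmu(self, x):
        """Index of the best-matching unit (closest prototype) for input x."""
        return int(np.argmin(np.sum((self.w - x) ** 2, axis=1)))

    def update(self, x):
        """Move only the BMU and its nearby map neighbors toward x."""
        b = self.bmu(x)
        d2 = np.sum((self.coords - self.coords[b]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * self.sigma ** 2))  # neighborhood kernel
        near = h > 1e-3                            # restrict the update locally
        self.w[near] += self.lr * h[near, None] * (x - self.w[near])
        return b


class ShortTermMemory:
    """Hypothetical bounded sample buffer; during a dedicated 'sleep' phase
    its contents are replayed to the map to consolidate new data statistics."""

    def __init__(self, capacity=500):
        self.buf = deque(maxlen=capacity)

    def store(self, x):
        self.buf.append(x)

    def sleep(self, som, epochs=3, seed=0):
        rng = random.Random(seed)
        for _ in range(epochs):
            samples = list(self.buf)
            rng.shuffle(samples)
            for x in samples:
                som.update(x)
```

In this sketch, a new class could be added by storing its samples in the short-term memory and calling `sleep` to replay them; because each `update` only modifies a small neighborhood of prototypes, previously learned regions of the map are largely left intact.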
Keywords: Perceptual learning · Self-organization · Incremental learning · Biological modeling
Compliance with Ethical Standards
Conflict of Interest
Alexander Gepperth and Cem Karagouz declare that they have no conflict of interest.
Human and Animal Rights
This article does not contain any studies with human participants or animals performed by any of the authors.