Synaptogenesis: Constraining Synaptic Plasticity Based on a Distance Rule
Neural models, whether artificial or biologically grounded, have been used both for understanding the nature of learning mechanisms and for applied tasks. The study of such learning systems has typically centered on identifying or extracting the features most relevant to solving a task. Recently, convolutional networks, deep architectures and large reservoirs have shown impressive results in tasks ranging from speech recognition to visual classification and emotion perception. With the accumulated momentum of such large-scale architectures, imposing sparsity on networks in order to differentiate contexts has grown in importance. We present a biologically grounded system that imposes physical, local constraints on these architectures in the form of synaptogenesis, or synapse generation. This method guarantees sparsity and promotes the acquisition of experience-relevant, topologically organized and more diverse features.
Keywords: Machine learning · Connections · Biologically constrained
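The abstract does not specify the exact distance rule used, so the following is a minimal, hypothetical sketch of the general idea: neurons are assigned spatial positions, and a synapse between two neurons is generated with a probability that decays with their Euclidean distance. The length scale `lam`, the exponential decay form, and the 2-D unit-square layout are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_rule_mask(positions, lam=0.2):
    """Boolean connectivity mask: a synapse between neurons i and j is
    generated with probability exp(-d_ij / lam), where d_ij is their
    Euclidean distance. The exponential form is an assumption."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    p = np.exp(-d / lam)           # connection probability decays with distance
    np.fill_diagonal(p, 0.0)       # no self-connections
    return rng.random(p.shape) < p

# Example: 100 neurons placed uniformly in the unit square.
positions = rng.random((100, 2))
mask = distance_rule_mask(positions)

# Plasticity is constrained to the allowed synapses: weights exist
# (and would be updated by learning) only where the mask is True.
weights = rng.normal(0.0, 0.1, mask.shape) * mask
sparsity = 1.0 - mask.mean()       # fraction of absent connections
```

Because distant pairs are rarely connected, the resulting matrix is sparse and locally clustered, which is the property the distance rule is meant to guarantee.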