Abstract
In this paper, we introduce a supervised learning algorithm for training deep convolutional spiking neural networks (SNNs) with single-spike-based temporal coding that avoids backward recursive gradient computation. The algorithm uses a linear approximation to compute the derivative of the spike latency with respect to the membrane potential, and it employs spiking neurons with a piecewise linear postsynaptic potential to reduce both the computational cost and the complexity of neural processing. To evaluate the proposed algorithm in deep architectures, we apply it to convolutional SNNs (CSNNs) for image classification. On the two popular benchmarks MNIST and Fashion-MNIST, the network reaches accuracies of 99.2% and 92.8%, respectively. We analyze the trade-off of memory storage capacity and computational cost against accuracy by maintaining two sets of weights: real-valued weights that are updated in the backward pass, and their signs (binary weights) that are used in the feedforward pass. Evaluated on MNIST and Fashion-MNIST, the binary CSNN achieves acceptable performance with only a negligible accuracy drop relative to real-valued weights (about 0.6% and 0.8%, respectively).
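The binary-weight scheme described above can be sketched as follows. This is a minimal, illustrative toy (a single linear layer with hypothetical names, not the authors' implementation): real-valued weights are stored and updated in the backward pass, while only their signs, constrained to {-1, +1}, drive the feedforward computation.

```python
import random

random.seed(0)
n_in, n_out = 4, 3
# Real-valued weights: these are what the backward pass updates.
W_real = [[random.gauss(0.0, 0.1) for _ in range(n_out)] for _ in range(n_in)]

def binarize(W):
    # Sign of each real-valued weight; zero is mapped to +1 so that
    # every binary weight lies in {-1, +1}.
    return [[1.0 if w >= 0 else -1.0 for w in row] for row in W]

def forward(x, W):
    # Only the binary weights are used in the feedforward pass.
    W_bin = binarize(W)
    return [sum(x[i] * W_bin[i][j] for i in range(len(x)))
            for j in range(len(W_bin[0]))]

def update(x, grad_out, W, lr=0.01):
    # Gradients computed for the forward computation are applied to the
    # stored real-valued weights (straight-through-style update).
    for i in range(len(W)):
        for j in range(len(W[0])):
            W[i][j] -= lr * x[i] * grad_out[j]

x = [0.5, -1.0, 0.25, 0.0]
y = forward(x, W_real)          # inference with binary weights
update(x, [1.0] * n_out, W_real)  # learning on real-valued weights
```

Storing real-valued weights only during training means the deployed network needs just one bit per weight, which is the memory/accuracy trade-off the abstract refers to.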
Data Availability
The code generated during the current study is available in the mmirsadeghi/StiDi-BP repository at https://github.com/mmirsadeghi/StiDi-BP.
Ethics declarations
Conflicts of interest
The authors of this paper have no conflict of interest.
Cite this article
Mirsadeghi, M., Shalchian, M., Kheradpisheh, S.R. et al. Spike time displacement-based error backpropagation in convolutional spiking neural networks. Neural Comput & Applic 35, 15891–15906 (2023). https://doi.org/10.1007/s00521-023-08567-0