
Spike time displacement-based error backpropagation in convolutional spiking neural networks

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

In this paper, we introduce a supervised learning algorithm for training deep convolutional spiking neural networks (CSNNs) with single-spike-based temporal coding that avoids backward recursive gradient computation. The algorithm employs a linear approximation to compute the derivative of the spike latency with respect to the membrane potential, and it uses spiking neurons with a piecewise-linear postsynaptic potential to reduce the computational cost and the complexity of neural processing. To evaluate the performance of the proposed algorithm in deep architectures, we apply it to convolutional SNNs on the image classification task. On the two popular benchmarks MNIST and Fashion-MNIST, the network reaches accuracies of 99.2% and 92.8%, respectively. The trade-off between memory storage, computational cost, and accuracy is analyzed by maintaining two sets of weights: real-valued weights, which are updated in the backward pass, and their signs, binary weights, which are used in the feedforward process. We evaluate the binary CSNN on MNIST and Fashion-MNIST and obtain acceptable performance with a negligible accuracy drop relative to real-valued weights (about 0.6% and 0.8% drops, respectively).
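The two ingredients highlighted in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's exact implementation: the ramp kernel, the time constant `tau`, and the function names are assumptions chosen for clarity. It shows (a) a piecewise-linear postsynaptic potential, which is cheap to evaluate compared with exponential kernels, and (b) the binary-weight scheme, where only the signs of the real-valued weights enter the forward pass while the real values are retained for the backward update.

```python
import numpy as np

def piecewise_linear_psp(t, t_spike, tau=10.0):
    """Illustrative piecewise-linear PSP kernel (a sketch, not the
    paper's exact kernel): zero before the presynaptic spike at
    t_spike, then a linear ramp that saturates at 1 after tau ms."""
    dt = np.asarray(t, dtype=float) - t_spike
    return np.where(dt < 0.0, 0.0, np.minimum(dt / tau, 1.0))

def binarize_forward(real_weights):
    """Binary forward pass: only the signs of the real-valued weights
    are used in the feedforward computation; the real-valued weights
    themselves are what the backward pass updates."""
    return np.sign(real_weights)

# Membrane potential as a weighted sum of PSPs from presynaptic spikes,
# using the binarized weights in the forward direction.
def membrane_potential(t, spike_times, real_weights, tau=10.0):
    w_bin = binarize_forward(real_weights)
    return float(np.sum(w_bin * piecewise_linear_psp(t, spike_times, tau)))
```

Because the PSP is piecewise linear, its derivative with respect to time is piecewise constant, which is what makes the linear approximation of the latency gradient inexpensive.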


Data Availability

The codes generated during the current study are available in the mmirsadeghi repository, at https://github.com/mmirsadeghi/StiDi-BP.


Author information

Corresponding author

Correspondence to Majid Shalchian.

Ethics declarations

Conflicts of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Mirsadeghi, M., Shalchian, M., Kheradpisheh, S.R. et al. Spike time displacement-based error backpropagation in convolutional spiking neural networks. Neural Comput & Applic 35, 15891–15906 (2023). https://doi.org/10.1007/s00521-023-08567-0

