
CIRM-SNN: Certainty Interval Reset Mechanism Spiking Neuron for Enabling High Accuracy Spiking Neural Network

Published in: Neural Processing Letters

Abstract

Spiking neural networks (SNNs), which process information sparsely and in an event-driven manner, offer ultra-low power consumption and hardware friendliness, and have attracted wide attention as a new generation of neural networks. At present, the most effective way to obtain a deep SNN is to convert a trained artificial neural network (ANN); however, the converted SNN suffers a performance loss relative to the original ANN. This paper adjusts the spike firing rates of spiking neurons to minimize that conversion loss. We map the ANN weights to the corresponding SNN after continuous normalization, which keeps neuron firing rates within a normal range, and we propose a certainty interval reset mechanism (CIRM) that reduces the loss of membrane potential and avoids neuron over-activation. In our experiments, a modulation factor added to the CIRM further tunes the firing rates. The accuracy of the converted SNN on CIFAR-10 is 1.026% higher than that of the original ANN, so the algorithm not only achieves lossless conversion but also reduces network energy consumption. The algorithm likewise improves the accuracy of an SNN (VGG-15) on CIFAR-100 and decreases network latency. This work is of practical significance for developing high-accuracy deep SNNs.
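The abstract attributes the conversion loss partly to membrane potential discarded at reset. The paper's exact CIRM update rule is not given on this page; the following is only a minimal illustrative sketch (function names are hypothetical) contrasting a hard reset (reset-to-zero), which discards residual membrane potential, with reset-by-subtraction, which preserves it — the kind of membrane-potential loss a reset mechanism like CIRM aims to reduce.

```python
def if_neuron_hard_reset(input_current, threshold=1.0, steps=100):
    """Integrate-and-fire neuron with reset-to-zero: on a spike the
    membrane potential is cleared, discarding any residual charge.
    Returns the firing rate over the simulation window."""
    v = 0.0
    n_spikes = 0
    for _ in range(steps):
        v += input_current            # integrate the input
        if v >= threshold:
            n_spikes += 1
            v = 0.0                   # hard reset: residual potential lost
    return n_spikes / steps

def if_neuron_soft_reset(input_current, threshold=1.0, steps=100):
    """Integrate-and-fire neuron with reset-by-subtraction: on a spike
    only the threshold is subtracted, so residual potential carries over
    and the firing rate tracks the input more faithfully."""
    v = 0.0
    n_spikes = 0
    for _ in range(steps):
        v += input_current
        if v >= threshold:
            n_spikes += 1
            v -= threshold            # soft reset: keep the residual
    return n_spikes / steps

# With a constant input of 0.3 and threshold 1.0, reset-by-subtraction
# yields a firing rate near 0.3, while reset-to-zero undershoots (0.25),
# illustrating the information loss that motivates better reset schemes.
hard_rate = if_neuron_hard_reset(0.3)
soft_rate = if_neuron_soft_reset(0.3)
```

Under reset-to-zero the neuron spikes every fourth step (cumulative input 1.2 exceeds the threshold, then everything is cleared), so the rate saturates at 0.25 instead of the ideal 0.3 that reset-by-subtraction approaches.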



Acknowledgements

This work is supported by the National Natural Science Foundation of China (Grant No. 61871106) and the Open Project Program Foundation of the Key Laboratory of Opto-Electronics Information Processing, Chinese Academy of Sciences (OEIP-O-202002).

Author information

Authors and Affiliations

Authors

Contributions

Li-Ye Niu wrote the main manuscript text and prepared the figures. Ying Wei supervised the writing and the organization of the manuscript. All authors reviewed the manuscript.

Corresponding author

Correspondence to Ying Wei.

Ethics declarations

Competing interests

The authors declare no competing interests.


Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (DOCX 64 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Niu, LY., Wei, Y. CIRM-SNN: Certainty Interval Reset Mechanism Spiking Neuron for Enabling High Accuracy Spiking Neural Network. Neural Process Lett 55, 7561–7582 (2023). https://doi.org/10.1007/s11063-023-11274-5
