
Time-encoded multiplication-free spiking neural networks: application to data classification tasks

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Spiking neural networks (SNNs) are computationally powerful, biologically inspired models in which neurons communicate through sequences of spikes, regarded here as sparse binary sequences of zeros and ones. In neuroscience, it is conjectured that time encoding, where information is carried by the temporal position of spikes, plays a crucial role at least in those parts of the brain where estimating the spiking rate with a large latency is not feasible. Motivated by the efficiency of temporal coding compared with the widely used rate coding, the goal of this paper is to develop and train an energy-efficient, time-coded deep spiking neural network system. To ensure that similarity among input stimuli translates into correlation of the spike sequences, we introduce correlative temporal encoding and extended correlative temporal encoding techniques that map analog input information into input spike patterns. Importantly, we propose an implementation in which every multiplication in the system is replaced with at most a few additions. As a more efficient alternative to both rate-coded SNNs and artificial neural networks, such a system is a preferable candidate for implementation in neuromorphic hardware. We consider data classification tasks in which input spike patterns are presented to a feed-forward architecture with leaky integrate-and-fire neurons. The SNN is trained by backpropagation through time with the objective of matching the sequences of output spikes with those of specifically designed target spike patterns, each corresponding to exactly one class. During inference, the class is determined by the target spike pattern with the smallest van Rossum distance from the output spike pattern. Extensive simulations indicate that the proposed system achieves classification accuracy on par with that of state-of-the-art machine learning models.
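
To make the abstract's inference rule concrete, the following is a minimal Python sketch, not the authors' implementation: a leaky integrate-and-fire (LIF) layer driven by binary input spikes, where each weighted sum reduces to a gather-and-add because inputs are 0 or 1, and a readout that assigns the class of the target spike pattern with the smallest van Rossum distance. All shapes and parameter names (leak, threshold, tau) are assumptions for illustration; the exponential-kernel distance follows van Rossum (2001). The leak is written as a product for readability, whereas the paper's multiplication-free scheme replaces such products with at most a few additions.

# Hypothetical sketch of the two ideas summarized in the abstract; parameter
# names and shapes are illustrative, not taken from the paper.
import numpy as np

def lif_layer(spikes_in, weights, leak=0.9, threshold=1.0):
    """Run one LIF layer over T time steps.

    spikes_in: (T, n_in) binary array; weights: (n_in, n_out).
    Because inputs are 0/1, the input current at each step is the sum of
    the weight rows selected by active inputs -- additions only.
    """
    T, _ = spikes_in.shape
    n_out = weights.shape[1]
    v = np.zeros(n_out)                      # membrane potentials
    spikes_out = np.zeros((T, n_out))
    for t in range(T):
        active = spikes_in[t].astype(bool)
        current = weights[active].sum(axis=0)  # gather-and-add, no multiply
        v = leak * v + current                 # leak as a product here only;
                                               # the paper avoids such products
        fired = v >= threshold
        spikes_out[t, fired] = 1.0
        v[fired] = 0.0                         # reset after spiking
    return spikes_out

def van_rossum_distance(s1, s2, tau=5.0):
    """Van Rossum (2001) distance: filter both binary spike trains with an
    exponential kernel and take the Euclidean distance of the traces."""
    T = len(s1)
    kernel = np.exp(-np.arange(T) / tau)
    f1 = np.convolve(s1, kernel)[:T]
    f2 = np.convolve(s2, kernel)[:T]
    return np.sqrt(np.sum((f1 - f2) ** 2))

def classify(output_spikes, targets, tau=5.0):
    """Pick the class whose target spike pattern is closest to the output.

    output_spikes: (T, n_out); targets: (n_classes, T, n_out) binary arrays.
    Distances are summed across the output neurons of each target pattern.
    """
    dists = [sum(van_rossum_distance(output_spikes[:, j], tgt[:, j], tau)
                 for j in range(output_spikes.shape[1]))
             for tgt in targets]
    return int(np.argmin(dists))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, n_in, n_out, n_classes = 20, 30, 10, 3
    x = (rng.random((T, n_in)) < 0.2).astype(float)    # input spike pattern
    w = rng.normal(0.0, 0.5, size=(n_in, n_out))
    targets = (rng.random((n_classes, T, n_out)) < 0.1).astype(float)
    print("predicted class:", classify(lif_layer(x, w), targets))

Note that in this sketch the trained weights and the designed target patterns are random stand-ins; in the paper the weights are learned by backpropagation through time so that each class's output spike train approaches its designated target pattern.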


Data availability

The datasets used in the current study are available in public repositories.


Acknowledgements

We thank Dr. Angeliki Pantazi for her insightful comments and suggestions.

Author information

Corresponding author

Correspondence to Ana Stanojevic.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Stanojevic, A., Cherubini, G., Woźniak, S. et al. Time-encoded multiplication-free spiking neural networks: application to data classification tasks. Neural Comput & Applic 35, 7017–7033 (2023). https://doi.org/10.1007/s00521-022-07910-1

