
A Novel Unsupervised Spatial–Temporal Learning Mechanism in a Bio-inspired Spiking Neural Network

Published in: Cognitive Computation

Abstract

Bio-inspired computing is a powerful platform for developing intelligent machines based on the behavioral and functional mechanisms of the human nervous system. Such machines can be critical tools in expert systems, speech recognition, pattern recognition, and machine vision. In this study, a retinal model is used as the input layer of a spiking network to convert image pixels into spike trains. The produced spikes are injected into a spiking neural network (SNN) as a second layer, whose structure and function are inspired by real neuronal networks (i.e., spiking neurons with excitatory and inhibitory neurotransmission modeled as AMPA and GABA currents). Similarly, an unsupervised, spatial–temporal, sparse spike-based learning mechanism modeled on learning processes in the brain was developed to train the spiking neurons of the output layer, which recognize patterns from the MNIST and EMNIST datasets with very high accuracy (above 97%) and from CIFAR-10 with 92.9% accuracy. The proposed spiking pattern recognition network achieves higher classification accuracy than previous deep spiking networks and offers further advantages: faster convergence, unsupervised learning, fewer hyper-parameters and network layers, and the ability to learn from a limited amount of training data. Finally, by changing the size and stride of the averaging windows in the visual pathway, the network can be trained with only 10% of the training data while achieving accuracy similar to or higher than state-of-the-art deep learning approaches. Achieving high accuracy despite a limited amount of training data is one of the most important challenges for neural networks in artificial intelligence.
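The abstract's first stage converts image pixels into spike trains. The paper's retinal model is not reproduced in this preview; as a rough stand-in, the sketch below uses simple Poisson rate coding, where each pixel's intensity sets its firing probability per time step (the function name `poisson_encode` and its parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def poisson_encode(image, duration=100, max_rate=0.5, rng=None):
    """Encode normalized pixel intensities as Poisson spike trains.

    image: 2-D array with values in [0, 1]; brighter pixels fire with
    higher probability per time step (a generic stand-in for the
    paper's retinal model, which is more elaborate).
    Returns a (duration, H, W) boolean array of spikes.
    """
    rng = np.random.default_rng() if rng is None else rng
    rates = np.clip(image, 0.0, 1.0) * max_rate  # spike prob. per step
    return rng.random((duration,) + image.shape) < rates

# Example: a bright pixel emits many more spikes than a dark one.
img = np.array([[0.0, 1.0]])
spikes = poisson_encode(img, duration=1000, rng=np.random.default_rng(0))
```

Any downstream SNN layer then sees only these binary spike events per time step, never the raw pixel values.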
In summary, the proposed bio-inspired neuronal network operates on spike trains, learns without supervision, recognizes complex patterns with performance comparable to advanced deep learning networks, and can potentially be implemented in neuromorphic hardware.
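The unsupervised spike-based learning the abstract refers to belongs to the STDP family of rules, where the relative timing of pre- and postsynaptic spikes drives weight change. The paper's spatial–temporal rule is not given in this preview; the sketch below shows a generic pair-based STDP step using exponentially decaying spike traces (all names and constants here are illustrative assumptions):

```python
import numpy as np

def stdp_step(w, pre_spikes, post_spikes, pre_trace, post_trace,
              a_plus=0.01, a_minus=0.012, tau=20.0, w_max=1.0):
    """One discrete-time step of pair-based STDP (illustrative only;
    the paper's learning mechanism is more elaborate).

    w:                  (n_post, n_pre) weight matrix, modified in place
    pre/post_trace:     decaying memories of recent pre/post spikes
    """
    decay = np.exp(-1.0 / tau)
    pre_trace = pre_trace * decay + pre_spikes
    post_trace = post_trace * decay + post_spikes
    # Potentiate synapses whose presynaptic neuron fired shortly before
    # a postsynaptic spike; depress the reverse ordering.
    w += a_plus * np.outer(post_spikes, pre_trace)
    w -= a_minus * np.outer(post_trace, pre_spikes)
    np.clip(w, 0.0, w_max, out=w)
    return w, pre_trace, post_trace

# Example: presynaptic neuron 0 fires, then the postsynaptic neuron
# fires one step later, so only synapse 0 is strengthened.
w = np.zeros((1, 2))
pre_trace, post_trace = np.zeros(2), np.zeros(1)
w, pre_trace, post_trace = stdp_step(w, np.array([1.0, 0.0]),
                                     np.array([0.0]), pre_trace, post_trace)
w, pre_trace, post_trace = stdp_step(w, np.array([0.0, 0.0]),
                                     np.array([1.0]), pre_trace, post_trace)
```

Because the update depends only on locally available spike times and traces, no labels or global error signal are required, which is what makes such rules unsupervised.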


Data Availability

The data are available from the corresponding author upon reasonable request.


Author information


Corresponding authors

Correspondence to Amir Homayoun Jafari, Bahador Makkiabadi, or Soheila Nazari.

Ethics declarations

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Amiri, M., Jafari, A.H., Makkiabadi, B. et al. A Novel Unsupervised Spatial–Temporal Learning Mechanism in a Bio-inspired Spiking Neural Network. Cogn Comput 15, 694–709 (2023). https://doi.org/10.1007/s12559-022-10097-1

