Spiking Neural Computing in Memristive Neuromorphic Platforms

  • Mahyar Shahsavari
  • Philippe Devienne
  • Pierre Boulet


Neuromorphic computing with spiking neural networks (SNNs) is proposed as an alternative for the future of computation, aimed at overcoming the memory bottleneck of current computer architectures. Different spike codings are discussed to improve data transfer and data processing in neuro-inspired computing paradigms. Choosing an appropriate network topology can yield better performance in computation, recognition, and classification. The neuron model is another important factor in designing and implementing SNN systems; the speed of simulation and implementation, the ability to integrate with the other elements of the network, and suitability for scalable networks guide the choice of model. Learning algorithms, which train the network by modifying synaptic weights, are a further significant consideration. Learning in neuromorphic architectures can be improved both by improving the quality of the artificial synapse and by refining learning rules such as spike-timing-dependent plasticity (STDP). In this chapter we propose a new synapse box that can both remember and forget. Furthermore, since STDP is the most frequently used unsupervised training method for SNNs, we analyze and review its variants: the sequential order of pre- and postsynaptic spikes occurring across a synapse within an interval of time leads to different STDP rules, and the choice among them for weight modification depends on stability requirements as well as on Hebbian or anti-Hebbian competition. Finally, we survey the most significant projects that have produced neuromorphic platforms and discuss the advantages and disadvantages of each.
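As an illustration of the neuron-model trade-off mentioned above, the leaky integrate-and-fire (LIF) model is one of the simplest choices that remains fast to simulate and easy to scale. The following is a minimal sketch with illustrative parameter values (threshold, membrane time constant, and input current are assumptions, not values from this chapter):

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# All parameter values here are illustrative assumptions.
def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0,
                 tau_m=10.0, dt=1.0):
    """Return spike times for a step-sampled input current trace."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input.
        v += dt * (-(v - v_rest) + i_in) / tau_m
        if v >= v_thresh:            # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_rest               # reset after spiking
    return spikes

# Constant suprathreshold input produces regular spiking;
# subthreshold input produces none.
print(simulate_lif([1.5] * 100))
print(simulate_lif([0.5] * 100))   # []
```

The simplicity of this update rule (one state variable per neuron) is what makes LIF-style models attractive for large-scale and hardware implementations, at the cost of biological detail compared with, e.g., Hodgkin-Huxley-type models.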

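The pair-based STDP rule discussed in the abstract can be sketched as follows. The weight change depends only on the time difference between a presynaptic and a postsynaptic spike; the amplitudes and time constant below (a_plus, a_minus, tau) are illustrative assumptions, not the chapter's parameters:

```python
import math

# Pair-based STDP sketch: causal pre-before-post pairings potentiate
# the synapse (Hebbian), anti-causal pairings depress it.
# Parameter values are illustrative assumptions.
def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Return the weight change for one pre/post spike pair."""
    dt = t_post - t_pre
    if dt > 0:       # pre before post -> potentiation
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:     # post before pre -> depression
        return -a_minus * math.exp(dt / tau)
    return 0.0       # simultaneous spikes: no change in this sketch

print(stdp_dw(10.0, 15.0))   # positive: causal pairing
print(stdp_dw(15.0, 10.0))   # negative: anti-causal pairing
```

The exponential windows mean that pairings far apart in time have little effect; variants of STDP differ precisely in how such pairs (and triplets or higher-order interactions) are accumulated, which affects the stability and competition properties surveyed in the chapter.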


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Mahyar Shahsavari (1)
  • Philippe Devienne (1)
  • Pierre Boulet (1)

  1. CRIStAL, Centre de Recherche en Informatique, Signal et Automatique de Lille, Université Lille, CNRS, Centrale Lille, UMR 9189, Lille, France