
Spiking Neural Computing in Memristive Neuromorphic Platforms

  • Mahyar Shahsavari
  • Philippe Devienne
  • Pierre Boulet

Abstract

Neuromorphic computation using spiking neural networks (SNNs) has been proposed as an alternative for future computing systems, one that can overcome the memory bottleneck of current computer architectures. Different spike codings are discussed as ways to improve data transfer and data processing in neuro-inspired computation paradigms. Choosing an appropriate network topology can yield better performance in computation, recognition, and classification. The neuron model is another important factor in the design and implementation of SNN systems; simulation and implementation speed, ease of integration with the other elements of the network, and suitability for scalable networks are the criteria for selecting one. Learning algorithms are a further significant consideration, since they train the network through weight modification. Learning in neuromorphic architectures can be improved both by improving the quality of the artificial synapse and by refining learning rules such as spike-timing-dependent plasticity (STDP). In this chapter we propose a new synapse box that can both remember and forget. Furthermore, because STDP is the most frequently used unsupervised training method in SNNs, we analyze and review its various forms: the temporal order of pre- and postsynaptic spikes across a synapse within a given time window defines the different STDP rules, and the choice among them depends on the required stability and on whether Hebbian or anti-Hebbian competition should govern weight modification. Finally, we survey the most significant projects that have produced neuromorphic platforms and discuss the advantages and disadvantages of each.
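
To make the spike-timing dependence mentioned above concrete, the short Python sketch below evaluates a generic pair-based STDP window: a presynaptic spike that precedes a postsynaptic spike potentiates the synapse, while the reverse order depresses it. The amplitudes and time constants used here are illustrative assumptions, not parameters taken from this chapter.

import math

# Minimal pair-based STDP sketch (illustrative parameters, not the chapter's exact model).
A_PLUS = 0.01     # potentiation amplitude
A_MINUS = 0.012   # depression amplitude, slightly larger to favour stability
TAU_PLUS = 20.0   # potentiation time constant (ms)
TAU_MINUS = 20.0  # depression time constant (ms)

def stdp_dw(delta_t_ms):
    """Weight change for one pre/post spike pair, delta_t = t_post - t_pre (ms)."""
    if delta_t_ms >= 0:
        # Pre before post: long-term potentiation, decaying with the delay.
        return A_PLUS * math.exp(-delta_t_ms / TAU_PLUS)
    # Post before pre: long-term depression.
    return -A_MINUS * math.exp(delta_t_ms / TAU_MINUS)

print(stdp_dw(+5.0))  # positive: the synapse is strengthened
print(stdp_dw(-5.0))  # negative: the synapse is weakened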

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Mahyar Shahsavari (1)
  • Philippe Devienne (1)
  • Pierre Boulet (1)

  1. CRIStAL, Centre de Recherche en Informatique, Signal et Automatique de Lille, Université Lille, CNRS, Centrale Lille, UMR 9189, Lille, France
