
Neuromorphic Systems: Devices, Architecture, and Algorithms


Abstract

Applying the structure and operating principles of the human brain opens up great opportunities for creating artificial systems based on silicon technology. The energy efficiency and performance of a biosimilar architecture can be significantly higher than those of the traditional von Neumann architecture. This paper presents an overview of the most promising artificial neural network (ANN) and spiking neural network (SNN) architectures for biosimilar systems, called neuromorphic systems. Devices for such systems, primarily memristors and ferroelectric transistors, are considered for use as artificial synapses, since their properties determine which neuromorphic architectures can be built; methods and rules for training these structures to operate correctly while mimicking biological learning mechanisms, such as long-term synaptic plasticity, are also reviewed. Problems hindering the implementation of biosimilar systems are discussed, together with examples of architectures that have been implemented in practice.
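
As an illustration of the kind of learning rule mentioned above, the sketch below shows pair-based spike-timing-dependent plasticity (STDP), one common model of long-term synaptic plasticity, applied to a single leaky integrate-and-fire neuron driven by random input spikes. It is a minimal example written for this overview, not code from the paper; the neuron model, the trace-based STDP formulation, and all parameter values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Simulation settings (illustrative assumptions)
dt = 1e-3           # time step, s
T = 1.0             # simulated time, s
n_inputs = 100      # number of presynaptic inputs
rate = 20.0         # Poisson input rate, Hz

# Leaky integrate-and-fire neuron parameters (illustrative assumptions)
tau_m, v_rest, v_thresh, v_reset = 20e-3, 0.0, 1.0, 0.0

# Pair-based STDP parameters (illustrative assumptions)
tau_plus, tau_minus = 20e-3, 20e-3
a_plus, a_minus = 0.01, 0.012          # slight depression bias for stability
w = rng.uniform(0.1, 0.5, n_inputs)    # synaptic weights

v = v_rest
pre_trace = np.zeros(n_inputs)         # exponentially filtered presynaptic spike history
post_trace = 0.0                       # same trace for the postsynaptic neuron

for _ in range(int(T / dt)):
    pre_spikes = rng.random(n_inputs) < rate * dt      # Poisson presynaptic spikes

    # Leaky integration of the weighted input; spike and reset at threshold
    v += dt / tau_m * (v_rest - v) + float(w @ pre_spikes)
    post_spike = v >= v_thresh
    if post_spike:
        v = v_reset

    # Exponentially decaying spike traces
    pre_trace += -dt / tau_plus * pre_trace + pre_spikes
    post_trace += -dt / tau_minus * post_trace + float(post_spike)

    # STDP: pre-before-post potentiates (LTP), post-before-pre depresses (LTD)
    if post_spike:
        w += a_plus * pre_trace                 # potentiate recently active inputs
    w -= a_minus * post_trace * pre_spikes      # depress inputs arriving after the spike
    np.clip(w, 0.0, 1.0, out=w)                 # keep weights bounded

print("mean synaptic weight after training:", w.mean())

In hardware terms, w plays the role of the analog conductance of a memristive or ferroelectric synapse, and the clipping step mirrors the bounded conductance range of a physical device.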

Funding

This study was carried out as part of a state assignment of the Russian Ministry of Education and Science to the Valiev Institute of Physics and Technology, Russian Academy of Sciences, under topic no. FFNN-2022-0019.

Author information

Corresponding authors

Correspondence to K. A. Fetisenkova or A. E. Rogozhin.

Ethics declarations

The authors declare that they have no conflicts of interest.

About this article

Cite this article

Fetisenkova, K.A., Rogozhin, A.E. Neuromorphic Systems: Devices, Architecture, and Algorithms. Russ Microelectron 52, 393–410 (2023). https://doi.org/10.1134/S1063739723700555
