Abstract
Neuromorphic computing systems are faster and more energy-efficient than von Neumann computing architectures because of their ability to emulate biological systems. However, hardware implementations of neural networks require large amounts of area and memory and face limitations on scalability. With their highly parallel computing capabilities and high integration densities, memristors offer an energy-efficient solution to this problem, and their similarities with biological synapses have earned them immense attention as building blocks of neuromorphic systems. A memristor is a two-terminal nonvolatile memory device that operates like a variable resistor and can be used to form large crossbar arrays capable of performing data-intensive, in-memory computations. This makes memristors a promising candidate for brain-inspired computing. Many emerging nonvolatile memory (eNVM) technologies have been developed over the years, including phase-change random access memory (PCRAM), magnetic random access memory (MRAM), and resistive random access memory (ReRAM). ReRAM, also referred to as a memristor, is a filament-type device that changes its resistance according to the applied voltage and has been researched extensively by many companies and research groups. The goal of this chapter is to present the merits and characteristics of ReRAM devices. The chapter first provides a brief overview of the working mechanism of a memristor and discusses how ReRAMs make use of this switching process. It then examines the construction of ReRAM crossbars and their read and write operations in more detail. Furthermore, recent advancements of ReRAM in the area of neuromorphic computing are covered, including developments in multilayer perceptrons (MLPs), spiking neural networks (SNNs), convolutional neural networks (CNNs), and recurrent neural networks (RNNs).
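The voltage-driven switching described above can be illustrated with a minimal sketch. The model below is hypothetical and highly idealized: the threshold voltages, resistance values, and abrupt binary switching are illustrative assumptions, not the behavior of any specific device.

```python
# Toy model of a bipolar ReRAM cell: a positive pulse at or above V_SET forms
# a conductive filament (low-resistance state, LRS); a sufficiently negative
# pulse ruptures it (high-resistance state, HRS). The state is nonvolatile:
# it persists until the next qualifying write pulse.
class ReRAMCell:
    V_SET, V_RESET = 1.2, -1.0   # hypothetical threshold voltages (volts)
    R_LRS, R_HRS = 1e3, 1e6      # hypothetical resistances (ohms)

    def __init__(self):
        self.resistance = self.R_HRS  # fresh device starts in HRS

    def write(self, pulse_voltage):
        if pulse_voltage >= self.V_SET:
            self.resistance = self.R_LRS   # SET: filament forms
        elif pulse_voltage <= self.V_RESET:
            self.resistance = self.R_HRS   # RESET: filament ruptures
        # pulses between the thresholds leave the stored state unchanged

    def read(self, v_read=0.2):
        # A small read voltage avoids disturbing the state; Ohm's law gives
        # the sensed current, which distinguishes LRS from HRS.
        return v_read / self.resistance

cell = ReRAMCell()
cell.write(1.5)    # SET pulse -> LRS
i_lrs = cell.read()
cell.write(-1.5)   # RESET pulse -> HRS
i_hrs = cell.read()
print(i_lrs, i_hrs)
```

Real devices exhibit gradual (analog) conductance modulation, variability, and retention effects; the binary model only captures the essential SET/RESET asymmetry of bipolar switching.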
This topic will be useful in providing information on the future prospects of using ReRAM devices in the field of neuromorphic computing. Through this chapter, readers will gain a deeper understanding of the future of ReRAM and how it paves the way for deep learning accelerators and edge devices. With ReRAM-based architectures, the issues of latency, power consumption, and the memory bottleneck can be mitigated, facilitating brain-inspired computing.
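As a rough illustration of the in-memory computation the abstract describes, the sketch below models an idealized crossbar performing an analog matrix-vector product: each cell's programmed conductance acts as a weight, Ohm's law multiplies it by the applied row voltage, and Kirchhoff's current law sums the products along each column. The conductance range and number of levels are illustrative assumptions; sneak paths, wire resistance, and device noise are ignored.

```python
import numpy as np

# Idealized ReRAM crossbar: each cell stores a conductance G[i, j] (siemens).
# Driving the rows with voltages V, the current collected on column j is
# sum_i G[i, j] * V[i], so the column currents realize I = G^T @ V in one
# parallel analog step.
rng = np.random.default_rng(0)
rows, cols = 4, 3

# Hypothetical device parameters: conductances quantized to a few levels
# between a high-resistance and a low-resistance state.
g_min, g_max, levels = 1e-6, 1e-4, 16
G = rng.uniform(g_min, g_max, size=(rows, cols))
step = (g_max - g_min) / (levels - 1)
G_quant = g_min + np.round((G - g_min) / step) * step  # programmed weights

V = rng.uniform(0.0, 0.2, size=rows)  # read voltages on the word lines
I = G_quant.T @ V                      # bit-line currents (amperes)

print(I)
```

In a neural-network accelerator, the quantized conductances play the role of synaptic weights, and the column currents are the weighted sums fed to the neurons, which is why crossbars map so naturally onto the MLP, SNN, CNN, and RNN workloads surveyed in this chapter.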
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Nowshin, F., Yi, Y. (2023). ReRAM-Based Neuromorphic Computing. In: Iranmanesh, A. (eds) Frontiers of Quality Electronic Design (QED). Springer, Cham. https://doi.org/10.1007/978-3-031-16344-9_2
Print ISBN: 978-3-031-16343-2
Online ISBN: 978-3-031-16344-9