Neuromorphic Hardware Using Simplified Elements and Thin-Film Semiconductor Devices as Synapse Elements - Simulation of Hopfield and Cellular Neural Network -

  • Tomoya Kameda
  • Mutsumi Kimura
  • Yasuhiko Nakashima
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10639)

Abstract

Neuromorphic hardware using simplified elements and thin-film semiconductor devices as synapse elements is proposed. Amorphous metal-oxide semiconductor devices are assumed as the synapse elements, and their characteristic degradation is exploited as the learning rule, which we call modified Hebbian learning. First, we explain the architecture and operation of a Hopfield neural network. Next, we model the electrical characteristics of the thin-film semiconductor devices and simulate letter recognition with this network; in particular, we present a degradation map. We then explain the architecture and operation of a cellular neural network, model the thin-film semiconductor devices in the same way, and simulate letter recognition; here we evaluate connection schemes. The cellular neural network performs better when it includes diagonal connections. Finally, we compare the two networks and find that the Hopfield neural network achieves higher recognition performance, whereas the cellular neural network offers a simpler structure.
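The degradation-based learning and Hopfield-style recall summarized above can be illustrated with a small simulation. The sketch below is not the authors' device model: the electrical characteristics of the amorphous metal-oxide devices are replaced by a toy rule in which synapse conductances only decrease, and all names and parameters (G_INIT, G_MIN, DELTA, the 5x5 letter size) are hypothetical stand-ins chosen for illustration.

```python
# Minimal sketch (assumptions, not the paper's implementation):
# - Synapse weights are conductances that can only decrease, mimicking the
#   characteristic degradation of thin-film semiconductor devices.
# - "Modified Hebbian learning" is approximated by degrading a synapse
#   whenever the two neurons it connects take opposite states.
import numpy as np

rng = np.random.default_rng(0)

N = 25                                   # 5x5 pixel letters, one neuron per pixel
G_INIT, G_MIN, DELTA = 1.0, 0.1, 0.02    # hypothetical conductance parameters

def train(patterns, epochs=50):
    """Degradation-based learning: conductances start high and only decay."""
    G = np.full((N, N), G_INIT)
    np.fill_diagonal(G, 0.0)
    for _ in range(epochs):
        for p in patterns:                        # p is a vector of +1/-1 pixels
            disagree = np.outer(p, p) < 0         # neuron pairs with opposite states
            G[disagree] = np.maximum(G_MIN, G[disagree] - DELTA)
    # Map conductances to bipolar weights: intact synapse -> excitatory,
    # degraded synapse -> inhibitory (a simplification for this sketch).
    W = np.where(G > (G_INIT + G_MIN) / 2, 1.0, -1.0)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Asynchronous Hopfield update until the state settles."""
    x = x.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    return x

# Toy usage: store a letter "T" and recover it from a noisy copy.
T = np.where(np.array([[1, 1, 1, 1, 1],
                       [0, 0, 1, 0, 0],
                       [0, 0, 1, 0, 0],
                       [0, 0, 1, 0, 0],
                       [0, 0, 1, 0, 0]]).ravel() > 0, 1.0, -1.0)
W = train([T])
noisy = T * np.where(rng.random(N) < 0.2, -1.0, 1.0)   # flip roughly 20% of pixels
print("recovered:", np.array_equal(recall(W, noisy), T))
```

The cellular-network comparison in the abstract could be reproduced in the same framework by zeroing every weight outside a cell's neighborhood and toggling whether the diagonal neighbors are included.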

Keywords

Neuromorphic hardware · Thin-film semiconductor device · Synapse element · Hopfield neural network · Cellular neural network

Notes

Acknowledgement

We would like to thank Prof. Mamoru Furuta of Kochi University of Technology and Prof. Toshio Kamiya of Tokyo Institute of Technology, and acknowledge support from KAKENHI 16K06733, the Laboratory for Materials and Structures of Tokyo Institute of Technology, ROHM Semiconductor, the Yazaki Memorial Foundation for Science and Technology, the Support Center for Advanced Telecommunications Technology Research Foundation, and KOA Corporation.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Tomoya Kameda (1)
  • Mutsumi Kimura (1, 2)
  • Yasuhiko Nakashima (1, 2)
  1. Nara Institute of Science and Technology, Ikoma, Japan
  2. Ryukoku University, Otsu, Japan