
Hopfield Associative Memory with Quantized Weights

  • Conference paper
  • First Online:
Advances in Neural Computation, Machine Learning, and Cognitive Research II (NEUROINFORMATICS 2018)

Part of the book series: Studies in Computational Intelligence (SCI, volume 799)

Abstract

The use of binary and multilevel memristors in hardware implementations of neural networks necessitates quantization of the weight coefficients. In this paper we investigate how weight quantization affects the information capacity of a Hopfield network and its resistance to input data distortions. It is shown that, with a number of weight levels on the order of tens, the capacity of a Hopfield-Hebb network with quantized weights approaches the capacity of its continuous-weight version. For a Hopfield projection network, a similar result is achieved only with a number of weight levels on the order of hundreds. Experiments show that: (1) binary memristors should be used in Hopfield-Hebb networks reduced by zeroing, in each row, all weights whose moduli are strictly less than the maximum weight modulus in that row; (2) in Hopfield projection networks with quantized weights, multilevel memristors with significantly more than two weight levels should be used, the specific number of levels depending on the dimension of the stored reference vectors, on their particular set, and on the permissible level of input data noise.
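
As a rough illustration of the constructions the abstract refers to, the following sketch (not taken from the paper; the uniform quantizer, the normalizations, and all function names are assumptions) builds Hebb-rule and projection-rule weight matrices, quantizes them to a given number of levels, and applies the row-wise reduction to binary weights described in point (1):

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb-rule weight matrix for bipolar (+/-1) patterns (rows), zero diagonal."""
    X = np.asarray(patterns, dtype=float)        # shape (p, n): p patterns of dimension n
    W = X.T @ X / X.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def projection_weights(patterns):
    """Projection (pseudo-inverse) rule: orthogonal projector onto the pattern span."""
    X = np.asarray(patterns, dtype=float).T      # shape (n, p): patterns as columns
    return X @ np.linalg.pinv(X)

def quantize(W, levels):
    """Uniform symmetric quantization to `levels` levels (odd `levels` keeps a zero
    level). The paper's exact quantizer is not given here; this is one plausible choice."""
    w_max = np.abs(W).max()
    if w_max == 0.0:
        return W.copy()
    step = 2.0 * w_max / (levels - 1)
    return np.round(W / step) * step

def reduce_to_binary(W):
    """Row-wise reduction: zero every weight whose modulus is strictly below the row
    maximum. Survivors equal +/- (row max), i.e. each fits a binary memristor."""
    out = np.zeros_like(W)
    for i, row in enumerate(W):
        m = np.abs(row).max()
        if m > 0.0:
            keep = np.isclose(np.abs(row), m)
            out[i, keep] = np.sign(row[keep]) * m
    return out

def recall(W, x, max_iters=100):
    """Synchronous Hopfield recall with sign activation (ties broken toward +1)."""
    s = np.sign(np.asarray(x, dtype=float))
    for _ in range(max_iters):
        s_new = np.sign(W @ s)
        s_new[s_new == 0.0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

With these helpers one could, for example, store a few random bipolar patterns, flip a fraction of the components of a stored pattern, and compare the recall success of `quantize(W, levels)` for growing `levels` against the continuous-weight baseline, which is the kind of experiment the abstract summarizes.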

Author information

Corresponding author

Correspondence to Mikhail S. Tarkov.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Tarkov, M.S. (2019). Hopfield Associative Memory with Quantized Weights. In: Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V., Tiumentsev, Y. (eds) Advances in Neural Computation, Machine Learning, and Cognitive Research II. NEUROINFORMATICS 2018. Studies in Computational Intelligence, vol 799. Springer, Cham. https://doi.org/10.1007/978-3-030-01328-8_8
