Optimization of Hopfield Neural Network (HNN) using multiconnection and Lyapunov Energy Function (LEF) for storage and recall of handwritten images

Abstract

The Hopfield neural network (HNN) is often used as a human-like associative memory (AM). However, HNNs suffer from limited noise tolerance and limited storage capacity. This study proposes a novel approach for overcoming these limitations using a multiple-connection architecture in a Hopfield neural network together with an energy function and the Hamming distance (HD). In this architecture, a single set of connections among the neurons of the network is used to store one pattern, producing an etalon array, and the collection of etalon arrays corresponding to each set of connections forms the weight matrix (also called the connection matrix). Storing a single pattern on a single set of connections reduces the chance of becoming trapped in local minima and increases storage capacity, since more patterns can be stored as the number of connection sets grows. The Lyapunov energy function and the Hamming distance are used to determine the test pattern's proximity to one of the stored patterns for perfect recall. Several experiments compare the recall success rate and storage capacity on handwritten character images with those of existing methods. Experiments with the multiple-connection Hopfield neural network using the Hamming distance and the Lyapunov energy function show promising results compared with existing methods.
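
The following is a minimal Python sketch of the kind of recall scheme the abstract describes, assuming bipolar (±1) patterns and ordinary outer-product (Hebbian) storage; all function and variable names are illustrative, and the selection rule combining the Lyapunov energy with the Hamming distance is a simplified stand-in for the authors' exact procedure, not a reproduction of it.

```python
import numpy as np


def store_patterns(patterns):
    """Build one outer-product ("etalon") weight matrix per stored pattern,
    mirroring the multi-connection idea described above (sketch only)."""
    weights = []
    for p in patterns:
        w = np.outer(p, p).astype(float)
        np.fill_diagonal(w, 0.0)          # no self-connections
        weights.append(w)
    return weights


def lyapunov_energy(w, s):
    """Standard Hopfield (Lyapunov) energy E = -1/2 * s^T W s."""
    return -0.5 * s @ w @ s


def hamming_distance(a, b):
    """Number of positions at which two bipolar (+1/-1) vectors differ."""
    return int(np.sum(a != b))


def recall(weights, patterns, probe, sweeps=10):
    """Select the connection matrix whose stored pattern is closest to the
    probe (lowest energy, then smallest Hamming distance), then relax the
    probe on that matrix with asynchronous updates."""
    best = min(range(len(patterns)),
               key=lambda i: (lyapunov_energy(weights[i], probe),
                              hamming_distance(probe, patterns[i])))
    w, s = weights[best], probe.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s


# Toy usage: store two 16-neuron bipolar patterns, corrupt one, recall it.
rng = np.random.default_rng(0)
stored = rng.choice([-1, 1], size=(2, 16))
probe = stored[0].copy()
probe[:3] *= -1                           # flip three bits as "noise"
print(np.array_equal(recall(store_patterns(stored), stored, probe), stored[0]))
```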

Author information

Corresponding author: Laxman Singh.

About this article

Cite this article

YADAV, J.K.P.S., SINGH, L. & JAFFERY, Z.A. Optimization of Hopfield Neural Network (HNN) using multiconnection and Lyapunov Energy Function (LEF) for storage and recall of handwritten images. Sādhanā 48, 26 (2023). https://doi.org/10.1007/s12046-023-02083-6
