Hopfield Recurrent Neural Networks
In 1982, Hopfield proposed a model of neural networks that used two-state threshold "neurons" following a stochastic update rule. This model explored the ability of a network of highly interconnected "neurons" to exhibit useful collective computational properties, such as content-addressable memory. However, the model is based on McCulloch-Pitts neurons, which differ both from real biological neurons and from the realistic behaviour of simple electric circuits: real neurons have continuous input-output relations and integrative time delays due to capacitance. To overcome these limitations, in 1984 Hopfield proposed a continuous-time recurrent neural network model with a graded response, described by a set of differential equations. This deterministic system has collective properties very close to those of the earlier stochastic model. Today, this model is well known as the Hopfield model of RNNs, and it has found wide application in optimisation problems [65,22,107,182], associative memories, engineering problems, satellite broadcast scheduling [64,4], graph partitioning, stereo vision, multiuser detection, fault detection and isolation, affine invariant matching, pattern sequence recognition, classification, etc. The contribution of the Hopfield RNN model to the field of neural networks cannot be over-emphasised. Indeed, it is Hopfield's outstanding work that rekindled research interest in neural networks among both scientists and engineers.
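The content-addressable memory mentioned above can be illustrated with a minimal sketch of the 1982 two-state model: patterns are stored with a Hebbian outer-product rule, and a corrupted input is driven back to the nearest stored pattern by threshold updates. The function names and parameters here are illustrative, not from the original chapter.

```python
import numpy as np

def train_hebbian(patterns):
    """Store bipolar (+1/-1) patterns via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W / n

def recall(W, state, steps=20):
    """Apply synchronous two-state threshold updates until a fixed point."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store one pattern, corrupt one bit, and recover it.
p = np.array([1, -1, 1, -1, 1, 1, -1, -1])
W = train_hebbian(p[None, :])
noisy = p.copy()
noisy[0] = -noisy[0]
print(recall(W, noisy))  # recovers the stored pattern p
```

For a single stored pattern, any probe within Hamming distance below n/2 converges in one synchronous step, since the field at each neuron is dominated by the stored pattern's sign.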
Keywords: Convergence Analysis, Recurrent Neural Network, Global Asymptotic Stability, Exponential Convergence, Multiuser Detector