An Enhanced Training Algorithm for Multilayer Neural Networks Based on Reference Output of Hidden Layer
In this paper, the authors propose a new training algorithm that relies not only on the training samples but also on the output of the hidden layer. Both the connecting weights and the outputs of the hidden layer are adjusted based on the Least Square Backpropagation (LSB) algorithm. A set of 'required' hidden-layer outputs is added to the input sets through a feedback path to accelerate convergence. Numerical simulation results demonstrate that the algorithm outperforms the conventional BP, quasi-Newton BFGS (an alternative to conjugate-gradient methods for fast optimisation), and LSB algorithms in terms of both convergence speed and training error. The proposed method also avoids the drawback of the LSB algorithm, whose training error cannot be reduced further after three iterations.
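The idea of steering the hidden layer toward a "required" reference output can be illustrated with a minimal sketch. This is an illustrative reconstruction under stated assumptions, not the authors' exact algorithm: it solves the output-layer weights in closed form by least squares (in the spirit of LSB), derives a reference hidden output as the least-squares pre-image of the targets through the current output weights, and backpropagates the hidden layer toward that reference. All names (`H_ref`, the toy XOR data, learning rate) are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 4
W1 = rng.normal(scale=0.5, size=(2, n_hidden))  # input -> hidden weights
b1 = np.zeros(n_hidden)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    H = sigmoid(X @ W1 + b1)                       # actual hidden outputs
    Ha = np.hstack([H, np.ones((len(X), 1))])      # append bias column

    # LSB-style step: solve the output layer exactly by least squares.
    W2 = np.linalg.lstsq(Ha, Y, rcond=None)[0]     # shape (n_hidden+1, 1)

    # "Required" hidden output: least-squares pre-image of the targets
    # through the current output weights, clipped into sigmoid's range.
    H_ref = np.linalg.lstsq(W2[:-1].T, (Y - W2[-1]).T, rcond=None)[0].T
    H_ref = np.clip(H_ref, 0.05, 0.95)

    # Feedback step: backpropagate the hidden layer toward the reference.
    dH = (H - H_ref) * H * (1 - H)                 # sigmoid derivative
    W1 -= 0.5 * X.T @ dH
    b1 -= 0.5 * dH.sum(axis=0)

# Final predictions using the last least-squares output layer.
pred = np.hstack([sigmoid(X @ W1 + b1), np.ones((len(X), 1))]) @ W2
```

The design choice worth noting is that the hidden layer is given its own explicit target (`H_ref`) rather than receiving only a gradient propagated from the output error, which is what the abstract credits for the faster convergence.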