Abstract
Weight initialization is an important factor in determining the training speed of feedforward neural networks. In this paper, a variable parameter \(\alpha \) is identified through statistical analysis of the Nguyen–Widrow weight initialization method. The value of \(\alpha \) is varied from 1 to 10 and tested on nine function approximation tasks. The results for each value of \(\alpha \) are compared using a single-tail t-test. An optimal value of \(\alpha \) is derived, and a new weight initialization technique is thereby proposed.
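The abstract does not specify exactly where \(\alpha \) enters the Nguyen–Widrow scheme. The sketch below shows the classic Nguyen–Widrow initialization for an input-to-hidden weight matrix, with a hypothetical `alpha` multiplier on the standard 0.7 scale factor as an assumed placement of the paper's parameter; it is an illustration, not the authors' exact modification.

```python
import numpy as np

def nguyen_widrow_init(n_inputs, n_hidden, alpha=1.0, rng=None):
    """Nguyen-Widrow initialization of input-to-hidden weights.

    `alpha` is a hypothetical tuning parameter (the paper varies it
    from 1 to 10); here it is assumed to scale the classic 0.7 factor.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Classic Nguyen-Widrow scale factor: beta = 0.7 * H^(1/n)
    beta = alpha * 0.7 * n_hidden ** (1.0 / n_inputs)
    # Start from small uniform random weights in [-0.5, 0.5]
    w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))
    # Rescale each hidden unit's weight vector to length beta
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    w = beta * w / norms
    # Biases drawn uniformly from [-beta, beta]
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b
```

The rescaling spreads the hidden units' active (approximately linear) regions evenly over the input range, which is the original motivation of the Nguyen–Widrow method.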
Acknowledgements
This publication is an outcome of the R&D work undertaken in a project under the Visvesvaraya PhD Scheme of the Ministry of Electronics & Information Technology, Government of India, implemented by Digital India Corporation.
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Mittal, A., Singh, A.P., Chandra, P. (2020). A Modification to the Nguyen–Widrow Weight Initialization Method. In: Thampi, S., et al. Intelligent Systems, Technologies and Applications. Advances in Intelligent Systems and Computing, vol 910. Springer, Singapore. https://doi.org/10.1007/978-981-13-6095-4_11
DOI: https://doi.org/10.1007/978-981-13-6095-4_11
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-6094-7
Online ISBN: 978-981-13-6095-4
eBook Packages: Intelligent Technologies and Robotics (R0)