Abstract
Estimating the number of hidden nodes needed to fit the training data is difficult for the Extreme Learning Machine (ELM). In this paper, a novel pruning algorithm based on sensitivity analysis is proposed for ELM. A measure for estimating the necessary number of hidden-layer nodes is derived from the defined sensitivity. When this measure falls below a given threshold, the nodes with the smallest sensitivities are removed from the existing network all at once. Experimental results show that the proposed method produces more compact neural networks than several existing algorithms of the same kind.
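The abstract's idea can be sketched in code: train a standard ELM (random hidden layer, least-squares output weights), score each hidden node by a sensitivity measure, and drop the low-sensitivity nodes in one batch. This is a minimal illustration, not the paper's method: the sensitivity used here (output-weight magnitude scaled by the node's activation energy) and the relative threshold `keep_ratio_tol` are placeholder assumptions, since the paper's exact sensitivity definition is not given in this preview.

```python
import numpy as np

def elm_fit(X, y, L, rng):
    """Train a basic ELM: random hidden layer, least-squares output weights."""
    W = rng.standard_normal((X.shape[1], L))   # random input weights
    b = rng.standard_normal(L)                 # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))     # sigmoid hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y               # Moore-Penrose least-squares solution
    return W, b, beta

def prune_elm(X, y, L=50, keep_ratio_tol=1e-3, seed=0):
    """Sensitivity-based pruning sketch: remove hidden nodes whose
    contribution to the network output is negligible, all at once."""
    rng = np.random.default_rng(seed)
    W, b, beta = elm_fit(X, y, L, rng)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    # Illustrative sensitivity (an assumption, not the paper's definition):
    # |beta_j| scaled by the L2 norm of node j's activations.
    s = np.abs(beta).reshape(L, -1).sum(axis=1) * np.linalg.norm(H, axis=0)
    keep = s / s.max() > keep_ratio_tol        # threshold on relative sensitivity
    W, b = W[:, keep], b[keep]
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    beta = np.linalg.pinv(H) @ y               # retrain output weights once after pruning
    return W, b, beta
```

Pruning in a single batch (rather than one node at a time with retraining after each removal) is what distinguishes this style of algorithm from iterative schemes such as OP-ELM; only one final recomputation of the output weights is needed.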
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Ying, L., Fan-jun, L. (2013). A Pruning Algorithm for Extreme Learning Machine. In: Yin, H., et al. Intelligent Data Engineering and Automated Learning – IDEAL 2013. IDEAL 2013. Lecture Notes in Computer Science, vol 8206. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41278-3_1
DOI: https://doi.org/10.1007/978-3-642-41278-3_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-41277-6
Online ISBN: 978-3-642-41278-3