Abstract
This paper proposes a novel artificial neural network called the fast learning network (FLN). In the FLN, the input weights and hidden layer biases are randomly generated, while the weights connecting the input layer directly to the output layer and the weights connecting the hidden nodes to the output nodes are analytically determined by a least-squares method. To test the validity of the FLN, it is applied to nine regression applications. Experimental results show that, compared with the support vector machine, back-propagation, and extreme learning machine, the FLN achieves very good generalization performance and stability with much more compact networks, at a very fast training speed and with a quick response of the trained network to new observations. In addition, to further test its validity, the FLN is applied to model the thermal efficiency and NOx emissions of a 330 MW coal-fired boiler, where it achieves very good prediction precision and generalization ability at a high learning speed.
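The training scheme described in the abstract (random input weights and hidden biases, with the direct input-to-output weights and hidden-to-output weights solved jointly by least squares) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, the tanh activation, and the uniform initialization range are assumptions for the sketch.

```python
import numpy as np

def fln_train(X, Y, n_hidden=20, rng=None):
    """Sketch of FLN-style training.
    X: (n_samples, n_features) inputs; Y: (n_samples, n_outputs) targets."""
    rng = np.random.default_rng(rng)
    # Randomly generated input weights and hidden-layer biases (never trained)
    W_in = rng.uniform(-1.0, 1.0, (X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, n_hidden)
    H = np.tanh(X @ W_in + b)          # hidden-layer outputs
    # Stack the direct input->output path with the hidden->output path and
    # determine both sets of output weights analytically by least squares
    M = np.hstack([X, H])
    W_out, *_ = np.linalg.lstsq(M, Y, rcond=None)
    return W_in, b, W_out

def fln_predict(X, W_in, b, W_out):
    M = np.hstack([X, np.tanh(X @ W_in + b)])
    return M @ W_out
```

Because the only trained parameters come from a single linear least-squares solve, training reduces to one matrix factorization, which is what makes this family of networks fast to train.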
Acknowledgments
This project is supported by the National Natural Science Foundation of China (Grant No. 60774028) and the Natural Science Foundation of Hebei Province, China (Grant No. F2010001318).
Cite this article
Li, G., Niu, P., Duan, X. et al. Fast learning network: a novel artificial neural network with a fast learning speed. Neural Comput & Applic 24, 1683–1695 (2014). https://doi.org/10.1007/s00521-013-1398-7