Abstract
Weighted extreme learning machines (WELMs) aim to find a better tradeoff between empirical and structural risks, and thereby achieve good generalization performance, especially on imbalanced classification problems. Existing weighting strategies assign distribution-independent weight matrices to WELMs, i.e., the weights do not take the probabilistic information of the samples into account. As a result, WELM amplifies the effect of outliers to some extent. In this paper, a naive Bayesian based WELM (NBWELM) is proposed, in which the weight is determined by a flexible naive Bayesian (FNB) classifier. By computing the posterior probability of each sample, NBWELM can not only handle outliers effectively but also simultaneously consider two different kinds of weighting information, i.e., the training error in the weighted regularized ELM (WRELM) and the class distribution in Zong et al.'s WELM (ZWELM). Experimental results on 45 KEEL and UCI datasets show that the proposed NBWELM further improves the generalization capability of WELM and thus obtains higher classification accuracy than WRELM and ZWELM. Meanwhile, NBWELM does not remarkably increase the computational complexity of WELM, owing to the simplicity of FNB.
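The core idea of the abstract can be sketched in a few lines: estimate each training sample's posterior probability for its own class with a flexible (kernel-density) naive Bayes model, use those posteriors as the diagonal weight matrix, and solve the weighted regularized ELM system β = (I/C + HᵀWH)⁻¹HᵀWT. The sketch below is illustrative only, not the authors' implementation; the fixed Parzen bandwidth and the function names (`fnb_posteriors`, `train_nbwelm`) are assumptions made for this example.

```python
import numpy as np

def fnb_posteriors(X, y, bandwidth=0.5):
    """Flexible naive Bayes: per-class, per-feature Gaussian Parzen-window
    densities (naive independence). Returns each sample's posterior
    probability for its OWN class; outliers receive low posteriors."""
    classes = np.unique(y)
    n = X.shape[0]
    log_joint = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Xc = X[y == c]
        log_like = np.zeros(n)
        for f in range(X.shape[1]):
            diffs = (X[:, f][:, None] - Xc[:, f][None, :]) / bandwidth
            dens = np.exp(-0.5 * diffs**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))
            log_like += np.log(dens + 1e-12)
        log_joint[:, j] = np.log(len(Xc) / n) + log_like
    post = np.exp(log_joint - log_joint.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)
    return post[np.arange(n), np.searchsorted(classes, y)]

def train_nbwelm(X, y, n_hidden=50, C=1.0, seed=None):
    """Weighted regularized ELM with FNB-posterior sample weights (sketch)."""
    rng = np.random.default_rng(seed)
    W_in = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                   # random biases
    H = np.tanh(X @ W_in + b)                           # hidden-layer output
    T = np.where(y[:, None] == np.unique(y), 1.0, -1.0) # one-vs-all targets
    w = fnb_posteriors(X, y)                            # diagonal of W
    # beta = (I/C + H' W H)^{-1} H' W T, with W = diag(w)
    beta = np.linalg.solve(np.eye(n_hidden) / C + H.T @ (H * w[:, None]),
                           H.T @ (w[:, None] * T))
    return W_in, b, beta

def predict(model, X):
    W_in, b, beta = model
    return np.argmax(np.tanh(X @ W_in + b) @ beta, axis=1)
```

Because the weights are posteriors in [0, 1], a sample that lies far from its class's density (an outlier) contributes little to the solution, while class priors enter the posterior naturally, covering both the WRELM-style and ZWELM-style weighting information at once.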
References
An L, Bhanu B (2012) Image super-resolution by extreme learning machine. In: Proceedings of the IEEE International Conference on Image Processing, pp 2209–2212
Barnett V, Lewis T (1994) Outliers in statistical data. Wiley, Chichester
Choi K, Toh KA, Byun H (2012) Incremental face recognition for large-scale social network services. Pattern Recogn 5(8):2868–2883
Deng W, Zheng Q, Chen L (2009) Regularized extreme learning machine. In: Proceedings of the IEEE Symposium on Computational Intelligence and Data Mining, pp 389–395
Fu AM, Dong CR, Wang LS (2014) An experimental study on stability and generalization of extreme learning machines. Int J Mach Learn Cybern. doi:10.1007/s13042-014-0238-0
Hall M, Frank E, Holmes G, Pfahringer B, Reutemann P, Witten IH (2009) The WEKA data mining software: an update. ACM SIGKDD Explor Newsl 11(1):10–18
Hawkins DM (1980) Identification of outliers. Chapman and Hall, London
Huang GB, Chen L, Siew CK (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17(4):879–892
Huang GB, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern Part B 42(2):513–529
Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1):489–501
Huang GB, Wang DH, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2(2):107–122
Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30
John GH, Langley P (1995) Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp 338–345
Jones MC, Marron JS, Sheather SJ (1996) A brief survey of bandwidth selection for density estimation. J Am Stat Assoc 91(433):401–407
Khamis A, Ismail Z, Haron K (2005) The effects of outliers data on neural network performance. J Appl Sci 5:1394–1398
Liano K (1996) Robust error measure for supervised neural network learning with outliers. IEEE Trans Neural Netw 7(1):246–250
Liu X, Gao C, Li P (2012) A comparative analysis of support vector machines and extreme learning machines. Neural Netw 33:58–66
Mirza B, Lin ZP, Toh KA (2013) Weighted online sequential extreme learning machine for class imbalance learning. Neural Process Lett 38(3):465–486
Parzen E (1962) On estimation of a probability density function and mode. Ann Math Stat 33(3):1065–1076
Samet S, Miri A (2012) Privacy-preserving back-propagation and extreme learning machine algorithms. Data Knowl Eng 79:40–61
Toh KA (2008) Deterministic neural classification. Neural Comput 20(6):1565–1595
Wang XZ, Shao QY, Miao Q, Zhai JH (2013) Architecture selection for networks trained with extreme learning machine using localized generalization error model. Neurocomputing 102:3–9
Wang XZ, He YL, Wang DD (2014) Non-naive Bayesian classifiers for classification problems with continuous attributes. IEEE Trans Cybern 44(1):21–39
Xu Y, Dong ZY, Zhao JH, Zhang P, Wong KP (2012) A reliable intelligent system for real-time dynamic security assessment of power systems. IEEE Trans Power Syst 27(3):1253–1263
Zhang WB, Ji HB (2013) Fuzzy extreme learning machine for classification. Electron Lett 49(7):448–450
Zhao G, Shen Z, Miao C, Man Z (2009) On improving the conditioning of extreme learning machine: a linear case. In: Proceedings of the International Conference on Information, Communications and Signal Processing, pp 1–5
Zhu QY, Qin AK, Suganthan PN, Huang GB (2005) Evolutionary extreme learning machine. Pattern Recogn 38(10):1759–1763
Zong W, Huang GB, Chen Y (2013) Weighted extreme learning machine for imbalance learning. Neurocomputing 101:229–242
Acknowledgments
The authors are very grateful to the editors and anonymous reviewers, whose many valuable and constructive comments and suggestions helped us significantly improve this work.
Wang, J., Zhang, L., Cao, Jj. et al. NBWELM: naive Bayesian based weighted extreme learning machine. Int. J. Mach. Learn. & Cyber. 9, 21–35 (2018). https://doi.org/10.1007/s13042-014-0318-1