Abstract
The Minimal Learning Machine (MLM) is a recently proposed supervised learning algorithm with performance comparable to most state-of-the-art machine learning methods. In this work, we propose ensemble methods for classification and regression using MLMs. The goal of ensemble strategies is to produce models that are more robust and accurate than a single classifier or regression model. Despite its successful application, the MLM employs a computationally intensive optimization problem as part of its test procedure (out-of-sample estimation). This cost becomes even more noticeable in the context of ensemble learning, where multiple models are used. Aiming to provide fast alternatives to the standard MLM, we also propose the Nearest Neighbor Minimal Learning Machine and the Cubic Equation Minimal Learning Machine, to cope with classification and single-output regression problems, respectively. The experimental assessment conducted on real-world datasets shows that ensembles of fast MLMs perform comparably or superiorly to reference machine learning algorithms.
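To make the idea concrete, the sketch below illustrates, under simplifying assumptions, the two MLM ingredients the abstract refers to: training reduces to a linear least-squares fit between input-space and output-space distance matrices, and the nearest-neighbor variant replaces the costly out-of-sample optimization with a simple arg-min over estimated output distances. All function names (`train_mlm`, `predict_nn_mlm`) and the choice of using every training point as a reference point are illustrative, not the authors' implementation.

```python
import numpy as np

def train_mlm(X, Y, R_x, R_y):
    """Fit the MLM's linear map B between distance matrices.

    X, Y: training inputs and outputs (2-D arrays).
    R_x, R_y: reference points in input and output space
    (here assumed to be rows drawn from the training data)."""
    # Pairwise Euclidean distances: N training points x K references
    Dx = np.linalg.norm(X[:, None, :] - R_x[None, :, :], axis=2)
    Dy = np.linalg.norm(Y[:, None, :] - R_y[None, :, :], axis=2)
    # Least-squares solution of Dx @ B = Dy
    B, *_ = np.linalg.lstsq(Dx, Dy, rcond=None)
    return B

def predict_nn_mlm(x, B, R_x, ref_labels):
    """Nearest-neighbor output estimation for classification:
    instead of solving a multilateration problem in output space,
    return the label of the reference point whose estimated
    output distance is smallest."""
    dx = np.linalg.norm(x - R_x, axis=1)  # distances in input space
    dy_hat = dx @ B                       # estimated output distances
    return ref_labels[np.argmin(dy_hat)]

# Toy usage: two well-separated 2-D clusters, one-hot outputs,
# every training point used as a reference point.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(3.0, 0.1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
Y = np.eye(2)[y]
B = train_mlm(X, Y, X, Y)
print(predict_nn_mlm(np.array([0.05, 0.0]), B, X, y))  # expected class 0
print(predict_nn_mlm(np.array([3.1, 2.9]), B, X, y))   # expected class 1
```

Because the test step avoids the iterative output-space optimization entirely, this variant is what makes large MLM ensembles computationally practical.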
Acknowledgements
The authors would like to thank the Brazilian National Council for Scientific and Technological Development (CNPq) for the financial support (Grant 456837/2014-0).
Cite this article
Mesquita, D.P.P., Gomes, J.P.P. & Souza Junior, A.H. Ensemble of Efficient Minimal Learning Machines for Classification and Regression. Neural Process Lett 46, 751–766 (2017). https://doi.org/10.1007/s11063-017-9587-5