Abstract
The exceptional capabilities of the celebrated Real AdaBoost ensembles for solving decision and classification problems are universally recognized. These capabilities come from progressively constructing weak, unstable learners that pay more attention to the samples that are harder to classify correctly, and from linearly combining those learners as they are built. However, the resulting emphasis can be excessive, especially under intense noise or in the presence of outliers. Although many modifications have been proposed to control the emphasis, they show limited success on imbalanced or asymmetric problems. In this paper, we use the neighborhood concept to design a simple modification of the emphasis mechanism that can deal with these situations, and that can also be combined with other emphasis control mechanisms. Experimental results confirm the potential of the proposed modification, both by itself and when combined with a previously tested mixed emphasis algorithm. The main conclusions of our work and some suggestions for further research along this line close the paper.
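To make the emphasis mechanism concrete, the following is a minimal sketch of Real AdaBoost with decision stumps, plus a hypothetical neighborhood-based weight smoothing step. The abstract does not specify the paper's actual rule, so `neighborhood_smooth` (averaging each sample's weight over its k nearest neighbors) is purely an illustrative assumption, as is the crude stump-selection score.

```python
import numpy as np

def neighborhood_smooth(weights, X, k=3):
    # Hypothetical smoothing rule (NOT the paper's): replace each
    # sample's emphasis weight by the mean over its k nearest
    # neighbors, so isolated noisy samples cannot hoard emphasis.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k]   # includes the sample itself
    smoothed = weights[idx].mean(axis=1)
    return smoothed / smoothed.sum()

def real_adaboost(X, y, rounds=10, eps=1e-10, smooth=False):
    # y must be in {-1, +1}. Weak learners are median-threshold
    # stumps with real-valued (confidence-rated) outputs, as in
    # Schapire and Singer's Real AdaBoost.
    n = len(y)
    w = np.full(n, 1.0 / n)   # emphasis distribution over samples
    F = np.zeros(n)           # accumulated ensemble output
    for _ in range(rounds):
        # Pick the feature whose median split best separates the
        # weighted classes (a crude selection heuristic for brevity).
        best = None
        for j in range(X.shape[1]):
            thr = np.median(X[:, j])
            mask = X[:, j] > thr
            score = abs(w[mask & (y == 1)].sum() - w[mask & (y == -1)].sum())
            if best is None or score > best[0]:
                best = (score, mask)
        _, mask = best
        # Confidence-rated output: half log-odds of the weighted
        # class probabilities within each region of the stump.
        f = np.zeros(n)
        for region in (mask, ~mask):
            p_pos = w[region & (y == 1)].sum()
            p_neg = w[region & (y == -1)].sum()
            f[region] = 0.5 * np.log((p_pos + eps) / (p_neg + eps))
        F += f
        # Standard emphasis update: misclassified samples gain weight.
        w = w * np.exp(-y * f)
        w /= w.sum()
        if smooth:
            w = neighborhood_smooth(w, X)
    return np.sign(F)
```

The intent of the smoothing step is that a sample's emphasis is moderated by its neighborhood: an outlier surrounded by correctly classified samples of the other class sees its weight pulled down, limiting the excessive emphasis the abstract warns about.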
Acknowledgments
This work has been partly supported by Grant TIN 2011-24533 (Spanish MEC).
Cite this article
Ahachad, A., Omari, A. & Figueiras-Vidal, A.R. Neighborhood Guided Smoothed Emphasis for Real Adaboost Ensembles. Neural Process Lett 42, 155–165 (2015). https://doi.org/10.1007/s11063-014-9386-1