
Neighborhood Guided Smoothed Emphasis for Real Adaboost Ensembles


Abstract

The exceptional capabilities of the celebrated Real Adaboost ensembles for solving decision and classification problems are universally recognized. These capabilities come from progressively constructing unstable, weak learners that pay more attention to the samples that are harder to classify correctly, and from linearly combining those learners as they are built. However, this emphasis can become excessive, especially under intense noise or in the presence of outliers. Although many modifications have been proposed to control the emphasis, they show limited success on imbalanced or asymmetric problems. In this paper, we use the neighborhood concept to design a simple modification of the emphasis mechanism that can deal with these situations and that can also be combined with other emphasis control mechanisms. Experimental results confirm the potential of the proposed modification, both on its own and when combined with a previously tested mixed emphasis algorithm. The main conclusions of our work and some suggestions for further research along this line close the paper.
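To make the mechanism concrete, the following minimal Python sketch pairs the standard Real Adaboost emphasis update (w_i <- w_i * exp(-y_i f(x_i)), as in confidence-rated boosting) with a simple neighborhood smoothing step, in which each sample's emphasis is averaged over its k nearest neighbors so that an isolated noisy sample or outlier cannot accumulate runaway weight. The smoothing rule, the value of k, and the toy learner are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
# Minimal sketch: Real Adaboost emphasis update plus a neighborhood-smoothed
# emphasis. The smoothing rule is an illustrative assumption, not the paper's
# exact mechanism.
import numpy as np

def real_adaboost_update(w, y, f):
    """Standard Real Adaboost emphasis update: w_i <- w_i * exp(-y_i * f_i),
    followed by renormalization (confidence-rated boosting)."""
    w = w * np.exp(-y * f)
    return w / w.sum()

def neighborhood_smooth(w, X, k=5):
    """Replace each sample's emphasis by the mean emphasis over its k nearest
    neighbors (self included), damping extreme weights on isolated noisy
    samples. Illustrative assumption for this sketch."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)  # squared distances
    idx = np.argsort(d2, axis=1)[:, :k]                      # k nearest per sample
    w_s = w[idx].mean(axis=1)                                # averaged emphasis
    return w_s / w_s.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)          # linearly separable labels
    y[rng.choice(200, size=10, replace=False)] *= -1    # inject label noise
    w = np.full(200, 1.0 / 200)                         # uniform initial emphasis
    f = 0.5 * (X[:, 0] + X[:, 1])                       # toy confidence-rated output
    w = real_adaboost_update(w, y, f)
    print("max emphasis before smoothing:", w.max())
    print("max emphasis after smoothing: ", neighborhood_smooth(w, X, k=7).max())
```

In a full boosting round, the smoothed emphasis would then be used to train the next weak learner, which is where the damping of noisy and outlying samples takes effect.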



Acknowledgments

This work was partly supported by Grant TIN 2011-24533 (Spanish MEC).

Author information

Correspondence to Anas Ahachad.

About this article

Cite this article

Ahachad, A., Omari, A. & Figueiras-Vidal, A.R. Neighborhood Guided Smoothed Emphasis for Real Adaboost Ensembles. Neural Process Lett 42, 155–165 (2015). https://doi.org/10.1007/s11063-014-9386-1
