Ensemble Learning

Chapter in Data Fusion: Concepts and Ideas

Introduction

The subject of this chapter is ensemble learning, in which our system is characterized by an ensemble of M models. The models may all share a single common representational format, or each model may have its own distinct representational format. To make the discussion more concrete we shall concentrate on the (supervised) classification of an object O using a multiple classifier system (MCS). Given an unknown object O, our goal is to optimally assign it to one of K classes, \(c_k\), \(k \in \{1,2,\ldots,K\}\), using an ensemble of M (supervised) classifiers, \(S_m\), \(m \in \{1,2,\ldots,M\}\). The theory of multiple classifier systems suggests that if the pattern of errors made by one classifier, \(S_m\), differs from the pattern of errors made by another classifier, \(S_n\), then we may exploit this difference to obtain a more accurate and more reliable classification of O. If the error rates of the individual classifiers are all less than \(\frac{1}{2}\), then the MCS error rate, \(E_{\text{MCS}}\), should decrease as the number of classifiers, M, and the mean diversity, \(\bar{\sigma}\), between the classifiers increase.
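
As a back-of-the-envelope check on this claim, the following sketch (not from the chapter; the function name and the loop are illustrative, and the classifiers are assumed to err mutually independently, which is the idealized limit of maximum diversity) computes the error rate of a simple majority-vote MCS. Each of the M classifiers errs independently with probability eps, so the vote is wrong exactly when a majority of the classifiers err, i.e. with the upper-tail probability of a Binomial(M, eps) distribution:

from math import comb

def majority_vote_error(M, eps):
    """Error rate of a majority vote over M independent classifiers,
    each with individual error rate eps (M is assumed odd, so no ties)."""
    assert M % 2 == 1, "use an odd M to avoid ties"
    # The ensemble errs exactly when more than half the classifiers err:
    # the upper tail of a Binomial(M, eps) distribution.
    return sum(comb(M, k) * eps**k * (1 - eps)**(M - k)
               for k in range((M + 1) // 2, M + 1))

for M in (1, 3, 11, 51):
    print(M, majority_vote_error(M, 0.3))

With eps = 0.3 the majority-vote error falls from 0.300 at M = 1 to 0.216 at M = 3 and continues toward zero as M grows, illustrating why \(E_{\text{MCS}}\) decreases with M once the individual error rates are below \(\frac{1}{2}\); correlated (less diverse) classifiers would show a slower decay.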

Author information

Correspondence to H. B. Mitchell.

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Mitchell, H.B. (2012). Ensemble Learning. In: Data Fusion: Concepts and Ideas. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27222-6_14

  • DOI: https://doi.org/10.1007/978-3-642-27222-6_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-27221-9

  • Online ISBN: 978-3-642-27222-6

  • eBook Packages: Engineering (R0)
