Boosting in Linear Discriminant Analysis

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1857)

Abstract

In recent years, together with bagging [5] and the random subspace method [15], boosting [6] has become one of the most popular combining techniques for improving a weak classifier. Usually, boosting is applied to Decision Trees (DTs). In this paper, we study boosting in Linear Discriminant Analysis (LDA). Simulation studies, carried out on one artificial data set and two real data sets, show that boosting may be useful in LDA for large training sample sizes, while bagging is useful for critical training sample sizes [11]. In contrast to common opinion, we demonstrate that the usefulness of boosting does not depend on the instability of the classifier.
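For illustration, the following is a minimal sketch (not the authors' code) of boosting with LDA as the base classifier, in the spirit of the AdaBoost algorithm of Freund and Schapire [6], written in Python with scikit-learn. The synthetic data set, the number of boosting rounds, and the use of the resampling variant of boosting are illustrative assumptions; resampling is used here because scikit-learn's LDA does not accept sample weights directly.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
y = 2 * y - 1                       # recode class labels to {-1, +1}
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

n_rounds, n = 50, len(X_tr)
w = np.full(n, 1.0 / n)             # start with uniform sample weights
classifiers, alphas = [], []

for _ in range(n_rounds):
    # LDA here takes no sample weights, so we use the resampling variant
    # of boosting: draw a bootstrap sample according to the weights w.
    idx = rng.choice(n, size=n, replace=True, p=w)
    clf = LinearDiscriminantAnalysis().fit(X_tr[idx], y_tr[idx])
    pred = clf.predict(X_tr)
    err = w[pred != y_tr].sum()     # weighted training error
    if err >= 0.5:                  # base classifier is no longer weak
        break
    alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
    w *= np.exp(-alpha * y_tr * pred)   # up-weight misclassified samples
    w /= w.sum()
    classifiers.append(clf)
    alphas.append(alpha)

# Final decision: sign of the alpha-weighted vote of the LDA classifiers
votes = sum(a * c.predict(X_te) for a, c in zip(alphas, classifiers))
print("boosted LDA test accuracy:", np.mean(np.sign(votes) == y_te))

Varying n_samples in such a sketch loosely mirrors the comparison across training sample sizes discussed in the abstract: boosting is reported to help LDA for large training sets, whereas bagging helps at critical training sample sizes [11].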


References

  1. Jain, A.K., Chandrasekaran, B.: Dimensionality and Sample Size Considerations in Pattern Recognition Practice. In: Krishnaiah, P.R., Kanal, L.N. (eds.): Handbook of Statistics, Vol. 2. North-Holland, Amsterdam (1987) 835–855

  2. Friedman, J.H.: Regularized Discriminant Analysis. Journal of the American Statistical Association 84 (1989) 165–175

  3. An, G.: The Effects of Adding Noise During Backpropagation Training on a Generalization Performance. Neural Computation 8 (1996) 643–674

  4. Efron, B., Tibshirani, R.: An Introduction to the Bootstrap. Chapman and Hall, New York (1993)

  5. Breiman, L.: Bagging Predictors. Machine Learning 24(2) (1996) 123–140

  6. Freund, Y., Schapire, R.E.: Experiments with a New Boosting Algorithm. In: Machine Learning: Proceedings of the Thirteenth International Conference (1996) 148–156

  7. Schapire, R.E., Freund, Y., Bartlett, P., Lee, W.: Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods. The Annals of Statistics 26(5) (1998) 1651–1686

  8. Breiman, L.: Arcing Classifiers. The Annals of Statistics 26(3) (1998) 801–849

  9. Friedman, J., Hastie, T., Tibshirani, R.: Additive Logistic Regression: A Statistical View of Boosting. Technical Report (1999)

  10. Dietterich, T.G.: An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, to appear

  11. Skurichina, M., Duin, R.P.W.: Bagging for Linear Classifiers. Pattern Recognition 31(7) (1998) 909–930

  12. Fukunaga, K.: Introduction to Statistical Pattern Recognition. Academic Press (1990) 400–407

  13. Cortes, C., Vapnik, V.: Support-Vector Networks. Machine Learning 20 (1995) 273–297

  14. Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases. University of California, Irvine, Department of Information and Computer Science (1998) http://www.ics.uci.edu/~mlearn/MLRepository.html

  15. Ho, T.K.: The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8) (1998) 832–844

  16. Avnimelech, R., Intrator, N.: Boosted Mixture of Experts: An Ensemble Learning Scheme. Neural Computation 11 (1999) 483–497

  17. Skurichina, M., Duin, R.P.W.: The Role of Combining Rules in Bagging and Boosting. Submitted to S+SSPR 2000, Alicante, Spain

Author information

Skurichina, M., Duin, R.P.W.

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Skurichina, M., Duin, R.P.W. (2000). Boosting in Linear Discriminant Analysis. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_18

  • DOI: https://doi.org/10.1007/3-540-45014-9_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67704-8

  • Online ISBN: 978-3-540-45014-6

  • eBook Packages: Springer Book Archive
