Boosting in Linear Discriminant Analysis

  • Marina Skurichina
  • Robert P. W. Duin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1857)

Abstract

In recent years, together with bagging [5] and the random subspace method [15], boosting [6] has become one of the most popular combining techniques for improving a weak classifier. Usually, boosting is applied to Decision Trees (DT's). In this paper, we study boosting in Linear Discriminant Analysis (LDA). Simulation studies, carried out on one artificial data set and two real data sets, show that boosting may be useful in LDA for large training sample sizes, while bagging is useful for critical training sample sizes [11]. In contrast to a common opinion, we demonstrate that the usefulness of boosting does not depend on the instability of the classifier.
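The boosting procedure studied here follows the AdaBoost scheme of Freund and Schapire [6], with an LDA base classifier in place of a decision tree. The sketch below is a minimal illustration of that idea, not the paper's exact experimental setup: it uses scikit-learn's `LinearDiscriminantAnalysis` as the weak learner and applies the sample weights by weighted resampling (the LDA fit used here takes no per-sample weights); labels are assumed to be coded as -1/+1.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def boost_lda(X, y, n_rounds=10, rng=None):
    """AdaBoost.M1-style boosting with LDA as the base classifier.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns a function that predicts labels for new data.
    """
    rng = np.random.default_rng(rng)
    n = len(X)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    models, alphas = [], []
    for _ in range(n_rounds):
        # Apply the weights by resampling the training set
        idx = rng.choice(n, size=n, p=w)
        clf = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = w[pred != y].sum()      # weighted training error
        if err >= 0.5:                # no better than chance: stop
            break
        err = max(err, 1e-10)         # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight of this round
        w *= np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()
        models.append(clf)
        alphas.append(alpha)

    def predict(Xt):
        # Weighted majority vote over the boosted LDA classifiers
        votes = sum(a * m.predict(Xt) for a, m in zip(alphas, models))
        return np.sign(votes)

    return predict
```

On data where a single linear discriminant is already near-optimal, the ensemble mainly redistributes weight toward boundary objects; the paper's point is that any benefit of this for LDA appears at large training sample sizes rather than from instability of the base classifier.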


References

  1. Jain, A.K., Chandrasekaran, B.: Dimensionality and Sample Size Considerations in Pattern Recognition Practice. In: Krishnaiah, P.R., Kanal, L.N. (eds.): Handbook of Statistics, Vol. 2. North-Holland, Amsterdam (1987) 835–855
  2. Friedman, J.H.: Regularized Discriminant Analysis. JASA 84 (1989) 165–175
  3. An, G.: The Effects of Adding Noise During Backpropagation Training on a Generalization Performance. Neural Computation 8 (1996) 643–674
  4. Efron, B., Tibshirani, R.: An Introduction to the Bootstrap. Chapman and Hall, New York (1993)
  5. Breiman, L.: Bagging Predictors. Machine Learning 24(2) (1996) 123–140
  6. Freund, Y., Schapire, R.E.: Experiments with a New Boosting Algorithm. In: Machine Learning: Proceedings of the Thirteenth International Conference (1996) 148–156
  7. Schapire, R.E., Freund, Y., Bartlett, P., Lee, W.: Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods. The Annals of Statistics 26(5) (1998) 1651–1686
  8. Breiman, L.: Arcing Classifiers. The Annals of Statistics 26(3) (1998) 801–849
  9. Friedman, J., Hastie, T., Tibshirani, R.: Additive Logistic Regression: A Statistical View of Boosting. Technical Report (1999)
  10. Dietterich, T.G.: An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, to appear
  11. Skurichina, M., Duin, R.P.W.: Bagging for Linear Classifiers. Pattern Recognition 31(7) (1998) 909–930
  12. Fukunaga, K.: Introduction to Statistical Pattern Recognition. Academic Press (1990) 400–407
  13. Cortes, C., Vapnik, V.: Support-Vector Networks. Machine Learning 20 (1995) 273–297
  14. Blake, C.L., Merz, C.J.: UCI Repository of Machine Learning Databases. University of California, Irvine, Department of Information and Computer Science (1998) http://www.ics.uci.edu/~mlearn/MLRepository.html
  15. Ho, T.K.: The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8) (1998) 832–844
  16. Avnimelech, R., Intrator, N.: Boosted Mixture of Experts: An Ensemble Learning Scheme. Neural Computation 11 (1999) 483–497
  17. Skurichina, M., Duin, R.P.W.: The Role of Combining Rules in Bagging and Boosting. Submitted to S+SSPR 2000, Alicante, Spain

Copyright information

© Springer-Verlag Berlin Heidelberg 2000

Authors and Affiliations

  • Marina Skurichina (1)
  • Robert P. W. Duin (1)
  1. Pattern Recognition Group, Department of Applied Physics, Faculty of Applied Sciences, Delft University of Technology, Delft, The Netherlands