
A Hybrid Approach of Boosting Against Noisy Data

  • Emna Bahri
  • Stephane Lallich
  • Nicolas Nicoloyannis
  • Maddouri Mondher
Part of the Studies in Computational Intelligence book series (SCI, volume 165)

Abstract

To reduce generalization error, a great deal of work has been devoted to classifier aggregation methods, which generally improve on the performance of a single classifier through voting techniques. Among these aggregation methods, boosting is one of the most practical, thanks to its adaptive update of the distribution over the examples, which increases the weight of misclassified examples exponentially. However, the method is criticized for overfitting and for its convergence speed, especially in the presence of noise. In this study, we propose a new approach based on modifications to the AdaBoost algorithm. We show that it is possible to improve the performance of boosting by exploiting the hypotheses generated in previous iterations to correct the weights of the examples. An experimental study demonstrates the value of this new approach, called the hybrid approach.
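For reference, the adaptive reweighting the abstract alludes to is the standard AdaBoost update (the well-known base rule, not the authors' hybrid modification): at iteration t, with weak hypothesis h_t, weighted error ε_t and labels y_i ∈ {−1, +1},

    \alpha_t = \frac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t},
    \qquad
    w_{t+1}(i) = \frac{w_t(i)\,\exp\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr)}{Z_t},

where Z_t normalizes the weights to sum to one. Misclassified examples (y_i h_t(x_i) = −1) have their weight multiplied by e^{\alpha_t}, which is the exponential growth mentioned in the abstract and the behavior that makes standard boosting sensitive to noisy examples.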

Keywords

Error Rate, Hybrid Approach, Noisy Data, Current Iteration, Generalization Error



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Emna Bahri 1, 2
  • Stephane Lallich 1, 2
  • Nicolas Nicoloyannis 1, 2
  • Maddouri Mondher 1, 2
  1. ERIC Laboratory, University of Lyon 2, Bron Cedex, France
  2. INSAT, Zone Urbaine la Charguia II, Tunis, Tunisia
