The AdaBoost Algorithm with Linear Modification of the Weights

  • Robert Burduk
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 681)


This paper presents a new extension of the AdaBoost algorithm that concerns the weights used in the algorithm: we propose a linear modification of the original weights. In our study we use boosting by reweighting, where each weak classifier is a linear classifier. The described algorithm was tested on the Pima data set, and the obtained results are compared with those of the original AdaBoost algorithm.
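The abstract above outlines the approach but does not give the exact update rule. As a minimal sketch of the idea, the following code implements AdaBoost by reweighting with a weighted least-squares linear classifier as the weak learner, and replaces the classic exponential weight update with a hypothetical linear one, `D * (1 ± alpha)`. The linear update shown here is an illustrative assumption, not the paper's actual formula; the function and variable names are likewise invented for this sketch.

```python
import numpy as np

def fit_linear_weak(X, y, w):
    """Weighted least-squares linear classifier; y in {-1, +1}, w sample weights."""
    Xa = np.hstack([X, np.ones((len(X), 1))])        # append bias column
    A = Xa.T @ (w[:, None] * Xa) + 1e-6 * np.eye(Xa.shape[1])  # ridge for stability
    return np.linalg.solve(A, Xa.T @ (w * y))        # coefficients beta

def predict_linear(beta, X):
    Xa = np.hstack([X, np.ones((len(X), 1))])
    return np.sign(Xa @ beta)

def adaboost_linear_reweight(X, y, T=10, linear_update=True):
    """AdaBoost by reweighting; linear_update=True uses the illustrative
    linear modification of the weights instead of the exponential one."""
    n = len(y)
    D = np.full(n, 1.0 / n)                          # uniform initial weights
    members = []
    for _ in range(T):
        beta = fit_linear_weak(X, y, D)
        pred = predict_linear(beta, X)
        err = np.clip(np.sum(D[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)        # standard member weight
        members.append((alpha, beta))
        if linear_update:
            # hypothetical linear reweighting: scale by (1 + alpha) on errors,
            # (1 - alpha) on correct points (floored to stay positive)
            factor = np.where(pred != y, 1 + alpha, max(1 - alpha, 1e-3))
        else:
            factor = np.exp(-alpha * y * pred)       # classic AdaBoost update
        D = D * factor
        D /= D.sum()                                 # renormalise to a distribution
    return members

def predict_ensemble(members, X):
    """Weighted-vote prediction of the boosted ensemble."""
    return np.sign(sum(a * predict_linear(b, X) for a, b in members))
```

Setting `linear_update=False` recovers the original AdaBoost reweighting, which is how the two variants compared in the paper can be run side by side on the same data.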


AdaBoost algorithm · Ensemble of classifiers · Linear classifier



This work was supported by the statutory funds of the Department of Systems and Computer Networks, Wroclaw University of Science and Technology.



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Department of Systems and Computer Networks, Wroclaw University of Science and Technology, Wroclaw, Poland
