Partially Corrective AdaBoost
We propose a novel algorithm to improve the ensemble performance of AdaBoost. Our algorithm makes two main contributions: (1) at each step it generates a distribution that is less correlated with the previous classifiers, which improves the efficiency of the search for new classifiers; (2) the classifier weights can be iteratively modified during training. In the proposed algorithm, the distribution is required to be corrective with respect to some of the previous classifiers, or to linear combinations of them. Experiments on datasets from the UCI Repository validate the new algorithm's effectiveness.
Keywords: Cost Function, Descent Direction, Training Error, Subset Selection, Variable Selection Method
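The abstract's two ideas — searching for a new classifier under a distribution decorrelated from earlier ones, and revisiting earlier classifier weights during training — can be illustrated with a minimal sketch. The paper's exact corrective update is not given here; the sketch below assumes 1-D decision stumps and models the "partially corrective" step as coordinate descent on the exponential loss over the most recent `K` classifier weights (the names `partially_corrective_adaboost`, `K`, and `passes` are illustrative, not from the paper).

```python
import numpy as np

def stump_predict(x, thresh, sign):
    """Decision stump on a 1-D feature: predicts `sign` where x > thresh, else -sign."""
    return sign * np.where(x > thresh, 1.0, -1.0)

def ensemble_margin(x, stumps, alphas):
    """Weighted-vote margin of the current ensemble."""
    m = np.zeros(len(x))
    for (thresh, sign), a in zip(stumps, alphas):
        m += a * stump_predict(x, thresh, sign)
    return m

def partially_corrective_adaboost(x, y, T=5, K=2, passes=3, eps=1e-10):
    """Sketch (assumed form, not the paper's exact rule): standard AdaBoost,
    plus a partially corrective step that re-fits the weights of the most
    recent K classifiers by coordinate descent on the exponential loss."""
    stumps, alphas = [], []
    for _ in range(T):
        # Sample distribution derived from current ensemble margins (standard AdaBoost).
        D = np.exp(-y * ensemble_margin(x, stumps, alphas))
        D /= D.sum()
        # Exhaustive search for the best stump under D.
        best = None
        for thresh in np.unique(x):
            for sign in (1.0, -1.0):
                err = D[stump_predict(x, thresh, sign) != y].sum()
                if best is None or err < best[0]:
                    best = (err, thresh, sign)
        err, thresh, sign = best
        err = np.clip(err, eps, 1 - eps)
        stumps.append((thresh, sign))
        alphas.append(0.5 * np.log((1 - err) / err))
        # Partially corrective step: re-fit the last K alphas, one at a time.
        for _ in range(passes):
            for j in range(max(0, len(stumps) - K), len(stumps)):
                thresh_j, sign_j = stumps[j]
                h_j = stump_predict(x, thresh_j, sign_j)
                # Exponential-loss weights with classifier j's vote removed.
                w = np.exp(-y * (ensemble_margin(x, stumps, alphas) - alphas[j] * h_j))
                e_j = np.clip(w[h_j != y].sum() / w.sum(), eps, 1 - eps)
                alphas[j] = 0.5 * np.log((1 - e_j) / e_j)
    return stumps, alphas
```

On a toy separable sample such as `x = [0..5]` with labels `[-1,-1,-1,1,1,1]`, the ensemble's signed margin recovers the labels after a few rounds; fully corrective boosting would re-fit all weights each round, and `K` interpolates between that and plain AdaBoost.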