Abstract
We propose a new machine learning algorithm: meta-boosting. Boosting converts a weak learner into a strong learner by reweighting the distribution over the training examples. Although boosting is often described as reducing both bias and variance, it mainly reduces variance. Meta-learning, by contrast, coalesces the results of multiple learners to improve accuracy and is primarily a bias-reduction method. By combining boosting algorithms that use different weak learners under a meta-learning scheme, both bias and variance are reduced. Our experiments demonstrate that this meta-boosting algorithm not only outperforms the best of its base learners but also surpasses other recent algorithms.
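As a concrete illustration of this two-level scheme, the following is a minimal sketch in Python with scikit-learn: several AdaBoost ensembles, each boosting a different weak learner, are combined by a meta-learner trained on their cross-validated predictions (stacking). The particular base learners, the logistic-regression meta-learner, and the dataset are illustrative assumptions, not the paper's exact experimental setup.

# Sketch of the meta-boosting idea: boost several different weak learners,
# then combine the resulting ensembles with a meta-learner (stacking).
# Assumes scikit-learn; details below are illustrative, not the paper's setup.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Level 0: AdaBoost ensembles, each built on a different weak learner
# (boosting reduces variance within each ensemble).
boosters = [
    ("ada_stump", AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                     n_estimators=50, random_state=0)),
    ("ada_tree", AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                                    n_estimators=50, random_state=0)),
    ("ada_nb", AdaBoostClassifier(GaussianNB(),
                                  n_estimators=50, random_state=0)),
]

# Level 1: a meta-learner trained on the boosters' cross-validated
# predictions combines them (meta-learning reduces bias across ensembles).
meta_boost = StackingClassifier(estimators=boosters,
                                final_estimator=LogisticRegression(max_iter=1000),
                                cv=5)
meta_boost.fit(X_tr, y_tr)
print(f"meta-boosting test accuracy: {meta_boost.score(X_te, y_te):.3f}")

In this sketch the meta-learner sees only out-of-fold predictions of the base ensembles, so it learns how to weight them without overfitting to their training-set outputs.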
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Liu, X., Wang, X., Japkowicz, N., Matwin, S. (2013). An Ensemble Method Based on AdaBoost and Meta-Learning. In: Zaïane, O.R., Zilles, S. (eds.) Advances in Artificial Intelligence. Canadian AI 2013. Lecture Notes in Computer Science, vol. 7884. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38457-8_27