Abstract
Combining machine learning models is a means of improving overall accuracy. Various algorithms have been proposed to create aggregate models from other models, and two popular examples for classification are Bagging and AdaBoost. In this paper we examine their adaptation to regression, and benchmark them on synthetic and real-world data. Our experiments reveal that different types of AdaBoost algorithms require different complexities of base models. They outperform Bagging at their best, but Bagging achieves a consistent level of success with all base models, providing a robust alternative.
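To make the Bagging side of the comparison concrete, here is a minimal sketch of bagged regression, not the authors' exact experimental setup: base regressors are fit on bootstrap resamples and their predictions are averaged. The choice of a degree-3 polynomial base model and the synthetic noisy-sine data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bagging_regression(X, y, n_models=25, seed=None):
    """Fit base regressors on bootstrap resamples of (X, y) and
    return a function that averages their predictions."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        # bootstrap resample: draw n indices with replacement
        idx = rng.integers(0, n, size=n)
        # base model: degree-3 polynomial least-squares fit (an assumption)
        models.append(np.polyfit(X[idx], y[idx], deg=3))

    def predict(x):
        # aggregate by averaging the base models' predictions
        return np.mean([np.polyval(c, x) for c in models], axis=0)

    return predict

# synthetic regression data: noisy sine wave
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=200)
y = np.sin(X) + rng.normal(scale=0.3, size=200)

predict = bagging_regression(X, y, n_models=25, seed=1)
print(predict(np.array([0.0, 1.0])))
```

Because each base model sees a different bootstrap sample, averaging reduces the variance of the ensemble relative to any single fit, which is consistent with the robustness of Bagging observed in the experiments.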
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Barutçuoğlu, Z., Alpaydın, E. (2003). A Comparison of Model Aggregation Methods for Regression. In: Kaynak, O., Alpaydin, E., Oja, E., Xu, L. (eds) Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. Lecture Notes in Computer Science, vol 2714. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44989-2_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40408-8
Online ISBN: 978-3-540-44989-8