Abstract
A boosting algorithm [1, 2] can be seen as a way to improve the fit of statistical models. Typically, M predictions are generated by applying a base procedure—called a weak learner—to M reweighted samples; in each reweighted sample, an individual weight is assigned to each observation. The final output is obtained by aggregating the M predictions through majority voting. Boosting is a sequential ensemble scheme, in the sense that the weight of an observation at step m depends only on step m − 1. Thus, a specific boosting scheme is obtained by choosing a loss function, which drives the data re-weighting mechanism, and a weak learner.
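The loop described above — reweight, fit a weak learner, aggregate by weighted majority vote — can be sketched with a minimal AdaBoost-style implementation [1, 2]. This is an illustrative sketch, not the chapter's kernel-based method: the decision-stump weak learner, function names, and toy data are assumptions for demonstration only.

```python
import numpy as np

def fit_stump(X, y, w):
    """Weak learner (illustrative choice): the 1-D decision stump
    minimising the weighted misclassification error under weights w."""
    best = (np.inf, None, None, None)  # (error, feature, threshold, polarity)
    n, d = X.shape
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, M=10):
    """Run M boosting rounds; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)            # start from uniform weights
    ensemble = []
    for m in range(M):
        err, j, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)          # guard against a perfect weak fit
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)  # up-weight misclassified points
        w /= w.sum()                    # weights at step m depend only on step m-1
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Aggregate the M weak predictions by weighted majority vote."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)

# Toy usage on a trivially separable 1-D sample.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
ensemble = adaboost(X, y, M=5)
print(predict(ensemble, X))  # recovers the training labels here
```

Swapping `fit_stump` for a different base procedure, or the exponential reweighting for one derived from another loss, yields a different boosting scheme, which is exactly the design freedom the abstract points to.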
References
R. E. Schapire, “The strength of weak learnability,” Machine Learning, vol. 5, pp. 197–227, 1990.
Y. Freund, “Boosting a weak learning algorithm by majority,” Information and Computation, vol. 121, pp. 256–285, 1995.
M. D. Marzio and C. C. Taylor, “Using small bias nonparametric density estimators for confidence interval estimation,” Journal of Nonparametric Statistics, vol. 21, pp. 229–240, 2009.
P. Bühlmann and T. Hothorn, “Boosting algorithms: regularization, prediction and model fitting,” Statistical Science, vol. 22, no. 4, pp. 477–505, 2007.
Y. Freund and R. E. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” in European Conference on Computational Learning Theory, pp. 23–37, 1995.
R. A. Jacobs, M. I. Jordan, S. J. Nowlan, and G. E. Hinton, “Adaptive mixtures of local experts,” Neural Computation, vol. 3, pp. 1–12, 1991.
M. C. Jones, O. Linton, and J. P. Nielsen, “A simple bias reduction method for density estimation,” Biometrika, vol. 82, pp. 327–338, 1995.
M. D. Marzio and C. C. Taylor, “Kernel density classification and boosting: an L2 analysis,” Statistics and Computing, vol. 15, pp. 113–123, 2005.
M. D. Marzio and C. C. Taylor, “On boosting kernel density methods for multivariate data: density estimation and classification,” Statistical Methods and Applications, vol. 14, pp. 163–178, 2005.
M. D. Marzio and C. C. Taylor, “Boosting kernel density estimates: A bias reduction technique?,” Biometrika, vol. 91, pp. 226–233, 2004.
J. Friedman, T. Hastie, and R. Tibshirani, “Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors),” Annals of Statistics, vol. 28, pp. 337–407, 2000.
I. S. Abramson, “On bandwidth variation in kernel estimates: a square root law,” The Annals of Statistics, vol. 10, pp. 1217–1223, 1982.
J. W. Tukey, Exploratory Data Analysis. Addison-Wesley, Philippines, 1977.
M. D. Marzio and C. C. Taylor, “On boosting kernel regression,” Journal of Statistical Planning and Inference, vol. 138, pp. 2483–2498, 2008.
J. A. Rice, “Boundary modifications for kernel regression,” Communications in Statistics – Theory and Methods, vol. 13, pp. 893–900, 1984.
M. C. Jones, “Simple boundary correction for kernel density estimation,” Statistics and Computing, vol. 3, pp. 135–146, 1993.
T. Gasser, H.-G. Müller, and V. Mammitzsch, “Kernels for nonparametric curve estimation,” Journal of the Royal Statistical Society, Series B, vol. 47, pp. 238–252, 1985.
© 2012 Springer Science+Business Media, LLC
Di Marzio, M., Taylor, C.C. (2012). Boosting Kernel Estimators. In: Zhang, C., Ma, Y. (eds) Ensemble Machine Learning. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9326-7_3
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-9325-0
Online ISBN: 978-1-4419-9326-7