Computational Statistics, Volume 29, Issue 1–2, pp 3–35

Model-based boosting in R: a hands-on tutorial using the R package mboost

  • Benjamin Hofner
  • Andreas Mayr
  • Nikolay Robinzonov
  • Matthias Schmid
Original Paper

Abstract

We provide a detailed hands-on tutorial for the R add-on package mboost. The package implements boosting for optimizing general risk functions, utilizing component-wise (penalized) least squares estimates as base-learners for fitting various kinds of generalized linear and generalized additive models to potentially high-dimensional data. We give a theoretical background and demonstrate how mboost can be used to fit interpretable models of different complexity. As a running example throughout the tutorial, we use mboost to predict body fat from anthropometric measurements.
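
To make the described workflow concrete, the following is a minimal R sketch of the kind of analysis the tutorial walks through: a linear and an additive model fitted by component-wise boosting with mboost, followed by cross-validated early stopping. The data location (bodyfat in the TH.data package), the response name DEXfat, the chosen predictors, and the initial stopping iteration are illustrative assumptions, not details taken from the abstract above.

  library("mboost")
  ## Assumed example data: body fat and anthropometric measurements,
  ## shipped as "bodyfat" in the TH.data package; response assumed to be DEXfat.
  data("bodyfat", package = "TH.data")

  ## Linear model fitted by component-wise (least squares) boosting
  glm1 <- glmboost(DEXfat ~ hipcirc + kneebreadth + anthro3a,
                   data = bodyfat,
                   control = boost_control(mstop = 100))  # mstop = 100 assumed
  coef(glm1, off2int = TRUE)   # coefficients with the offset added to the intercept

  ## Additive model with P-spline base-learners for smooth effects
  gam1 <- gamboost(DEXfat ~ bbs(hipcirc) + bbs(kneebreadth) + bbs(anthro3a),
                   data = bodyfat)

  ## Tune the number of boosting iterations by bootstrap cross-validation
  cvm <- cvrisk(gam1, folds = cv(model.weights(gam1), type = "bootstrap"))
  gam1[mstop(cvm)]             # set the model to the estimated optimal mstop

Early stopping of this kind is what makes boosting act as a regularized, variable-selecting estimator rather than a fully converged fit.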

Keywords

Boosting · Component-wise functional gradient descent · Generalized additive models · Tutorial


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Benjamin Hofner (1)
  • Andreas Mayr (1)
  • Nikolay Robinzonov (2)
  • Matthias Schmid (1)

  1. Department of Medical Informatics, Biometry and Epidemiology, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany
  2. Department of Statistics, Ludwig-Maximilians-Universität München, Munich, Germany
