
Practical Bias Variance Decomposition

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5360)

Abstract

Bias-variance decomposition for classifiers is a useful tool for understanding classifier behavior. Unfortunately, the literature does not provide consistent guidelines on how to apply a bias-variance decomposition. This paper examines the various parameters and variants of empirical bias-variance decompositions through an extensive simulation study. Based on this study, we recommend using ten-fold cross-validation as the sampling method, taking 100 samples within each fold, with a test set size of at least 2000. Only if the learning algorithm is stable may fewer samples, a smaller test set size, or a lower number of folds be justified.
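The procedure the abstract describes — repeatedly resampling training sets and measuring how predictions fluctuate on a fixed test set — can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's code) of the Kohavi–Wolpert zero-one-loss decomposition using a single resampling pool rather than the recommended ten-fold scheme; the nearest-centroid learner, the function names, and the synthetic data are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_centroid_predict(X_train, y_train, X_test):
    # A deliberately simple learner: classify each test point by the
    # nearest class centroid of the training data.
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    dists = ((X_test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(dists, axis=1)]

def kohavi_wolpert_bv(X, y, X_test, y_test, n_samples=100, n_classes=2):
    # Estimate P(y_hat = c | x) by training on n_samples random subsets
    # (drawn without replacement from the pool, a simplifying assumption)
    # and counting how often each class is predicted per test point.
    counts = np.zeros((len(X_test), n_classes))
    for _ in range(n_samples):
        idx = rng.choice(len(X), size=len(X) // 2, replace=False)
        preds = nearest_centroid_predict(X[idx], y[idx], X_test)
        counts[np.arange(len(X_test)), preds] += 1
    p_hat = counts / n_samples
    p_true = np.eye(n_classes)[y_test]  # deterministic target labels assumed
    # Kohavi-Wolpert terms for zero-one loss, averaged over the test set:
    bias2 = 0.5 * ((p_true - p_hat) ** 2).sum(axis=1).mean()
    variance = 0.5 * (1.0 - (p_hat ** 2).sum(axis=1)).mean()
    return bias2, variance

# Synthetic two-class problem: two overlapping Gaussian blobs.
X = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)
X_te = np.vstack([rng.normal(0, 1, (1000, 2)), rng.normal(2, 1, (1000, 2))])
y_te = np.array([0] * 1000 + [1] * 1000)

bias2, var = kohavi_wolpert_bv(X, y, X_te, y_te)
print(f"bias^2 = {bias2:.3f}, variance = {var:.3f}")
```

For deterministic labels, bias² plus variance under this decomposition equals the expected zero-one loss, which is one way to sanity-check an implementation. The test set here is 2000 points and 100 resamples are taken, matching the sizes the abstract recommends.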





Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bouckaert, R.R. (2008). Practical Bias Variance Decomposition. In: Wobcke, W., Zhang, M. (eds) AI 2008: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 5360. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-89378-3_24


  • DOI: https://doi.org/10.1007/978-3-540-89378-3_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-89377-6

  • Online ISBN: 978-3-540-89378-3

  • eBook Packages: Computer Science (R0)
