
Tomographic Considerations in Ensemble Bias/Variance Decomposition

  • Conference paper
Multiple Classifier Systems (MCS 2010)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5997)


Abstract

Classifier decision fusion has been shown to act in a manner analogous to the back-projection of Radon transformations when individual classifier feature sets are non-overlapping or only partially overlapping. It is possible, via this analogy, to demonstrate that standard linear classifier fusion introduces a morphological bias into the decision space due to the implicit angular undersampling of the feature selection process. In standard image-based (e.g. medical) tomography, removal of this bias involves a filtration process, and an analogous n-dimensional process can be shown to exist for decision fusion using Högbom deconvolution.
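
Högbom deconvolution is the iterative CLEAN algorithm from radio astronomy: repeatedly locate the brightest residual point, subtract a scaled copy of the point-spread function centred there, and accumulate the subtracted components as the deblurred model. As a hedged illustration only (the paper's n-dimensional decision-space variant is not reproduced here, and the Gaussian PSF and two-source signal are invented for the demo), a minimal 1-D sketch:

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, n_iter=500, threshold=1e-6):
    """Minimal 1-D Hogbom CLEAN: iteratively subtract scaled, shifted
    copies of the PSF at the brightest residual point, accumulating
    the subtracted amplitudes as a delta-function model."""
    residual = dirty.astype(float).copy()
    model = np.zeros_like(residual)
    centre = int(np.argmax(psf))  # PSF peak defines its origin
    for _ in range(n_iter):
        peak = int(np.argmax(np.abs(residual)))
        amp = residual[peak]
        if abs(amp) < threshold:
            break
        model[peak] += gain * amp
        # subtract a shifted, scaled PSF (np.roll wraps; fine for this demo
        # because the Gaussian tails are negligible at the array edges)
        residual -= gain * amp * np.roll(psf, peak - centre)
    return model, residual

# Demo: a "dirty" signal made of two point sources blurred by a Gaussian PSF
x = np.arange(64)
psf = np.exp(-0.5 * ((x - 32) / 2.0) ** 2)
dirty = np.roll(psf, -12) + 0.5 * np.roll(psf, 13)  # sources at 20 and 45
model, residual = hogbom_clean(dirty, psf)
```

The recovered `model` concentrates the blurred flux back into (near-)delta components at the true source positions, which is the sense in which the deconvolution removes the morphological bias introduced by the blurring kernel.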

Countering the biasing process implicit in linear fusion, however, is the fact that back-projection of Radon transformations (being additive) should act to reduce variance within the composite decision space. In principle, this additive variance reduction should still apply to tomographically-filtered back-projection, unless the filtration process contravenes it.
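
The additive variance reduction claimed here is the standard effect of averaging independent noisy estimates of the same quantity: the variance of the mean falls as 1/N. A toy Monte Carlo check (the true score 0.7, noise level 0.2, and ensemble size 16 are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_classifiers, n_trials = 16, 20_000

# Each column plays the role of one classifier's noisy estimate of the
# same underlying posterior score; noise is independent across columns.
scores = 0.7 + rng.normal(0.0, 0.2, size=(n_trials, n_classifiers))

single_var = scores[:, 0].var()        # variance of one classifier
fused_var = scores.mean(axis=1).var()  # variance of the averaged (fused) output
ratio = single_var / fused_var         # ~ n_classifiers for independent noise
```

The ratio comes out close to the ensemble size, as expected when the noise terms are independent; correlated classifier errors (the usual case when feature sets overlap) would reduce the gain.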

We therefore argue that when feature selection is carried out independently for each classifier (as in, e.g., multi-modal problems), unfiltered decision fusion, while in general variance-decreasing, is typically also bias-increasing. By employing a shot-noise model, we seek to quantify how far filtration acts to rectify this problem, such that feature selection can be made both bias- and variance-reducing within an ensemble fusion context.




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Windridge, D. (2010). Tomographic Considerations in Ensemble Bias/Variance Decomposition. In: El Gayar, N., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2010. Lecture Notes in Computer Science, vol 5997. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12127-2_5


  • DOI: https://doi.org/10.1007/978-3-642-12127-2_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12126-5

  • Online ISBN: 978-3-642-12127-2

