A Discussion on the Classifier Projection Space for Classifier Combining

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2364)

Abstract

In classifier combining, one tries to fuse the information given by a set of base classifiers. One of the difficulties in this process is how to deal with the variability between classifiers. Although various measures and many combining rules have been proposed in the past, the problem of constructing optimal combiners is still heavily studied.

In this paper, we discuss and illustrate the possibilities of classifier embedding for analysing the variability of base classifiers as well as their combining rules. To this end, a space is constructed in which classifiers are represented as points. Such a low-dimensional space is a Classifier Projection Space (CPS). It is first used as a visual tool that gives more insight into the differences between various combining techniques, which we illustrate with examples. Finally, we discuss how the CPS may also serve as a basis for constructing new combining rules.
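The construction the abstract describes can be sketched in a few lines: measure a pairwise dissimilarity between classifiers (here, simple disagreement on a sample set) and embed the classifiers as points in a low-dimensional space. The sketch below uses classical (Torgerson) multidimensional scaling as the embedding; the names and the choice of disagreement as the dissimilarity are illustrative assumptions, not necessarily the exact construction used in the paper.

```python
import numpy as np

def disagreement_matrix(predictions):
    """predictions: (n_classifiers, n_samples) array of class labels.
    Returns the fraction of samples on which each pair of classifiers
    disagrees -- one simple classifier dissimilarity measure."""
    preds = np.asarray(predictions)
    n = preds.shape[0]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = np.mean(preds[i] != preds[j])
    return D

def classical_mds(D, dim=2):
    """Classical (Torgerson) MDS on a dissimilarity matrix D:
    double-centre the squared dissimilarities and take the top
    eigenvectors as coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:dim]          # largest eigenvalues first
    L = np.sqrt(np.maximum(w[order], 0.0))
    return V[:, order] * L                     # (n, dim) coordinates

# Toy example: three classifiers' label outputs on ten samples.
preds = np.array([
    [0, 1, 1, 0, 1, 0, 0, 1, 1, 0],   # classifier A
    [0, 1, 1, 0, 1, 0, 0, 1, 0, 0],   # classifier B (close to A)
    [1, 0, 1, 1, 0, 1, 1, 0, 1, 1],   # classifier C (very different)
])
cps = classical_mds(disagreement_matrix(preds), dim=2)
print(cps.shape)  # (3, 2): each classifier is a point in the 2-D CPS
```

In the resulting space, classifiers that agree on most samples land close together, so clusters of redundant base classifiers and outlying diverse ones become directly visible.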





Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pękalska, E., Duin, R.P.W., Skurichina, M. (2002). A Discussion on the Classifier Projection Space for Classifier Combining. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_14

  • DOI: https://doi.org/10.1007/3-540-45428-4_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43818-2

  • Online ISBN: 978-3-540-45428-1

  • eBook Packages: Springer Book Archive
