
Weighted Bagging for Graph Based One-Class Classifiers

  • Conference paper
Multiple Classifier Systems (MCS 2010)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5997)


Abstract

Most conventional learning algorithms require both positive and negative training data to achieve accurate classification results. However, the problem of learning classifiers from only positive data arises in many applications where negative data are too costly, too difficult to obtain, or not available at all. The Minimum Spanning Tree Class Descriptor (MST_CD) was presented as a method that achieves better accuracy than other one-class classifiers on high-dimensional data. However, the presence of outliers in the target class severely harms the performance of this classifier. In this paper we propose two bagging strategies for MST_CD that reduce the influence of outliers in the training data. We show the improved performance on both real and artificially contaminated data.
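To make the idea concrete, the following is a minimal sketch of an MST-based one-class descriptor combined with plain bootstrap bagging. It is not the paper's weighted-bagging algorithm; the function names (`mst_edges`, `dist_to_mst`, `bagged_mst_cd`, `score`) and all parameter choices are illustrative assumptions. A test point is scored by its distance to the nearest point on any MST edge, averaged over bootstrap-trained descriptors so that a single outlier in the target class affects only some ensemble members.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist


def mst_edges(X):
    # Build the full pairwise-distance graph and extract the MST edge list.
    D = cdist(X, X)
    T = minimum_spanning_tree(D).tocoo()
    return list(zip(T.row, T.col))


def dist_to_mst(x, X, edges):
    # Distance from x to the closest point on any MST edge
    # (orthogonal projection clipped to the segment).
    best = np.inf
    for i, j in edges:
        a, b = X[i], X[j]
        ab = b - a
        t = np.clip(np.dot(x - a, ab) / (np.dot(ab, ab) + 1e-12), 0.0, 1.0)
        best = min(best, np.linalg.norm(x - (a + t * ab)))
    return best


def bagged_mst_cd(X, n_bags=10, frac=0.8, rng=None):
    # Train n_bags MST descriptors, each on a bootstrap subsample of the
    # target class; duplicates are removed to avoid zero-length edges.
    rng = np.random.default_rng(rng)
    models = []
    for _ in range(n_bags):
        idx = rng.choice(len(X), size=max(2, int(frac * len(X))), replace=True)
        Xb = X[np.unique(idx)]
        models.append((Xb, mst_edges(Xb)))
    return models


def score(models, x):
    # Ensemble score: mean distance to the MSTs; lower = more target-like.
    return np.mean([dist_to_mst(x, Xb, E) for Xb, E in models])
```

A point is then accepted as a target if its score falls below a threshold fixed on the training data (e.g. so that a chosen fraction of training points is accepted); unweighted averaging is the simplest choice, whereas the paper's strategies weight the bootstrap samples to further suppress outliers.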




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Seguí, S., Igual, L., Vitrià, J. (2010). Weighted Bagging for Graph Based One-Class Classifiers. In: El Gayar, N., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2010. Lecture Notes in Computer Science, vol 5997. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12127-2_1


  • DOI: https://doi.org/10.1007/978-3-642-12127-2_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12126-5

  • Online ISBN: 978-3-642-12127-2

  • eBook Packages: Computer Science (R0)
