Neighborhood Random Classification

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7301)

Abstract

Ensemble methods (EMs) have become increasingly popular in data mining because of their efficiency. These methods generate a set of classifiers using one or several machine learning algorithms (MLAs) and aggregate them into a single classifier, the meta-classifier (MC). Among MLAs, k-nearest neighbors (kNN) is one of the best known used in the context of EMs. However, setting the parameter k can be difficult, a drawback shared by all instance-based MLAs. Here, we propose an approach based on neighborhood graphs as an alternative. Thanks to such graphs, e.g., relative neighborhood graphs (RNGs) or Gabriel graphs (GGs), we obtain a more general approach with fewer arbitrary parameters. Neighborhood graphs have never been introduced into EM approaches before. The results of our algorithm, Neighborhood Random Classification, are very promising, as they match those of the best EM approaches, such as Random Forest or SVM-based ensembles. In this exploratory and experimental work, we present the methodological approach and extensive comparative results.
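To make the idea concrete, here is a minimal sketch of such a neighborhood-graph ensemble, not the authors' published procedure: it assumes Euclidean distance, a plain majority vote among a query point's graph neighbors as the base classifier, and bootstrap resampling as the source of ensemble diversity (all illustrative choices); the names rng_neighbors and rng_ensemble_predict are hypothetical. An RNG edge connects the query to a training point p iff no third point r is strictly closer to both endpoints than they are to each other.

# Illustrative Python sketch of classification via a relative neighborhood
# graph (RNG); an assumption-laden toy, not the authors' implementation.
import numpy as np
from collections import Counter

def rng_neighbors(query, X):
    # Indices i such that (query, X[i]) is an RNG edge: no point X[j]
    # satisfies max(d(query, X[j]), d(X[i], X[j])) < d(query, X[i]).
    d_q = np.linalg.norm(X - query, axis=1)         # distances query -> X[j]
    neighbors = []
    for i in range(len(X)):
        d_i = np.linalg.norm(X - X[i], axis=1)      # distances X[i] -> X[j]
        blocked = np.maximum(d_q, d_i) < d_q[i]     # X[j] closer to both endpoints
        blocked[i] = False                          # a point cannot block its own edge
        if not blocked.any():
            neighbors.append(i)
    return neighbors

def rng_ensemble_predict(query, X, y, n_estimators=25, seed=None):
    # Aggregate majority votes of RNG neighbors over bootstrap resamples.
    gen = np.random.default_rng(seed)
    votes = []
    for _ in range(n_estimators):
        idx = gen.integers(0, len(X), size=len(X))  # bootstrap sample
        Xb, yb = X[idx], y[idx]
        nbrs = rng_neighbors(query, Xb)             # graph neighbors replace a fixed k
        votes.append(Counter(yb[nbrs]).most_common(1)[0][0])
    return Counter(votes).most_common(1)[0][0]      # plurality vote of the ensemble

Note that the neighborhood adapts to the local geometry instead of a fixed k, which is the point of the approach. Swapping the blocking condition for d(query, r)^2 + d(p, r)^2 < d(query, p)^2 would give the Gabriel graph (GG) variant mentioned in the abstract; the naive construction above costs O(n^2) distance tests per query.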





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zighed, D.A., Ezzeddine, D., Rico, F. (2012). Neighborhood Random Classification. In: Tan, P.N., Chawla, S., Ho, C.K., Bailey, J. (eds.) Advances in Knowledge Discovery and Data Mining. PAKDD 2012. Lecture Notes in Computer Science (LNAI), vol. 7301. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30217-6_9

  • DOI: https://doi.org/10.1007/978-3-642-30217-6_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-30216-9

  • Online ISBN: 978-3-642-30217-6

  • eBook Packages: Computer Science, Computer Science (R0)
