A classification learning algorithm robust to irrelevant features

  • H. Altay Güvenir
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1480)


The presence of irrelevant features is a fact of life in many real-world applications of classification learning. Although nearest-neighbor classification algorithms have emerged as a promising approach to machine learning tasks owing to their high predictive accuracy, they are adversely affected by such irrelevant features. In this paper, we describe a recently proposed classification algorithm called VFI5, which achieves accuracy comparable to nearest-neighbor classifiers while remaining robust to irrelevant features. The paper compares the nearest-neighbor classifier and the VFI5 algorithm in the presence of irrelevant features on both artificially generated data sets and real-world data sets selected from the UCI repository.
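To make the voting-feature-intervals idea concrete, the following is a minimal sketch, not the authors' VFI5 implementation: the real VFI5 constructs point and range intervals from class endpoints and normalizes votes by class frequencies, whereas this simplification records one [min, max] interval per feature and class and lets each feature split a single vote among matching classes. All names here are illustrative.

```python
# Simplified sketch of classification by voting feature intervals
# (the idea behind VFI5); NOT the authors' exact algorithm.
from collections import defaultdict

def fit_intervals(X, y):
    """For each feature and class, record the [min, max] interval
    of values seen in training examples of that class."""
    intervals = defaultdict(dict)  # intervals[feature][label] = (lo, hi)
    for f in range(len(X[0])):
        for label in set(y):
            vals = [x[f] for x, c in zip(X, y) if c == label]
            intervals[f][label] = (min(vals), max(vals))
    return intervals

def predict(intervals, query):
    """Each feature casts one vote, split equally among the classes
    whose interval contains the query value; the class with the
    highest total vote wins."""
    votes = defaultdict(float)
    for f, per_class in intervals.items():
        hits = [c for c, (lo, hi) in per_class.items()
                if lo <= query[f] <= hi]
        for c in hits:
            votes[c] += 1.0 / len(hits)  # vote shared among matches
    return max(votes, key=votes.get)

# Toy data: feature 0 separates the classes; feature 1 is irrelevant noise.
X = [(1.0, 5.0), (1.2, 9.0), (3.0, 5.5), (3.3, 8.7)]
y = ["a", "a", "b", "b"]
model = fit_intervals(X, y)
print(predict(model, (1.1, 8.0)))  # -> a
```

Note how the sketch reflects the robustness claim: an irrelevant feature, whose class intervals overlap heavily, splits its vote across classes and thus contributes little to the final decision, while a relevant feature votes decisively for one class.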


Keywords: Feature Selection, Irrelevant Feature, Average Classification Accuracy, Vote Mechanism, Point Interval




  1. Aha, D., Kibler, D., Albert, M.: Instance-based Learning Algorithms. Machine Learning 6 (1991) 37–66
  2. Almuallim, H., Dietterich, T.G.: Learning with many irrelevant features. In: Proceedings of the 9th National Conference on Artificial Intelligence. AAAI Press, Menlo Park (1991) 547–552
  3. Cardie, C.: Automating Feature Set Selection for Case-Based Learning of Linguistic Knowledge. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing, University of Pennsylvania (1996) 113–126
  4. Christopher, J.M., Murphy, P.M.: UCI repository of machine learning databases (1998)
  5. Demiröz, G.: Non-Incremental Classification Learning Algorithms based on Voting Feature Intervals. MSc. Thesis, Bilkent University, Dept. of Computer Engineering and Information Science, Ankara, Turkey (1997)
  6. Demiröz, G., Güvenir, H.A., İlter, N.: Differential Diagnosis of Erythemato-Squamous Diseases using Voting Feature Intervals. In: Ciftcibasi, T., Karaman, M., Atalay, V. (Eds.): New Trends in Artificial Intelligence and Neural Networks (TAINN'97), Kızılcahamam, Turkey (May 22–23, 1997) 190–194
  7. Demiröz, G., Güvenir, H.A.: Classification by Voting Feature Intervals. In: van Someren, M., Widmer, G. (Eds.): Machine Learning: ECML-97. Lecture Notes in Computer Science, Vol. 1224. Springer-Verlag, Berlin (1997) 85–92
  8. Domingos, P.: Context-sensitive feature selection for lazy learners. Artificial Intelligence Review 11 (1997) 227–253
  9. Güvenir, H.A., Acar, B., Demiröz, G., Çekin, A.: A Supervised Machine Learning Algorithm for Arrhythmia Analysis. In: Computers in Cardiology, Vol. 24. Lund, Sweden (1997) 433–436
  10. Güvenir, H.A., Akkuş, A.: Weighted K Nearest Neighbor Classification on Feature Projections. In: Kuru, S., Çağlayan, M.U., Akın, H.L. (Eds.): Proceedings of the Twelfth International Symposium on Computer and Information Sciences (ISCIS XII). Antalya, Turkey (1997) 44–51
  11. Güvenir, H.A., Şirin, İ.: Classification by Feature Partitioning. Machine Learning 23 (1996) 47–67
  12. Kohavi, R., Langley, P., Yun, Y.: The Utility of Feature Weighting in Nearest-Neighbor Algorithms. In: van Someren, M., Widmer, G. (Eds.): Machine Learning: ECML-97. Lecture Notes in Computer Science, Vol. 1224. Springer-Verlag, Berlin (1997) 85–92
  13. Langley, P.: Selection of Relevant Features in Machine Learning. In: Proceedings of the AAAI Fall Symposium on Relevance. New Orleans, USA, AAAI Press (1994)
  14. Liu, H., Setiono, R.: A probabilistic approach to feature selection — A filter solution. In: Saitta, L. (Ed.): Proceedings of the Thirteenth International Conference on Machine Learning (ICML'96), Italy (1996) 319–327
  15. Skalak, D.: Prototype and feature selection by sampling and random mutation hill climbing algorithms. In: Proceedings of the Eleventh International Machine Learning Conference (ICML-94). Morgan Kaufmann, New Brunswick (1994) 293–301
  16. Wettschereck, D., Aha, D.W., Mohri, T.: A Review and Empirical Evaluation of Feature Weighting Methods for a Class of Lazy Learning Algorithms. Artificial Intelligence Review 11 (1997) 273–314

Copyright information

© Springer-Verlag Berlin Heidelberg 1998

Authors and Affiliations

  • H. Altay Güvenir
  1. Department of Computer Engineering and Information Science, Bilkent University, Ankara, Turkey
