
Probabilistic Feature Selection in Machine Learning

  • Indrajit Ghosh
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10841)

Abstract

In machine learning, Case-Based Reasoning (CBR) is a prominent technique for harvesting knowledge from past experiences. Past experiences are represented as a repository of cases, each described by a set of features. However, not every feature is equally relevant in describing a case, and measuring the relevance of each feature is a central issue. A subset of relevant features can describe a case with adequate accuracy, and selecting an appropriate subset improves system performance and reduces dimensionality. In case-based domains, feature selection is the process of choosing such a subset. Many real domains are inherently case based, with features expressed as linguistic variables. Numerous feature subset selection algorithms have been proposed to assign a numerical weight to each linguistic feature, but the weights are usually determined by subjective judgement or on a trial-and-error basis.

This work presents an alternative approach. It can be applied efficiently to select relevant linguistic features by measuring their relevance as numerical probabilities, and it can also rule out irrelevant and noisy features. Applications of this approach in several real-world domains show excellent performance.
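The abstract does not spell out the scoring procedure, so the following Python sketch only illustrates the general idea of assigning a probability-based relevance score to linguistic (categorical) features in a case base. The feature names, the example data, and the use of a total-variation distance between the class distribution conditioned on a feature value and the overall class prior are illustrative assumptions, not the author's method.

```python
# A minimal, illustrative sketch (not the paper's exact algorithm) of scoring
# linguistic (categorical) features by a probability-based relevance measure.
# Assumption: a case is a dict of feature name -> linguistic value, with one
# class label per case. A feature scores high when knowing its value changes
# the class probabilities away from the overall prior.
from collections import Counter, defaultdict


def feature_relevance(cases, labels, feature):
    """Return a relevance score in [0, 1] for one linguistic feature."""
    n = len(cases)
    prior = Counter(labels)                      # overall class frequencies
    by_value = defaultdict(Counter)              # class counts per feature value
    for case, label in zip(cases, labels):
        by_value[case[feature]][label] += 1

    score = 0.0
    for value, class_counts in by_value.items():
        n_v = sum(class_counts.values())
        # total variation distance between P(class | value) and P(class)
        tv = 0.5 * sum(
            abs(class_counts[c] / n_v - prior[c] / n) for c in prior
        )
        score += (n_v / n) * tv                  # weight by P(value)
    return score


if __name__ == "__main__":
    # Hypothetical agricultural-style case base with linguistic feature values.
    cases = [
        {"leaf_colour": "yellow", "soil": "moist"},
        {"leaf_colour": "yellow", "soil": "dry"},
        {"leaf_colour": "green", "soil": "moist"},
        {"leaf_colour": "green", "soil": "dry"},
    ]
    labels = ["diseased", "diseased", "healthy", "healthy"]
    for f in ("leaf_colour", "soil"):
        print(f, round(feature_relevance(cases, labels, f), 3))
    # leaf_colour scores 0.5 (informative); soil scores 0.0 (irrelevant here)
```

In such a scheme, features whose scores fall below a chosen threshold would be treated as irrelevant or noisy and excluded from the case description; the threshold itself is another assumption left to the practitioner.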

Keywords

Probabilistic feature selection · Machine learning · Case-Based Reasoning


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Agro-Computing Research Laboratory, Department of Computer Science, Ananda Chandra College, Jalpaiguri, India
