Feature Selection Using Distance from Classification Boundary and Monte Carlo Simulation

  • Yutaro Koyama
  • Kazushi Ikeda
  • Yuichi Sakumura
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11304)

Abstract

In binary classification, generalization to unknown samples improves when as many unnecessary features as possible are excluded from the sample representation. Among existing feature selection approaches, filter methods compute an index for each feature in advance, whereas wrapper methods search over combinations of features for the subset with the best classification performance. In this paper, we propose a novel feature selection method based on the distance of samples from the classification boundary combined with a Monte Carlo simulation. We constructed synthetic sample sets for binary classification and added features generated from random numbers to each sample. We then applied both the conventional methods and the proposed method to these sample sets and examined whether the feature forming the boundary was selected. Our results demonstrate that feature selection was difficult with the conventional methods but possible with the proposed method.
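
The following Python sketch illustrates the general idea described above under stated assumptions; it is not the authors' algorithm. It builds a synthetic binary classification set in which only one feature forms the boundary, appends noise features drawn from random numbers, fits a support vector machine, and scores each feature by how strongly Monte Carlo perturbations of that feature change the samples' signed distances to the decision boundary. The scoring rule (permutation of one feature at a time) and all parameter values are illustrative assumptions.

# Minimal sketch (assumption: permutation-based Monte Carlo scoring of
# distance-to-boundary changes; not the paper's exact procedure).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic binary classification: only feature 0 determines the class label.
n_samples, n_noise = 200, 4
x0 = rng.normal(size=n_samples)
y = (x0 > 0).astype(int)
X = np.column_stack([x0] + [rng.normal(size=n_samples) for _ in range(n_noise)])

# An RBF-kernel SVM supplies the classification boundary; decision_function
# gives each sample's signed distance from that boundary.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
base_dist = clf.decision_function(X)

# Monte Carlo scoring: repeatedly shuffle one feature at a time and record how
# much the distances to the boundary change; boundary-forming features should
# change them the most, noise features hardly at all.
n_trials = 50
scores = np.zeros(X.shape[1])
for j in range(X.shape[1]):
    for _ in range(n_trials):
        X_perm = X.copy()
        X_perm[:, j] = rng.permutation(X_perm[:, j])
        scores[j] += np.mean(np.abs(clf.decision_function(X_perm) - base_dist))
scores /= n_trials

print("feature scores (index 0 is the boundary-forming feature):", np.round(scores, 3))

Running this sketch, the score for feature 0 should dominate the scores of the noise features, which is the selection criterion this kind of boundary-distance approach relies on.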

Keywords

Feature selection · Support vector machine · Margin-based exploration · Monte Carlo method


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Yutaro Koyama (1)
  • Kazushi Ikeda (2)
  • Yuichi Sakumura (1, 2)
  1. Aichi Prefectural University, Nagakute, Japan
  2. Nara Institute of Science and Technology, Ikoma, Japan