
International Journal of Fuzzy Systems, Volume 21, Issue 2, pp. 639–654

Prototypes Reduction and Feature Selection based on Fuzzy Boundary Area for Nearest Neighbor Classifiers

  • Tae-Chon Ahn
  • Seok-Beom Roh
  • Yong Soo Kim
  • Jihong Wang

Abstract

For prototype-based classifiers, classification time grows with the number of prototypes, so determining the class label of a query datum can become prohibitively slow. Many researchers have therefore sought to reduce the number of prototypes without degrading the classification ability of prototype-based classifiers. In this paper, we introduce a new method for generating prototypes, based on the assumption that prototypes positioned near the boundary surface are the most important for improving the classification ability of nearest neighbor classifiers. The central issue of this paper is how to place the new prototypes as close as possible to the boundary surface. To this end, we combine possibilistic C-means clustering and conditional fuzzy C-means clustering: the clusters obtained with possibilistic C-means clustering are used to define the boundary areas, and conditional fuzzy C-means clustering is then used to determine the locations of the prototypes within those boundary areas. The design procedure is illustrated with numeric examples that provide a thorough insight into the effectiveness of the proposed method.
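The two-stage idea in the abstract can be sketched in code. The following is a minimal, simplified illustration, not the authors' implementation: it scores each training point by a possibilistic-style typicality ratio between its two best class centers (a stand-in for the paper's PCM-based boundary-area definition), then runs conditional fuzzy C-means (Pedrycz) per class, using the boundary score as the condition so that prototypes are pulled toward the boundary area. All function names, the typicality-ratio heuristic, and parameters such as `per_class` are assumptions made for this sketch.

```python
import numpy as np

def typicality(X, center, eta, m=2.0):
    # Possibilistic-style typicality: t = 1 / (1 + (d^2 / eta)^(1/(m-1)))
    d2 = np.sum((X - center) ** 2, axis=1)
    return 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1)))

def boundary_scores(X, y):
    # Score each point by the ratio of its second-best to best class
    # typicality; values near 1 indicate proximity to a class boundary.
    T = []
    for c in np.unique(y):
        center = X[y == c].mean(axis=0)
        eta = np.mean(np.sum((X[y == c] - center) ** 2, axis=1))
        T.append(typicality(X, center, eta))
    T = np.sort(np.vstack(T), axis=0)      # classes x n, ascending
    return T[-2] / (T[-1] + 1e-12)

def conditional_fcm(X, f, n_proto, m=2.0, iters=50, seed=0):
    # Conditional FCM: memberships of point k sum to f[k] instead of 1,
    # so prototypes gravitate toward high-condition (boundary) points.
    rng = np.random.default_rng(seed)
    V = X[rng.choice(len(X), n_proto, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(-1) + 1e-12
        w = d2 ** (-1.0 / (m - 1))
        U = f[:, None] * w / w.sum(axis=1, keepdims=True)
        Um = U ** m
        V = (Um.T @ X) / (Um.sum(axis=0)[:, None] + 1e-12)
    return V

def reduce_prototypes(X, y, per_class=2):
    # Generate a small set of boundary-oriented prototypes per class.
    f = boundary_scores(X, y)
    protos, labels = [], []
    for c in np.unique(y):
        mask = y == c
        protos.append(conditional_fcm(X[mask], f[mask], per_class))
        labels.extend([c] * per_class)
    return np.vstack(protos), np.array(labels)

def nn_predict(P, Py, Xq):
    # 1-NN classification against the reduced prototype set.
    d = ((Xq[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    return Py[d.argmin(axis=1)]
```

On two well-separated Gaussian blobs, `reduce_prototypes(X, y, per_class=2)` yields four prototypes, and `nn_predict` over those four points recovers most training labels, illustrating the intended trade-off: far fewer prototypes, little loss in accuracy.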

Keywords

Prototype-based classifier · Possibilistic C-means clustering · Prototype reduction · Conditional C-means clustering · Boundary area definition · Nearest neighbor classifier

Notes

Acknowledgement

This paper was supported by Wonkwang University in 2017.


Copyright information

© Taiwan Fuzzy Systems Association and Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Electronics Convergence Engineering, Wonkwang University, Iksan, Korea
  2. Department of Electrical Engineering, University of Suwon, Hwasung, Korea
  3. Department of Computer Engineering, Daejeon University, Daejeon, South Korea
  4. Electronics and Information Engineering, Hengshui University, Hengshui, China
