A Prototype Selection Algorithm Using Fuzzy k-Important Nearest Neighbor Method

  • Zhen-Xing Zhang
  • Xue-Wei Tian
  • Sang-Hong Lee
  • Joon S. Lim
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 215)


The k-Nearest Neighbor (KNN) algorithm is widely used as a simple and effective classification algorithm. Its main advantage is its simplicity, but its main shortcoming is its computational cost on large training sets. Prototype Selection (PS) methods address this drawback by reducing the training set to a smaller set of representative prototypes. This paper presents a new PS algorithm, the Fuzzy k-Important Nearest Neighbor (FKINN) algorithm, which introduces an important-nearest-neighbor selection rule: when the training set is classified with FKINN, the samples selected most often as nearest neighbors are defined as important nearest neighbors and retained as prototypes. To verify the performance of the algorithm, five UCI benchmark databases are considered. Experiments show that the algorithm effectively deletes redundant or irrelevant prototypes while maintaining the same level of classification accuracy as the KNN algorithm.
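The abstract describes the selection rule only at a high level, so the following is a minimal, hypothetical Python sketch of the general idea: count how often each training sample is selected among the k nearest neighbors of other samples, and keep the most frequently selected ("important") ones as prototypes. The function name `select_prototypes`, the `keep_ratio` parameter, and the same-class counting rule are assumptions for illustration; the fuzzy membership weighting of the actual FKINN method is omitted.

```python
import numpy as np

def select_prototypes(X, y, k=3, keep_ratio=0.3):
    """Sketch of importance-based prototype selection (not the exact
    FKINN procedure): each training sample votes for its k nearest
    same-class neighbors, and the most-selected samples are kept."""
    n = len(X)
    counts = np.zeros(n, dtype=int)
    # Pairwise Euclidean distances; a sample is never its own neighbor.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    for i in range(n):
        for j in np.argsort(d[i])[:k]:
            # Assumption: only same-class selections count as "important".
            if y[j] == y[i]:
                counts[j] += 1
    # Keep the top keep_ratio fraction of most-selected samples.
    m = max(1, int(round(keep_ratio * n)))
    keep = np.argsort(-counts)[:m]
    return np.sort(keep)
```

The retained indices can then back a standard KNN classifier, trading a one-time O(n^2) selection pass for cheaper queries on the reduced prototype set.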


Keywords: k-nearest neighbor (KNN) · Prototype selection (PS) · Fuzzy k-important nearest neighbor (FKINN)



This research was supported by the MKE (The Ministry of Knowledge Economy), Korea, under the Convergence-ITRC (Convergence Information Technology Research Center) support program (NIPA-2012-H0401-12-1001) supervised by the NIPA (National IT Industry Promotion Agency).

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2012R1A1A2044134).



Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Zhen-Xing Zhang (1)
  • Xue-Wei Tian (2)
  • Sang-Hong Lee (2)
  • Joon S. Lim (2)
  1. School of Information and Electric Engineering, Ludong University, Yantai, China
  2. IT College, Gachon University, Seongnam, South Korea