Nearest Neighbor Classification by Relearning

  • Naohiro Ishii
  • Yuta Hoki
  • Yuki Okada
  • Yongguang Bao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5788)

Abstract

The k-nearest neighbor (kNN) classifier is well known in data classification because it is simple and effective. Nevertheless, improving the classifier's performance remains attractive for applications that demand high accuracy. A tolerant rough set is taken as the basis for classifying data, and the classification itself is realized by applying kNN with a distance function. To improve classification accuracy, a distance function with feature weights is considered, and the weights are optimized by a genetic algorithm. After learning on the training data, unknown data are classified by kNN with the learned distance function. To further improve the performance of the kNN classifier, a relearning method is proposed. The proposed relearning method shows higher generalization accuracy than the basic kNN with a distance function and other conventional learning algorithms. Experiments were conducted on benchmark datasets from the UCI Machine Learning Repository.
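The core idea in the abstract, a kNN classifier whose per-feature distance weights are tuned by a genetic algorithm on the training data, can be sketched as follows. This is a minimal illustration under assumed settings (population size, mutation rate, truncation selection, leave-one-out fitness), not the authors' actual implementation; all function names are hypothetical.

```python
import random

def weighted_distance(a, b, w):
    """Weighted Euclidean distance between feature vectors a and b."""
    return sum(wi * (ai - bi) ** 2 for wi, ai, bi in zip(w, a, b)) ** 0.5

def knn_predict(train, labels, x, w, k=3):
    """Classify x by majority vote among its k nearest training points."""
    order = sorted(range(len(train)),
                   key=lambda i: weighted_distance(train[i], x, w))
    votes = [labels[i] for i in order[:k]]
    return max(set(votes), key=votes.count)

def accuracy(train, labels, w, k=3):
    """Leave-one-out accuracy of weighted kNN on the training set
    (used here as the GA fitness function, an illustrative choice)."""
    hits = 0
    for i in range(len(train)):
        rest = train[:i] + train[i + 1:]
        rest_labels = labels[:i] + labels[i + 1:]
        if knn_predict(rest, rest_labels, train[i], w, k) == labels[i]:
            hits += 1
    return hits / len(train)

def evolve_weights(train, labels, n_features, pop=20, gens=30, k=3, seed=0):
    """Tiny GA: truncation selection, uniform crossover, one-gene mutation."""
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_features)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda w: -accuracy(train, labels, w, k))
        elite = scored[: pop // 2]          # keep the fitter half
        children = []
        while len(elite) + len(children) < pop:
            p1, p2 = rng.sample(elite, 2)   # uniform crossover of two parents
            child = [g1 if rng.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]
            j = rng.randrange(n_features)   # Gaussian mutation of one gene
            child[j] = max(0.0, child[j] + rng.gauss(0, 0.2))
            children.append(child)
        population = elite + children
    return max(population, key=lambda w: accuracy(train, labels, w, k))
```

After `evolve_weights` returns, an unknown sample is classified with `knn_predict` using the evolved weight vector; the paper's relearning step, which is not detailed in the abstract, would repeat this optimization after the initial learning phase.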

References

  1. Bay, S.D.: Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets. Intelligent Data Analysis 3(3), 191–209 (1999)
  2. Bao, Y., Ishii, N.: Combining Multiple k-Nearest Neighbor Classifiers for Text Classification by Reducts. In: Lange, S., Satoh, K., Smith, C.H. (eds.) DS 2002. LNCS (LNAI), vol. 2534, pp. 340–347. Springer, Heidelberg (2002)
  3. Ishii, N., Muai, T., Yamada, T., Bao, Y.: Classification by Weighting, Similarity and kNN. In: Corchado, E., Yin, H., Botti, V., Fyfe, C. (eds.) IDEAL 2006. LNCS, vol. 4224, pp. 57–64. Springer, Heidelberg (2006)
  4. Cover, T.M., Hart, P.E.: Nearest Neighbor Pattern Classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
  5. Wilson, D.R., Martinez, T.R.: An Integrated Instance-based Learning Algorithm. Computational Intelligence 16(1), 1–28 (2000)
  6. Merz, C.J., Murphy, P.M.: UCI Repository of Machine Learning Databases. Irvine, CA (1998), http://www.ics.uci.edu/~mlearn/MLRepository.html

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Naohiro Ishii (1)
  • Yuta Hoki (1)
  • Yuki Okada (1)
  • Yongguang Bao (1)

  1. Aichi Institute of Technology, Toyota, Japan