Modified Reducts and Their Processing for Nearest Neighbor Classification

  • Naohiro Ishii
  • Ippei Torii
  • Yongguang Bao
  • Hidekazu Tanaka
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7435)


Dimensionality reduction remains important for representing and manipulating high-dimensional data, both in general data processing and on the web. The rough set concept is a fundamental and useful tool for processing such high-dimensional data. A reduct in rough set theory is a minimal subset of features that has almost the same discernibility power as the entire feature set. However, applying reducts directly to classification raises problems. Here, we develop a method that connects reducts with the nearest neighbor method to classify data with higher accuracy. To improve the classification ability of reducts, we propose a new modified reduct, built from ordinary reducts, together with an optimization method for higher classification accuracy. We then show that the modified reduct, followed by the optimized nearest neighbor classification, improves classification accuracy.
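The paper's modified-reduct construction and optimization are not given in this abstract. As a rough sketch of the general idea it describes, combining a reduct with nearest neighbor classification, the toy example below restricts the distance computation to a (hypothetical) reduct feature subset before taking a majority vote among the k nearest neighbors. The decision table, the reduct indices, and the function names are all illustrative assumptions, not the authors' method.

```python
from collections import Counter

# Toy decision table: rows are (feature vector, class label).
# A reduct is a minimal feature subset preserving discernibility;
# the indices below are purely illustrative, not from the paper.
table = [
    ((1.0, 0.2, 5.0, 3.1), "a"),
    ((0.9, 0.1, 4.8, 3.0), "a"),
    ((0.2, 0.9, 1.1, 0.4), "b"),
    ((0.1, 1.0, 1.0, 0.5), "b"),
]
reduct = (0, 2)  # hypothetical reduct: keep only features 0 and 2

def project(x, subset):
    """Restrict a feature vector to the reduct's attributes."""
    return tuple(x[i] for i in subset)

def knn_on_reduct(query, k=3):
    """Classify by majority vote among the k nearest neighbors,
    measuring squared Euclidean distance only over the reduct features."""
    q = project(query, reduct)
    dist = lambda v: sum((a - b) ** 2 for a, b in zip(project(v, reduct), q))
    neighbors = sorted(table, key=lambda row: dist(row[0]))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

print(knn_on_reduct((0.95, 0.5, 4.9, 2.0)))  # -> "a"
```

In the paper's setting, the reduct would be derived from the decision table's discernibility structure rather than fixed by hand, and the modified reduct would further adjust this subset to raise classification accuracy.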


Keywords: Classification Accuracy · High Dimensional Data · Near Neighbor · Decision Table · Neighbor Algorithm





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Naohiro Ishii (1)
  • Ippei Torii (1)
  • Yongguang Bao (2)
  • Hidekazu Tanaka (3)
  1. Aichi Institute of Technology, Yakusacho, Japan
  2. Aichi Information System, Kariya, Japan
  3. Daido University, Minamiku, Japan
