Abstract
The k-nearest neighbor (KNN) classifier is a simple and effective classification approach, but improving its performance remains attractive. Combining multiple classifiers is an effective technique for improving classification accuracy. Many general combining algorithms, such as Bagging, Boosting, and Error-Correcting Output Coding, significantly improve classifiers such as decision trees, rule learners, and neural networks. Unfortunately, these combining methods do not improve nearest neighbor classifiers. In this paper we present a new approach to combining multiple KNN classifiers based on different distance functions, applying multiple distance functions to improve the performance of the k-nearest neighbor classifier. The proposed algorithm seeks to increase generalization accuracy compared to the basic k-nearest neighbor algorithm. Experiments were conducted on benchmark datasets from the UCI Machine Learning Repository, and the results show that the proposed algorithm improves the performance of k-nearest neighbor classification.
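The idea described in the abstract can be illustrated with a minimal sketch: train one KNN classifier per distance function and combine their predictions by majority vote. This is not the paper's exact algorithm; the three distance functions below (Euclidean, Manhattan, Chebyshev) and the plain majority-vote combination are assumptions chosen for illustration.

```python
import math
from collections import Counter

# Illustrative distance functions (the paper may use different ones,
# e.g. heterogeneous distance functions for mixed attribute types).
def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def chebyshev(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def knn_predict(train, query, k, dist):
    """Plain k-NN: majority label among the k nearest training points."""
    neighbors = sorted(train, key=lambda xy: dist(xy[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

def combined_knn_predict(train, query, k, dists):
    """One k-NN classifier per distance function, combined by majority vote."""
    votes = Counter(knn_predict(train, query, k, d) for d in dists)
    return votes.most_common(1)[0][0]

# Toy example: two well-separated classes in 2-D.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B"), ((1.1, 0.9), "B")]
print(combined_knn_predict(train, (0.15, 0.1), 3,
                           [euclidean, manhattan, chebyshev]))  # "A"
```

Because the component classifiers share the training set and differ only in their distance function, their errors can still be correlated; the paper's contribution is showing that this kind of distance-based diversity nonetheless improves generalization accuracy on benchmark data.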
References
Bay, S.D.: Combining Nearest Neighbor Classifiers Through Multiple Feature Subsets. Intelligent Data Analysis 3(3), 191–209 (1999)
Bao, Y., Du, X., Ishii, N.: Combining Feature Selection with Feature Weighting for k-NN Classifier. In: Yin, H., Allinson, N.M., Freeman, R., Keane, J.A., Hubbard, S. (eds.) IDEAL 2002. LNCS, vol. 2412, pp. 461–468. Springer, Heidelberg (2002)
Bao, Y., Ishii, N.: Combining multiple k-Nearest Neighbor Classifiers for Text Classification by Reducts. In: Lange, S., Satoh, K., Smith, C.H. (eds.) DS 2002. LNCS, vol. 2534, pp. 361–368. Springer, Heidelberg (2002)
Cover, T.M., Hart, P.E.: Nearest Neighbor Pattern Classification. IEEE Transactions on Information Theory 13(1), 21–27 (1967)
Itqon, Kaneko, S., Igarashi, S.: Combining Multiple k-Nearest Neighbor Classifiers Using Feature Combinations. Journal IECI 2(3), 23–319 (2000)
Merz, C.J., Murphy, P.M.: UCI Repository of Machine Learning Databases. University of California Irvine, Department of Information and Computer Science, Irvine, CA (1998), http://www.ics.uci.edu/mlearn/MLRepository.html
Stanfill, C., Waltz, D.: Toward memory-based reasoning. Communications of the ACM 29, 1213–1228 (1986)
Tapia, R.A., Thompson, J.R.: Nonparametric Probability Density Estimation. The Johns Hopkins University Press, Baltimore (1978)
Wilson, D.R., Martinez, T.R.: Improved Heterogeneous Distance Functions. Journal of Artificial Intelligence Research 6(1), 1–34 (1997)
Wilson, D.R., Martinez, T.R.: An Integrated Instance-Based Learning Algorithm. Computational Intelligence 16(1), 1–28 (2000)
Wilson, D.R., Martinez, T.R.: Reduction Techniques for Instance-Based Learning Algorithms. Machine Learning 38(3), 257–280 (2000)
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Bao, Y., Ishii, N., Du, X. (2004). Combining Multiple k-Nearest Neighbor Classifiers Using Different Distance Functions. In: Yang, Z.R., Yin, H., Everson, R.M. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2004. IDEAL 2004. Lecture Notes in Computer Science, vol 3177. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28651-6_93
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22881-3
Online ISBN: 978-3-540-28651-6