Abstract
The effectiveness of the k-NN classifier depends heavily on the parameter k, which is chosen in advance and remains fixed during classification. Different values are appropriate for different datasets, and parameter tuning is usually unavoidable. Moreover, a dataset may simultaneously contain well-separated classes, poorly separated classes, and noise in different regions of the metric space; thus, a different k value should be used depending on the region in which the unclassified instance lies. This paper proposes a new algorithm with five heuristics for dynamic k determination. The heuristics rely on a fast clustering pre-processing procedure that builds an auxiliary data structure describing the region in which the unclassified instance lies. The heuristics exploit this information to determine dynamically how many neighbours are examined. Neither the data structure construction nor the heuristics involve any input parameters. The proposed heuristics are evaluated on several datasets, and the experimental results show that in many cases they achieve higher classification accuracy than the k-NN classifier with the best tuned k value.
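The core idea of the abstract, choosing k per query from properties of the region the query falls into, can be sketched as follows. This is a hedged illustration only, not the paper's actual method: the pre-processing here is plain Lloyd's k-means with a per-cluster class-purity score, and the rule "use k = 1 in a homogeneous region, a larger k in a mixed one" is a simple stand-in for the paper's five heuristics. All function names are hypothetical.

```python
from collections import Counter
import numpy as np

def build_regions(X, y, n_clusters=8, seed=0):
    # Clustering pre-processing (a stand-in for the paper's procedure):
    # random-init Lloyd's k-means, then record each cluster's class purity.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)].astype(float)
    for _ in range(20):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(n_clusters):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(axis=0)
    purity = []
    for c in range(n_clusters):
        cls = y[labels == c]
        # Fraction of the majority class in the cluster (1.0 = homogeneous).
        purity.append(Counter(cls).most_common(1)[0][1] / len(cls) if len(cls) else 1.0)
    return centers, np.array(purity)

def dynamic_knn_predict(x, X, y, centers, purity):
    # Choose k from the homogeneity of the region (nearest cluster):
    # homogeneous region -> k = 1; mixed region -> examine more neighbours.
    c = int(np.argmin(((centers - x) ** 2).sum(axis=1)))
    k = 1 if purity[c] == 1.0 else 7  # hypothetical rule, not a paper heuristic
    idx = np.argsort(((X - x) ** 2).sum(axis=1))[:k]
    return Counter(y[idx]).most_common(1)[0][0]
```

Note that, as in the paper, classification itself needs no user-supplied k: the pre-computed per-region statistics decide how many neighbours to examine for each individual query.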
© 2020 Springer Nature Switzerland AG
Cite this paper
Ougiaroglou, S., Evangelidis, G., Diamantaras, K.I. (2020). Dynamic k-NN Classification Based on Region Homogeneity. In: Darmont, J., Novikov, B., Wrembel, R. (eds) New Trends in Databases and Information Systems. ADBIS 2020. Communications in Computer and Information Science, vol 1259. Springer, Cham. https://doi.org/10.1007/978-3-030-54623-6_3
Print ISBN: 978-3-030-54622-9
Online ISBN: 978-3-030-54623-6