Weighted Nearest Centroid Neighbourhood

  • Víctor Aceña
  • Javier M. Moguerza
  • Isaac Martín de Diego
  • Rubén R. Fernández
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11871)


A novel binary classifier based on nearest centroid neighbours is presented. The proposed method builds on the well-known idea behind the classic k-Nearest Neighbours (k-NN) algorithm: a point is similar to the points that are close to it. The new proposal relies on an alternative way of computing neighbourhoods that is better suited to the distribution of the data, reflecting the principle that a more distant neighbour should have less influence than a closer one. The relative importance of each neighbour in a neighbourhood is estimated by applying the SoftMax function to the underlying distances. Experiments are carried out on both simulated and real data sets. The proposed method outperforms the alternatives, opening a promising new research line.
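The two ingredients described in the abstract — a nearest centroid neighbourhood (each new neighbour is chosen so that the centroid of the selected neighbours stays as close as possible to the query) and SoftMax weights that shrink with distance — can be sketched as follows. This is an illustrative reconstruction of the general idea, not the authors' reference implementation; the function names (`ncn_indices`, `predict`) and the choice of `exp(-d)` as the SoftMax argument are assumptions.

```python
import numpy as np

def ncn_indices(X, q, k):
    """Greedy nearest centroid neighbour selection: each new neighbour
    minimises the distance from the query q to the centroid of the
    neighbours chosen so far (the first one is the ordinary nearest
    neighbour, since a single point is its own centroid)."""
    chosen, remaining = [], list(range(len(X)))
    for _ in range(k):
        best, best_d = None, np.inf
        for i in remaining:
            centroid = X[chosen + [i]].mean(axis=0)
            d = np.linalg.norm(q - centroid)
            if d < best_d:
                best, best_d = i, d
        chosen.append(best)
        remaining.remove(best)
    return chosen

def predict(X, y, q, k=3):
    """SoftMax-weighted vote over the NCN neighbourhood (binary labels 0/1):
    weights are a SoftMax over negative distances, so closer neighbours
    carry more influence."""
    idx = ncn_indices(X, q, k)
    d = np.linalg.norm(X[idx] - q, axis=1)
    w = np.exp(-d) / np.exp(-d).sum()
    return int(np.dot(w, y[idx]) >= 0.5)
```

Unlike plain k-NN, the greedy centroid criterion tends to pick neighbours spread around the query rather than clustered on one side, which is the property that makes the neighbourhood better suited to the local data distribution.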


Nearest Neighbours Classification · Nearest Centroid Neighbourhood · Parameter selection · Similarity measure



Research supported by grant from the Spanish Ministry of Economy and Competitiveness, under the Retos-Colaboración program: SABERMED (Ref: RTC-2017-6253-1); Retos-Investigación program: MODAS-IN (Ref: RTI2018-094269-B-I00); and the support of NVIDIA Corporation with the donation of the Titan V GPU.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Víctor Aceña (1)
  • Javier M. Moguerza (1)
  • Isaac Martín de Diego (1)
  • Rubén R. Fernández (1)

  1. Rey Juan Carlos University, Móstoles, Spain
