Comparative Study of Distance Functions for Nearest Neighbors

Conference paper

DOI: 10.1007/978-90-481-3660-5_14

Cite this paper as:
Walters-Williams J., Li Y. (2010) Comparative Study of Distance Functions for Nearest Neighbors. In: Elleithy K. (eds) Advanced Techniques in Computing Sciences and Software Engineering. Springer, Dordrecht

Abstract

Many learning algorithms rely on distance metrics over their input data, and research has shown that a well-chosen metric can improve the performance of these algorithms. Over the years the Euclidean distance has been the most popular choice. In this paper, we investigate a number of distance metrics proposed by different communities, including the Mahalanobis, Euclidean, Kullback-Leibler and Hamming distances. Overall, the best-performing metric is the Mahalanobis distance.
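As a minimal illustration of the comparison described in the abstract (not the paper's actual experimental setup), the sketch below contrasts the Euclidean and Mahalanobis distances inside a simple 1-nearest-neighbor classifier. The data, function names, and parameters are hypothetical and chosen only to show how the Mahalanobis distance accounts for feature correlation via the inverse covariance matrix.

```python
# Illustrative sketch only: Euclidean vs. Mahalanobis distance in a
# 1-nearest-neighbor classifier. Data and names are hypothetical.
import numpy as np

def euclidean(x, y):
    """Standard Euclidean distance between two vectors."""
    return np.sqrt(np.sum((x - y) ** 2))

def mahalanobis(x, y, inv_cov):
    """Mahalanobis distance using the inverse covariance of the training data."""
    d = x - y
    return np.sqrt(d @ inv_cov @ d)

def nearest_neighbor(query, X_train, y_train, dist):
    """Return the label of the training point closest to the query."""
    dists = [dist(query, x) for x in X_train]
    return y_train[int(np.argmin(dists))]

# Toy, correlated 2-D data (hypothetical).
rng = np.random.default_rng(0)
X_train = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=50)
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
inv_cov = np.linalg.inv(np.cov(X_train, rowvar=False))

query = np.array([0.5, -0.2])
print(nearest_neighbor(query, X_train, y_train, euclidean))
print(nearest_neighbor(query, X_train, y_train,
                       lambda a, b: mahalanobis(a, b, inv_cov)))
```

Because the Mahalanobis distance rescales directions according to the data's covariance, the two metrics can disagree on which training point is "nearest" when features are correlated, which is the kind of difference the paper's comparison examines.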

Keywords

Kullback-Leibler distance · Euclidean distance · Mahalanobis distance · Manhattan distance · Hamming distance · Minkowski distance · Nearest Neighbor


Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. School of Computing and Information Technology, University of Technology, Kingston 6, Jamaica W.I.
  2. Department of Mathematics and Computing, Centre for Systems Biology, University of Southern Queensland, Toowoomba, Australia
