Local Fisher Discriminant Component Hashing for Fast Nearest Neighbor Classification

  • Tomoyuki Shibata
  • Osamu Yamaguchi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5342)

Abstract

This paper presents a novel approximate nearest neighbor classification scheme, Local Fisher Discriminant Component Hashing (LFDCH). Nearest neighbor (NN) classification is a popular technique in pattern recognition, but its classification speed degrades badly in high-dimensional spaces. To accelerate NN classification, Principal Component Hashing (PCH) has been proposed; it searches for NN patterns in a low-dimensional eigenspace using a hash algorithm. With PCH, however, it is difficult to achieve accuracy and computational efficiency simultaneously, because the eigenspace is not necessarily the optimal subspace for classification. Our scheme, LFDCH, instead uses Local Fisher Discriminant Analysis (LFDA) to construct a discriminative subspace, attaining both accuracy and computational efficiency in NN classification. Experiments confirmed that LFDCH classifies faster and more accurately than methods based on PCH or ordinary NN search.
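
The pipeline the abstract outlines (project patterns into a discriminative subspace learned by LFDA, bucket the projected training patterns with per-component hash tables, and run exact NN search only over the retrieved candidates) can be sketched in a few lines of Python. The sketch below is a hypothetical illustration, not the published LFDCH algorithm: scikit-learn's LinearDiscriminantAnalysis stands in for LFDA, the per-axis quantile binning and bucket-intersection retrieval are assumptions, and the names ComponentHash and n_bins are illustrative.

```python
# Minimal sketch of the component-hashing idea summarized above; NOT the
# authors' LFDCH algorithm. Hypothetical choices: scikit-learn's
# LinearDiscriminantAnalysis stands in for LFDA, each projected axis is
# quantized into equal-frequency (quantile) bins, and the candidate set
# is the intersection of the per-axis hash buckets.
import numpy as np
from collections import defaultdict
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


class ComponentHash:
    def __init__(self, n_components=2, n_bins=16):
        self.proj = LinearDiscriminantAnalysis(n_components=n_components)
        self.n_bins = n_bins

    def fit(self, X, y):
        # Project training patterns into the discriminative subspace.
        Z = self.proj.fit_transform(X, y)
        # Per-axis quantile bin edges and one hash table per axis.
        self.edges = [np.quantile(Z[:, d], np.linspace(0, 1, self.n_bins + 1))
                      for d in range(Z.shape[1])]
        self.tables = [defaultdict(list) for _ in range(Z.shape[1])]
        for i, z in enumerate(Z):
            for d, v in enumerate(z):
                self.tables[d][self._bin(d, v)].append(i)
        self.Z, self.y = Z, np.asarray(y)
        return self

    def _bin(self, d, v):
        # Map a coordinate to its bin index along axis d.
        return int(np.clip(np.searchsorted(self.edges[d], v) - 1,
                           0, self.n_bins - 1))

    def classify(self, x):
        z = self.proj.transform(np.asarray(x).reshape(1, -1))[0]
        # Candidates: training points sharing every per-axis bucket with z.
        cand = set(self.tables[0][self._bin(0, z[0])])
        for d in range(1, len(z)):
            cand &= set(self.tables[d][self._bin(d, z[d])])
        if not cand:                 # empty bucket: fall back to full search
            cand = set(range(len(self.Z)))
        idx = np.fromiter(cand, dtype=int)
        # Exact 1-NN among the candidates, in the projected space.
        nn = idx[np.argmin(((self.Z[idx] - z) ** 2).sum(axis=1))]
        return self.y[nn]


if __name__ == "__main__":
    from sklearn.datasets import load_iris
    X, y = load_iris(return_X_y=True)
    clf = ComponentHash(n_components=2, n_bins=8).fit(X, y)
    print(clf.classify(X[0]))  # expected: class 0
```

A production variant would also probe neighboring buckets, as approximate-NN hashing schemes typically do, so that near misses at bin boundaries do not exclude the true nearest neighbor.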

Keywords

approximate nearest neighbor classification · high-dimensional space · hash · dimensionality reduction

References

  1. Cover, T.M., Hart, P.E.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory IT-13(1), 21–27 (1967)
  2. Arya, S., Mount, D.M., Netanyahu, N.S., Silverman, R., Wu, A.Y.: An optimal algorithm for approximate nearest neighbor searching. Journal of the ACM 45, 891–923 (1998)
  3. Matsushita, Y., Wada, T.: Principal Component Hashing. In: Meeting on Image Recognition and Understanding (MIRU 2007), pp. 127–134 (July 2007) (in Japanese)
  4. Hotelling, H.: Analysis of a complex of statistical variables into principal components. Journal of Educational Psychology 24, 417–441 (1933)
  5. Indyk, P., Motwani, R.: Approximate Nearest Neighbors: Towards Removing the Curse of Dimensionality. In: Proceedings of the 30th ACM Symposium on Theory of Computing (STOC 1998), pp. 604–613 (May 1998)
  6. Datar, M., Indyk, P., Immorlica, N., Mirrokni, V.: Locality-Sensitive Hashing Scheme Based on p-Stable Distributions. In: Proceedings of the 20th Annual Symposium on Computational Geometry (SCG 2004) (June 2004)
  7. Sugiyama, M.: Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis. Journal of Machine Learning Research 8, 1027–1061 (2007)
  8. Fisher, R.A.: The Use of Multiple Measurements in Taxonomic Problems. Annals of Eugenics 7(2), 179–188 (1936)
  9. Kozakaya, T., Yamaguchi, O.: Face Recognition by Projection-based 3D Normalization and Shading Subspace Orthogonalization. In: Proc. of the International Conference on Automatic Face and Gesture Recognition (FGR 2006), pp. 163–168 (2006)
  10. Vidal, E.: An algorithm for finding nearest neighbors in (approximately) constant average time. Pattern Recognition Letters 4, 145–158 (1986)
  11. Micó, L., Oncina, J., Vidal, E.: A new version of the nearest-neighbor approximating and eliminating search algorithm (AESA) with linear preprocessing time and memory requirements. Pattern Recognition Letters 15, 9–17 (1994)
  12. Brin, S.: Near neighbor search in large metric spaces. In: Proc. of the 21st Conference on Very Large Data Bases (VLDB 1995), Zurich, Switzerland, pp. 574–584 (1995)
  13. Yianilos, P.N.: Data structures and algorithms for nearest neighbor search in general metric spaces. In: Proc. of the Fourth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 1993), Austin, TX, pp. 311–321 (1993)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Tomoyuki Shibata 1
  • Osamu Yamaguchi 1
  1. Corporate Research and Development Center, Toshiba Corporation, Kawasaki, Japan