Deep Convolutional Neural Networks and Maximum-Likelihood Principle in Approximate Nearest Neighbor Search

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10255)

Abstract

Deep convolutional neural networks are widely used to extract high-dimensional features in various image recognition tasks. When the number of classes is relatively large, the runtime of a classifier operating on such features can be too high for real-time applications, e.g., video-based recognition. In this paper we propose a novel approximate nearest neighbor algorithm that sequentially chooses, as the next instance to check, the database item that maximizes the likelihood (joint density) of the distances to the previously checked instances. A Gaussian approximation of the distribution of the dissimilarity measure is used to estimate this likelihood. Experimental results for face identification on the LFW and YTF datasets are presented. The proposed algorithm is shown to be much faster than exhaustive search and several known approximate nearest neighbor methods.
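
The Python sketch below illustrates the general idea described above: after each distance computation, the next candidate is the yet-unchecked instance that maximizes the joint Gaussian likelihood of the distances observed so far. It is only a minimal sketch under stated assumptions; the function name ml_ann_search, the parameters n_checks and sigma, and the use of a fully precomputed pairwise-distance matrix are illustrative choices, not the authors' implementation.

```python
import numpy as np


def ml_ann_search(query, database, pairwise_dist, n_checks=32, sigma=0.1):
    """Sketch of a maximum-likelihood approximate NN search (illustrative only).

    database:      (N, d) array of reference feature vectors.
    pairwise_dist: (N, N) matrix of precomputed distances between references.
    n_checks:      budget of distance computations (assumed parameter).
    sigma:         assumed std. dev. of the Gaussian approximation of the
                   dissimilarity distribution (a tuning parameter here).
    """
    n = len(database)
    unchecked = set(range(n))
    checked, observed = [], []            # checked indices and their query distances

    candidate = int(np.random.randint(n)) # start from an arbitrary pivot
    best_idx, best_dist = None, np.inf

    for _ in range(n_checks):
        # Compute the distance from the query to the current candidate.
        d_q = float(np.linalg.norm(query - database[candidate]))
        if d_q < best_dist:
            best_idx, best_dist = candidate, d_q
        unchecked.discard(candidate)
        checked.append(candidate)
        observed.append(d_q)
        if not unchecked:
            break

        # For every unchecked instance v, score the hypothesis "v is the nearest
        # neighbor" by the joint Gaussian log-likelihood of the observed query
        # distances, each modeled as Gaussian around the known distance
        # between v and the corresponding checked instance.
        obs = np.array(observed)
        rest = np.fromiter(unchecked, dtype=int)
        log_lik = -np.sum((pairwise_dist[rest][:, checked] - obs) ** 2, axis=1) / (2 * sigma ** 2)
        candidate = int(rest[np.argmax(log_lik)])

    return best_idx, best_dist


if __name__ == "__main__":
    # Tiny synthetic example (illustration only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64)).astype(np.float32)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    q = rng.normal(size=64).astype(np.float32)
    print(ml_ann_search(q, X, D, n_checks=40))
```

In practice the termination rule and the variance of the Gaussian approximation would follow the paper rather than the fixed budget and sigma assumed here.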

Keywords

Statistical pattern recognition · Approximate nearest neighbor · Image recognition · Deep learning · Convolutional neural networks

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Laboratory of Algorithms and Technologies for Network Analysis, National Research University Higher School of Economics, Nizhny Novgorod, Russia
