
Relaxed Exponential Kernels for Unsupervised Learning

  • Conference paper
Pattern Recognition (DAGM 2011)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 6835)

Abstract

Many unsupervised learning algorithms make use of kernels that rely on the Euclidean distance between two samples. The Euclidean distance, however, is optimal only for Gaussian-distributed data. In this paper, we relax this global Gaussian assumption and instead propose a local Gaussian model for the immediate neighbourhood of each sample, resulting in an augmented data space formed by the parameters of the local Gaussians. To this end, we propose a convolution kernel for the augmented data space. The factorisable nature of this kernel allows us to introduce (semi-)metrics for this space, which in turn yield relaxed versions of known kernels. We present empirical results that validate the utility of the proposed localized approach in the context of spectral clustering. The key result of this paper is that combining the local Gaussian model with measures that adhere to metric properties yields much better performance across different spectral clustering tasks.
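As a rough illustration of the idea described in the abstract (not the authors' exact construction), the sketch below fits a local Gaussian to each sample's k-nearest-neighbour set, compares the local Gaussians with the closed-form symmetrised KL divergence, turns the result into a "relaxed" exponential affinity matrix, and bipartitions it spectrally via the Fiedler vector of the normalised graph Laplacian. All function names, the choice of divergence, and the parameter values (k, sigma) are assumptions made for illustration only.

```python
import numpy as np

def local_gaussians(X, k=10):
    """Fit a Gaussian (mean, covariance) to each sample's k-NN neighbourhood."""
    n, d = X.shape
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    means, covs = [], []
    for i in range(n):
        idx = np.argsort(D[i])[:k]          # k nearest neighbours (incl. self)
        nb = X[idx]
        means.append(nb.mean(axis=0))
        covs.append(np.cov(nb, rowvar=False) + 1e-6 * np.eye(d))  # regularised
    return np.array(means), np.array(covs)

def skl_divergence(m1, S1, m2, S2):
    """Symmetrised KL divergence between two Gaussians (closed form)."""
    S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
    dm = m1 - m2
    return 0.5 * (np.trace(S2i @ S1) + np.trace(S1i @ S2)
                  + dm @ (S1i + S2i) @ dm) - m1.size

def relaxed_kernel_matrix(X, k=10, sigma=1.0):
    """Exponential affinity on the augmented space of local Gaussians."""
    mus, Ss = local_gaussians(X, k)
    n = len(X)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            d2 = skl_divergence(mus[i], Ss[i], mus[j], Ss[j])
            K[i, j] = K[j, i] = np.exp(-d2 / (2 * sigma ** 2))
    return K

def spectral_bipartition(K):
    """Two-way spectral cut: sign of the Fiedler vector of the normalised Laplacian."""
    d = K.sum(axis=1)
    Dm = np.diag(d ** -0.5)
    L = np.eye(len(K)) - Dm @ K @ Dm
    _, vecs = np.linalg.eigh(L)             # eigenvalues in ascending order
    return (vecs[:, 1] > 0).astype(int)     # second-smallest eigenvector
```

Note that the symmetrised KL divergence is used here purely for its simple closed form; it is not a metric, and the paper's point is precisely that dissimilarity measures respecting metric properties behave better in such relaxed exponential kernels.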




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abou-Moustafa, K., Shah, M., De La Torre, F., Ferrie, F. (2011). Relaxed Exponential Kernels for Unsupervised Learning. In: Mester, R., Felsberg, M. (eds) Pattern Recognition. DAGM 2011. Lecture Notes in Computer Science, vol 6835. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23123-0_19

  • DOI: https://doi.org/10.1007/978-3-642-23123-0_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23122-3

  • Online ISBN: 978-3-642-23123-0

  • eBook Packages: Computer Science (R0)
