Correntropy for Random Variables: Properties and Applications in Statistical Inference

  • Weifeng Liu
  • Puskal Pokharel
  • Jianwu Xu
  • Sohan Seth
Part of the Information Science and Statistics book series (ISS)


Similarity is a key concept for quantifying temporal signals or static measurements. Similarity is difficult to define mathematically; however, one rarely dwells on this difficulty and naturally translates similarity into correlation. This is one more example of how ingrained second-order moment descriptors of the probability density function are in scientific thinking. Successful engineering and pattern recognition solutions built on these methodologies rely heavily on Gaussianity and linearity assumptions, for exactly the same reasons discussed in Chapter 3.
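
The chapter's central quantity, correntropy, replaces the second-order statistics behind correlation with the expected value of a kernel of the difference between two random variables, V_sigma(X, Y) = E[G_sigma(X - Y)]. The sketch below is illustrative only (not code from the chapter): it contrasts the ordinary sample correlation coefficient with the sample correntropy estimator V_sigma(X, Y) ≈ (1/N) Σ_i G_sigma(x_i - y_i) using a Gaussian kernel. The kernel width sigma, the omitted kernel normalization constant, and the impulsive-noise test case are assumptions made here for illustration.

```python
import numpy as np

def sample_correntropy(x, y, sigma=1.0):
    """Sample correntropy V_sigma(X, Y) = (1/N) * sum_i G_sigma(x_i - y_i),
    with the Gaussian kernel G_sigma(e) = exp(-e**2 / (2 * sigma**2))
    (normalization constant omitted).  Because the kernel expands into all
    even moments of the error e = x - y, correntropy is sensitive to more
    than second-order statistics and saturates for large (outlier) errors."""
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))

def sample_correlation(x, y):
    """Ordinary (second-order) sample correlation coefficient, for contrast."""
    return float(np.corrcoef(x, y)[0, 1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 2000
    x = rng.standard_normal(n)
    y = x + 0.1 * rng.standard_normal(n)                   # y is essentially x plus small noise
    y_imp = y.copy()
    y_imp[::100] += 50.0 * rng.standard_normal(n // 100)   # sparse impulsive outliers

    # Correlation collapses under impulsive noise; correntropy barely moves.
    print("correlation  clean / impulsive:",
          sample_correlation(x, y), sample_correlation(x, y_imp))
    print("correntropy  clean / impulsive:",
          sample_correntropy(x, y, sigma=1.0), sample_correntropy(x, y_imp, sigma=1.0))
```

Because the Gaussian kernel saturates for large arguments, the sparse outliers barely change the correntropy estimate while they collapse the correlation coefficient; this robustness to non-Gaussian disturbances is the kind of behavior that motivates the chapter.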


Keywords: Input Space; Additive White Gaussian Noise; Independent Component Analysis; Matched Filter; Reproducing Kernel Hilbert Space



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Weifeng Liu (1)
  • Puskal Pokharel (1)
  • Jianwu Xu (1)
  • Sohan Seth (1)

  1. Dept. of Electrical Engineering & Biomedical Engineering, University of Florida, Gainesville, USA
