Correntropy for Random Variables: Properties and Applications in Statistical Inference
Similarity is a key concept in quantifying temporal signals or static measurements. Although similarity is difficult to define mathematically, one rarely dwells on this difficulty and naturally translates similarity into correlation. This is one more example of how ingrained second-order moment descriptors of the probability density function are in scientific thinking. Successful engineering or pattern recognition solutions built on these methodologies rely heavily on Gaussianity and linearity assumptions, for exactly the same reasons discussed in Chapter 3.
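To make the contrast with correlation concrete, correntropy between two random variables is commonly defined as the expected value of a kernel of their difference, V(X, Y) = E[κ_σ(X − Y)], and estimated from paired samples by a sample mean. Below is a minimal sketch of this standard sample estimator with a Gaussian kernel; the normalization of the kernel and the choice of bandwidth `sigma` vary across formulations, so treat the constants here as one common convention rather than a definitive implementation.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy estimate V(X, Y) ~ (1/N) sum_i G_sigma(x_i - y_i),
    using a normalized Gaussian kernel G_sigma of bandwidth sigma."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.mean(np.exp(-d**2 / (2.0 * sigma**2))) / (sigma * np.sqrt(2.0 * np.pi))
```

Unlike correlation, which depends only on second-order moments, this estimator weighs samples through the kernel, so it is sensitive to higher-order statistics of the difference X − Y; identical signals attain the kernel's maximum value, and the estimate decreases as the samples diverge.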
Keywords: Input Space, Additive White Gaussian Noise, Independent Component Analysis, Matched Filter, Reproducing Kernel Hilbert Space