
Correntropy for Random Processes: Properties and Applications in Signal Processing

  • Chapter in Information Theoretic Learning

Part of the book series: Information Science and Statistics (ISS)


Abstract

The previous chapter defined cross-correntropy for a pair of scalar random variables and presented applications in statistical inference. This chapter extends the definition of correntropy to random (or stochastic) processes, that is, families of random variables indexed by a set. In statistical signal processing the index set is time: we are interested in random variables that are functions of time, and the goal is to quantify their statistical dependencies (although the index set can also range over the inputs or channels of a multivariate random variable). The autocorrelation function, which measures the statistical dependency between random variables at two different times, is conventionally used for this purpose; hence, we generalize the definition of autocorrelation to an autocorrentropy function. The name correntropy was coined to reflect the fact that the function “looks like” correlation, but the sum over the lags (or over the dimensions of the multivariate random variable) is the information potential (i.e., the argument of Rényi’s quadratic entropy). The definition of cross-correntropy for random variables carries over to time series with a minor but important change: the domain of the variables is now an index set of lags. When it is clear from the context, we simplify the terminology and refer to the different functions (autocorrentropy or cross-correntropy) simply as the correntropy function, keeping the word “function” to distinguish them from the quantities of Chapter 10.
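
The chapter's central object can be made concrete with a small sketch. For a stationary process x(n), the autocorrentropy function at lag m is V_σ(m) = E[κ_σ(x(n) − x(n−m))], where κ_σ is a kernel (typically Gaussian) of width σ; as the abstract notes, summing V_σ(m) over the lags yields the information potential, the argument of Rényi's quadratic entropy. The Python sketch below estimates V_σ(m) from samples. It is a minimal illustration written for this summary, not code from the book; the Gaussian kernel choice, the kernel width, the toy signal, and all function names are assumptions.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel: kappa_sigma(u) = exp(-u^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)."""
    return np.exp(-(u ** 2) / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def autocorrentropy(x, max_lag, sigma):
    """Sample estimate of the autocorrentropy function V(m) for lags m = 0..max_lag.

    V(m) is the average of kappa_sigma(x[n] - x[n-m]) over the N - m available
    sample pairs; stationarity is assumed so that V depends only on the lag m.
    """
    x = np.asarray(x, dtype=float)
    N = x.size
    V = np.empty(max_lag + 1)
    for m in range(max_lag + 1):
        # All sample pairs separated by exactly m time steps.
        V[m] = gaussian_kernel(x[m:] - x[:N - m], sigma).mean()
    return V

# Toy usage on a hypothetical signal: a noisy sinusoid. The mean of V over the
# lags estimates the information potential mentioned in the abstract.
rng = np.random.default_rng(0)
n = np.arange(512)
x = np.sin(2 * np.pi * 0.05 * n) + 0.3 * rng.standard_normal(n.size)
V = autocorrentropy(x, max_lag=60, sigma=0.5)
print("V at lags 0..4:", np.round(V[:5], 4))
print("information potential estimate:", round(V.mean(), 4))
```

For σ large relative to the signal's dynamic range, V_σ(m) behaves like a shifted and scaled version of the conventional autocorrelation, while small σ makes the estimate sensitive to higher-order statistics of the process.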




Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Pokharel, P., Santamaria, I., Xu, J., Jeong, K.H., Liu, W. (2010). Correntropy for Random Processes: Properties and Applications in Signal Processing. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_11


  • DOI: https://doi.org/10.1007/978-1-4419-1570-2_11


  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-1569-6

  • Online ISBN: 978-1-4419-1570-2

  • eBook Packages: Computer Science, Computer Science (R0)
