
Correntropy for Random Processes: Properties and Applications in Signal Processing

  • Puskal Pokharel
  • Ignacio Santamaria
  • Jianwu Xu
  • Kyu-hwa Jeong
  • Weifeng Liu
Chapter
Part of the Information Science and Statistics book series (ISS)

Abstract

The previous chapter defined cross-correntropy for a pair of scalar random variables and presented applications in statistical inference. This chapter extends the definition of correntropy to random (or stochastic) processes, which are families of random variables indexed by a set. In statistical signal processing the index set is time: we are interested in random variables that are functions of time, and the goal is to quantify their statistical dependencies (although the index set can also be defined over the inputs or channels of multivariate random variables). The autocorrelation function, which measures the statistical dependency between random variables at two different times, is conventionally used for this purpose; hence we generalize the definition of autocorrelation to an autocorrentropy function. The name correntropy was coined to reflect the fact that the function “looks like” correlation, but its sum over the lags (or over the dimensions of a multivariate random variable) is the information potential (i.e., the argument of Renyi’s quadratic entropy). The definition of cross-correntropy for random variables carries over to time series with a minor but important change: the domain of the variables is now a set of lags. When it is clear from the context, we simplify the terminology and refer to the different functions (autocorrentropy or cross-correntropy) simply as the correntropy function, but keep the word “function” to distinguish them from the quantities of Chapter 10.
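To make the autocorrentropy function concrete, the following is a minimal sketch (not code from the chapter) of how it can be estimated from a single realization of a discrete-time series, assuming a Gaussian kernel; the function names, the bandwidth parameter sigma, and the lag range are illustrative choices.

```python
import numpy as np

def gaussian_kernel(a, b, sigma):
    # Gaussian kernel k_sigma(a, b); its expected value over the process
    # defines correntropy.
    return np.exp(-((a - b) ** 2) / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def autocorrentropy(x, max_lag, sigma=1.0):
    # Sample estimate of V(m) = E[k_sigma(x_n, x_{n-m})]: for each lag m,
    # average the kernel evaluated over all pairs of samples m apart.
    x = np.asarray(x, dtype=float)
    n = len(x)
    return np.array([gaussian_kernel(x[m:], x[:n - m], sigma).mean()
                     for m in range(max_lag + 1)])

# Averaging V(m) over the lags yields an estimate of the information
# potential (the argument of Renyi's quadratic entropy), as noted above.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
V = autocorrentropy(x, max_lag=50, sigma=0.5)
print(V[:3], V.mean())
```

For a stationary process the estimate depends only on the lag m, mirroring the conventional autocorrelation function it generalizes; the kernel bandwidth sigma controls how much weight the measure places on higher-order statistics of the process.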

Keywords

Feature Space · Autocorrelation Function · Training Image · Synthetic Aperture Radar Image · Blind Source Separation


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Puskal Pokharel (1)
  • Ignacio Santamaria (1)
  • Jianwu Xu (1)
  • Kyu-hwa Jeong (1)
  • Weifeng Liu (1)

  1. Dept. Electrical Engineering & Biomedical Engineering, University of Florida, Gainesville, USA
