
Performance Analysis

  • S. Uṇṇikrishṇa Pillai
  • C. S. Burrus
Part of the Signal Processing and Digital Filtering book series (SIGNAL PROCESS)

Abstract

So far we have assumed that exact ensemble averages of the array output covariances (or mean values) are available, and on that basis several conventional and high resolution techniques have been developed to resolve the directions of arrival of incoming signals. In this chapter we analyze, from a statistical point of view, the performance of these methods when only a finite set of observations is available, and establish several results. When the ensemble averages are not available, they are usually estimated from the available array output data. In general, only a finite data sample is available, and the unknown covariances of interest are estimated so that the resulting estimators represent their “most likely” values. The principle of maximum likelihood (ML) is often chosen for this purpose in a variety of estimation and hypothesis testing problems [1]. As the name implies, the unknowns are selected so as to maximize (the logarithm of) the likelihood function, which is (the logarithm of) the joint probability density function of the observations.
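For concreteness, the following is a minimal sketch (not part of the original text; the function name, array dimensions, and simulated data are illustrative) of the ML covariance estimate that arises when the array snapshots are modeled as independent, zero-mean, complex Gaussian vectors with an unstructured covariance matrix [5]: maximizing the log-likelihood over that covariance yields the sample covariance of the observed data.

    import numpy as np

    def sample_covariance(snapshots):
        # ML estimate of the array output covariance under a zero-mean,
        # independent, complex Gaussian snapshot model with an unstructured
        # (Hermitian positive-definite) covariance matrix.
        # snapshots: complex array of shape (M, N) holding N array output
        # vectors, each of dimension M (number of sensors).
        _, N = snapshots.shape
        return snapshots @ snapshots.conj().T / N

    # Illustrative use with simulated, spatially white snapshots
    # (dimensions and data are arbitrary, for demonstration only).
    rng = np.random.default_rng(0)
    M, N = 6, 200
    x = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
    R_hat = sample_covariance(x)
    print(np.allclose(R_hat, R_hat.conj().T))  # the estimate is Hermitian

When the exact ensemble averages are replaced by such finite-sample estimates, the direction finding methods of the earlier chapters operate on the estimated covariance rather than the true one; the statistical behavior of that substitution is what this chapter examines.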

Keywords

Angular Separation, Joint Probability Density Function, Signal Subspace, Arrival Angle, Favorable Configuration


References

  [1] H. L. Van Trees, Detection, Estimation, and Modulation Theory, Part I. New York: John Wiley and Sons, 1968.
  [2] V. K. Rohatgi, An Introduction to Probability Theory and Mathematical Statistics. New York: John Wiley and Sons, 1976.
  [3] P. J. Bickel and K. A. Doksum, Mathematical Statistics. San Francisco, CA: Holden-Day, 1977.
  [4] T. W. Anderson, An Introduction to Multivariate Statistical Analysis, 2nd ed. New York: John Wiley and Sons, 1984.
  [5] N. R. Goodman, “Statistical analysis based on a certain multivariate complex Gaussian distribution (an introduction),” Ann. Math. Stat., vol. 34, pp. 152–177, 1963.
  [6] A. B. Baggeroer, “Confidence intervals for regression (MEM) spectral estimates,” IEEE Trans. Inform. Theory, vol. IT-22, pp. 534–545, Sept. 1976.
  [7] R. P. Gupta, “Asymptotic theory for principal component analysis in the complex case,” J. Indian Stat. Assoc., vol. 3, pp. 97–106, 1965.
  [8] R. A. Monzingo and T. W. Miller, Introduction to Adaptive Arrays. New York: John Wiley and Sons, 1980.
  [9] I. S. Reed, “On a moment theorem for complex Gaussian processes,” IRE Trans. Inform. Theory, pp. 194–195, Apr. 1962.
  [10] M. Kaveh and A. J. Barabell, “The statistical performance of the MUSIC and the minimum-norm algorithms in resolving plane waves in noise,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-34, pp. 331–341, Apr. 1986.
  [11] S. U. Pillai and B. H. Kwon, “Performance analysis of eigenvector-based high resolution estimators for direction finding in correlated and coherent scenes,” in ONR Annual Report, Polytechnic University, June 1988.
  [12] M. Okamoto, “Distinctness of the eigenvalues of a quadratic form in a multivariate sample,” Annals of Statistics, vol. 1, pp. 763–765, 1973.
  [13] T. W. Anderson, “Asymptotic theory for principal component analysis,” Ann. Math. Stat., vol. 34, pp. 122–148, 1963.
  [14] D. N. Lawley, “Estimation in factor analysis under various initial assumptions,” British Journal of Statistical Psychology, vol. II, pp. 1–12, 1958.
  [15] S. N. Roy, Some Aspects of Multivariate Analysis. New York: John Wiley and Sons, 1957.
  [16] D. R. Cox and D. V. Hinkley, Theoretical Statistics. London, England: Chapman and Hall, 1974.
  [17] K. V. Mardia, J. T. Kent, and J. M. Bibby, Multivariate Analysis. New York: Academic, 1979.
  [18] D. N. Simkins, “Multichannel angle of arrival estimation,” Ph.D. dissertation, Stanford Univ., Stanford, CA, 1980.
  [19] R. O. Schmidt, “A signal subspace approach to emitter location and spectral estimation,” Ph.D. dissertation, Stanford Univ., Stanford, CA, Nov. 1981.
  [20] G. Bienvenu and L. Kopp, “Optimality of high resolution array processing using the eigensystem approach,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-31, pp. 1235–1248, Oct. 1983.
  [21] A. M. Bruckstein, T. J. Shan, and T. Kailath, “The resolution of overlapping echoes,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-33, pp. 1357–1367, Dec. 1985.
  [22] H. Akaike, “A new look at the statistical model identification,” IEEE Trans. Automat. Contr., vol. AC-19, pp. 716–723, Dec. 1974.
  [23] J. Rissanen, “Modeling by shortest data description,” Automatica, vol. 14, pp. 465–471, 1978.
  [24] S. Kullback, J. C. Keegel, and J. H. Kullback, Topics in Statistical Information Theory. New York: Springer-Verlag, 1987.
  [25] M. Wax and T. Kailath, “Detection of signals by information theoretic criteria,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-33, pp. 387–392, Apr. 1985.
  [26] L. C. Zhao, P. R. Krishnaiah, and Z. D. Bai, “On detection of the number of signals in presence of white noise,” J. Multivariate Anal., vol. 20, pp. 1–25, 1986.
  [27] Y. Q. Yin and P. R. Krishnaiah, “On some nonparametric methods for detection of the number of signals,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-35, pp. 1533–1538, Nov. 1987.
  [28] A. Kshirsagar, Multivariate Analysis. New York: Marcel Dekker, 1972.
  [29] C. R. Rao, Linear Statistical Inference and its Applications, 2nd ed. New York: John Wiley and Sons, 1973.
  [30] Y. Lee, “Direction finding from first order statistics and spatial spectrum estimation,” Ph.D. dissertation, Polytechnic Univ., Brooklyn, NY, 1988.
  [31] I. S. Reed, J. D. Mallett, and L. E. Brennan, “Rapid convergence rate in adaptive arrays,” IEEE Trans. Aerosp. Electron. Syst., vol. AES-10, pp. 853–863, Nov. 1974.
  [32] R. C. Hanumara, “An alternate derivation of the distribution of the conditioned signal-to-noise ratio,” IEEE Trans. Antennas Propag., vol. AP-34, pp. 463–464, Mar. 1986.
  [33] C. G. Khatri and C. R. Rao, “Effects of estimated noise covariance matrix in optimal signal detection,” IEEE Trans. Acoust., Speech, Signal Processing, vol. ASSP-35, pp. 671–679, May 1987.
  [34] S. U. Pillai and B. H. Kwon, “GEESE (GEneralized Eigenvalues utilizing Signal subspace Eigenvectors) — a new technique for direction finding,” in Proc. Twenty-Second Annual Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, Oct. 31 – Nov. 2, 1988.
  [35] B. H. Kwon, “New high resolution techniques and their performance analysis for angle-of-arrival estimation,” Ph.D. dissertation, Polytechnic Univ., Brooklyn, NY, 1989.

Copyright information

© Springer-Verlag New York Inc. 1989

Authors and Affiliations

  • S. Uṇṇikrishṇa Pillai (1)
  • C. S. Burrus (2)
  1. Department of Electrical Engineering and Computer Science, Polytechnic University, Brooklyn, NY, USA
  2. Department of Electrical and Computer Engineering, Rice University, Houston, TX, USA
