
METRON

pp 1–10

A recursive approach for determining matrix inverses as applied to causal time series processes

  • Serge B. Provost
  • John N. Haddad
Article

Abstract

A decomposition of a certain type of positive definite quadratic form in correlated normal random variables is obtained by successively applying blockwise inversion to the leading submatrices of a symmetric positive definite matrix. This result can be used to determine Mahalanobis-type distances and allows the full likelihood function to be calculated when the observations obtained from certain causal processes are irregularly spaced or incomplete. Applications to some autoregressive moving-average models are pointed out, and an illustrative numerical example is presented.
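
To make the recursion concrete, the following minimal sketch applies blockwise (Schur-complement) inversion to the leading submatrices of a positive definite covariance matrix and accumulates the corresponding decomposition of the quadratic form into squared standardized innovations. It assumes NumPy; the function name, the AR(1) covariance and the data used in the illustration are hypothetical choices made for this sketch and do not reproduce the authors' implementation.

import numpy as np

def blockwise_inverse_and_quadratic_form(sigma, x):
    # Recursively invert the leading submatrices of the symmetric positive
    # definite matrix `sigma` via the Schur complement, accumulating the
    # decomposition of the quadratic form x' sigma^{-1} x into a sum of
    # squared standardized innovations (a Mahalanobis-type distance).
    n = sigma.shape[0]
    inv = np.array([[1.0 / sigma[0, 0]]])            # inverse of the 1 x 1 leading block
    terms = [float(x[0] ** 2 / sigma[0, 0])]         # first term of the decomposition
    for k in range(1, n):
        b = sigma[:k, k:k + 1]                       # border column of the (k+1) x (k+1) block
        u = inv @ b                                  # A_k^{-1} b, shape (k, 1)
        s = float(sigma[k, k] - b[:, 0] @ u[:, 0])   # Schur complement of A_k in A_{k+1}
        resid = float(x[k] - u[:, 0] @ x[:k])        # innovation of x_k given x_1, ..., x_{k-1}
        terms.append(resid ** 2 / s)
        inv = np.block([[inv + (u @ u.T) / s, -u / s],
                        [-u.T / s, np.array([[1.0 / s]])]])
    return inv, terms                                # sum(terms) equals x' sigma^{-1} x

# Illustration with the covariance of an AR(1) process observed at irregularly
# spaced time points (phi, the lags and the data are made up for this sketch).
phi = 0.6
lags = np.array([0, 1, 3, 4, 7])
sigma = phi ** np.abs(lags[:, None] - lags[None, :]) / (1.0 - phi ** 2)
x = np.array([0.3, -1.1, 0.4, 0.9, -0.2])
inv, terms = blockwise_inverse_and_quadratic_form(sigma, x)
assert np.allclose(inv, np.linalg.inv(sigma))
assert np.isclose(sum(terms), x @ np.linalg.inv(sigma) @ x)

Each recursion step reuses the previously computed inverse and requires only the border column of the enlarged block, at a cost of order k^2 operations, which is what makes a step-by-step evaluation of Mahalanobis-type distances and likelihood functions convenient.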

Keywords

Matrix inverse · Quadratic forms · Mahalanobis distance · Craig’s theorem · Likelihood function · ARMA processes

Mathematics Subject Classification

Primary: 62M10, 15A09; Secondary: 15A63, 15B05

Acknowledgements

We would like to express our sincere thanks to both referees for their thorough reviews, insightful comments and valuable suggestions. The financial support of the Natural Sciences and Engineering Research Council of Canada is gratefully acknowledged by the first author.

Copyright information

© Sapienza Università di Roma 2019

Authors and Affiliations

  1. Department of Statistical and Actuarial Sciences, The University of Western Ontario, London, Canada
  2. Department of Mathematics and Statistics, Notre Dame University-Louaize, Zouk Mosbeh, Lebanon
