The Doubly Adaptive LASSO for Vector Autoregressive Models

  • Zi Zhen Liu
  • Reg Kulperger
  • Hao Yu
Chapter
Part of the Fields Institute Communications book series (FIC, volume 78)

Abstract

The LASSO (Tibshirani, J R Stat Soc Ser B 58(1):267–288, 1996, [30]) and the adaptive LASSO (Zou, J Am Stat Assoc 101:1418–1429, 2006, [37]) are popular in regression analysis for their ability to perform variable selection and parameter estimation simultaneously, and both have been applied to autoregressive time series models. We propose the doubly adaptive LASSO (daLASSO), or PLAC-weighted adaptive LASSO, for modelling stationary vector autoregressive processes. The procedure is doubly adaptive in the sense that its adaptive weights are formulated as functions of two quantities: the norms of the partial lag autocorrelation matrices (Heyse, 1985, [17]) and the Yule–Walker or ordinary least squares estimates of the vector time series. Existing LASSO-based approaches ignore the partial lag autocorrelation information inherent in a VAR process. The proposed procedure shows promising results for VAR models and excels in particular at identifying the VAR lag order.
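The two-stage weighting idea described above can be illustrated with a minimal sketch. The code below simulates a bivariate VAR(1), obtains first-stage OLS estimates, forms doubly adaptive weights as a lag-level factor times an entry-level factor, and solves the weighted LASSO by coordinate descent. Note the assumptions: the paper builds the lag-level factor from norms of the partial lag autocorrelation (PLAC) matrices, whereas this sketch substitutes the Frobenius norm of each OLS lag block; the exponents `gamma1`, `gamma2` and the penalty level `alpha` are illustrative choices, not the authors' tuning.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Simulate a bivariate VAR(1): y_t = A1 y_{t-1} + e_t ---
k, T, P = 2, 500, 3                      # dimension, sample length, max candidate lag
A1 = np.array([[0.5, 0.1],
               [0.0, 0.4]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A1 @ y[t - 1] + rng.normal(scale=0.5, size=k)

# Stacked regression Y = X B + E, with X holding lags 1..P
X = np.hstack([y[P - l:T - l] for l in range(1, P + 1)])   # shape (T-P, k*P)
Y = y[P:]

# Stage 1: OLS estimates of all lag-coefficient blocks
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)              # shape (k*P, k)

# Doubly adaptive weights: a lag-level factor times an entry-level factor.
# ASSUMPTION: the paper derives the lag-level factor from PLAC-matrix norms;
# here the Frobenius norm of each OLS lag block is used as a stand-in.
gamma1, gamma2 = 1.0, 1.0
lag_norm = np.array([np.linalg.norm(B_ols[l * k:(l + 1) * k]) for l in range(P)])
W = np.repeat(1.0 / lag_norm ** gamma1, k)[:, None] / np.abs(B_ols) ** gamma2

def weighted_lasso(X, y, w, alpha, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + alpha * sum_j w_j |b_j|."""
    n, p = X.shape
    b = np.zeros(p)
    z = (X * X).sum(axis=0) / n           # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual correlation, then soft-threshold with weight w_j
            rho = X[:, j] @ (y - X @ b + X[:, j] * b[j]) / n
            b[j] = np.sign(rho) * max(abs(rho) - alpha * w[j], 0.0) / z[j]
    return b

# Stage 2: fit each of the k equations with its own column of weights
alpha = 0.01                              # penalty level (illustrative choice)
B_hat = np.column_stack([weighted_lasso(X, Y[:, j], W[:, j], alpha)
                         for j in range(k)])

print(np.round(B_hat.T, 2))               # nonzero entries should concentrate in lag 1
```

Because spurious lags receive very large weights (small PLAC/OLS norms inflate both factors), their coefficients are thresholded to exactly zero, which is the mechanism behind the lag-order identification emphasized in the abstract.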

Keywords

Adaptive LASSO · Asymptotic normality · Estimation consistency · LASSO · Oracle property · Doubly adaptive LASSO · PLAC-weighted adaptive LASSO · Selection consistency · VAR · VAR time series · Vector autoregressive processes

Mathematics Subject Classifications (2010)

62E20 62F10 62F12 62H12 62J07 62M10 

Notes

Acknowledgements

We sincerely thank two anonymous referees for their valuable comments and suggestions that we have adopted to improve this manuscript greatly.

References

  1. Akaike, H. (1969). Fitting autoregressive models for prediction. Annals of the Institute of Statistical Mathematics, 21, 243–247.
  2. Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, AC-19, 716–723.
  3. Akaike, H. (1978). A Bayesian analysis of the minimum AIC procedure. Annals of the Institute of Statistical Mathematics, 30(Part A), 9–14.
  4. Billingsley, P. (1961). The Lindeberg–Lévy theorem for martingales. Proceedings of the American Mathematical Society, 12, 788–792.
  5. Caner, M., & Knight, K. (2013). An alternative to unit root tests: Bridge estimators differentiate between nonstationary versus stationary models and select optimal lag. Journal of Statistical Planning and Inference, 143, 691–715.
  6. Chand, S. (2011). Goodness of fit and lasso variable selection in time series analysis. Ph.D. thesis, University of Nottingham.
  7. Chen, K., & Chan, K. (2011). Subset ARMA selection via the adaptive Lasso. Statistics and Its Interface, 4, 197–205.
  8. Donoho, D. L., Elad, M., & Temlyakov, V. N. (2006). Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Transactions on Information Theory, 52(1), 6–18.
  9. Durbin, J. (1960). The fitting of time series models. Review of the International Statistical Institute, 28, 233–244.
  10. Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. (2004). Least angle regression. Annals of Statistics, 32(2), 407–499.
  11. Fan, J., & Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96, 1348–1360.
  12. Fujita, A., Sato, J. R., Garay-Malpartida, H. M., Yamaguchi, R., Miyano, S., Sogayar, M. C., et al. (2007). Modeling gene expression regulatory networks with the sparse vector autoregressive model. BMC Systems Biology, 1, 39.
  13. Geyer, C. (1994). On the asymptotics of constrained M-estimation. The Annals of Statistics, 22, 1993–2010.
  14. Hannan, E. J. (1970). Multiple time series. New York: Wiley.
  15. Hannan, E. J., & Quinn, B. G. (1979). The determination of the order of an autoregression. Journal of the Royal Statistical Society, Series B, 41, 190–195.
  16. Haufe, S., Müller, K.-R., Nolte, G., & Krämer, N. (2008). Sparse causal discovery in multivariate time series. In JMLR: Workshop and Conference Proceedings (Vol. 1, pp. 1–16).
  17. Heyse, J. F. (1985). Partial lag autocorrelation and partial process autocorrelation for vector time series, with applications. Ph.D. dissertation, Temple University.
  18. Hsu, N., Hung, H., & Chang, Y. (2008). Subset selection for vector autoregressive processes using LASSO. Computational Statistics and Data Analysis, 52, 3645–3657.
  19. Knight, K., & Fu, W. (2000). Asymptotics for LASSO-type estimators. The Annals of Statistics, 28, 1356–1378.
  20. Kock, A. B. (2012). On the oracle property of the adaptive lasso in stationary and nonstationary autoregressions. CREATES Research Paper 2012-05, Aarhus University.
  21. Kock, A. B., & Callot, L. A. F. (2012). Oracle inequalities for high dimensional vector autoregressions. CREATES Research Paper 2012-12, Aarhus University.
  22. Liu, Z. Z. (2014). The doubly adaptive LASSO methods for time series analysis. University of Western Ontario, Electronic Thesis and Dissertation Repository, Paper 2321.
  23. Lütkepohl, H. (2006). New introduction to multiple time series analysis. Berlin: Springer.
  24. Medeiros, M. C., & Mendes, E. F. (2012). Estimating high-dimensional time series models. CREATES Research Paper 2012-37.
  25. Nardi, Y., & Rinaldo, A. (2011). Autoregressive process modeling via the LASSO procedure. Journal of Multivariate Analysis, 102(3), 528–549.
  26. Park, H., & Sakaori, F. (2013). Lag weighted lasso for time series model. Computational Statistics, 28, 493–504.
  27. Ren, Y., & Zhang, X. (2010). Subset selection for vector autoregressive processes via the adaptive LASSO. Statistics and Probability Letters, 80, 1705–1712.
  28. Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics, 6(2), 461–464.
  29. Song, S., & Bickel, P. J. (2011). Large vector auto regressions. arXiv:1106.3915v1 [stat.ML].
  30. Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1), 267–288.
  31. Valdés-Sosa, P. A., Sánchez-Bornot, J. M., Lage-Castellanos, A., Vega-Hernández, M., Bosch-Bayard, J., Melie-García, L., et al. (2005). Estimating brain functional connectivity with sparse multivariate autoregression. Philosophical Transactions of the Royal Society B, 360(1457), 969–981.
  32. Wang, H., Li, G., & Tsai, C. (2007). Regression coefficients and autoregressive order shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 69(1), 63–78.
  33. Wei, W. S. (2005). Time series analysis: Univariate and multivariate methods (2nd ed.). Reading, MA: Addison-Wesley.
  34. White, H. (2001). Asymptotic theory for econometricians (Rev. ed.). New York: Academic Press.
  35. Yoon, Y., Park, C., & Lee, T. (2013). Penalized regression models with autoregressive error terms. Journal of Statistical Computation and Simulation, 83(9), 1756–1772.
  36. Zhao, P., & Yu, B. (2006). On model selection consistency of Lasso. Journal of Machine Learning Research, 7, 2541–2563.
  37. Zou, H. (2006). The adaptive LASSO and its oracle properties. Journal of the American Statistical Association, 101, 1418–1429.

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Mathematics, Trent University, Peterborough, Canada
  2. Department of Statistical and Actuarial Sciences, University of Western Ontario, London, Canada