
Identification of Threshold Autoregressive Moving Average Models

Chapter in Advances in Time Series Methods and Applications

Part of the book series: Fields Institute Communications (FIC, volume 78)


Abstract

Because a suitable modeling procedure has been lacking, and because identifying the threshold variable and estimating the threshold values are difficult, the multi-regime threshold autoregressive moving average (TARMA) model has attracted little attention in applications. The chief goal of this paper is therefore to propose a simple yet widely applicable modeling procedure for multi-regime TARMA models. In the no-threshold case, we use the extended least squares estimate (ELSE) and arranged linear regression to obtain a test statistic \(\hat{F}\), which is shown to follow an approximate F distribution. Based on \(\hat{F}\), we then employ scatter plots to identify the number and locations of the potential thresholds. Finally, a TARMA model is built using these statistics together with the Akaike information criterion (AIC). Simulation experiments and an application to real data demonstrate that both the test statistic and the model-building procedure work well for TARMA models.
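To make the arranged-regression idea concrete, the following is a minimal Python sketch of a Tsay (1989)-style F test with an AR part only. It is an illustration of the mechanics (arrange cases by the threshold variable, run recursive least squares, collect standardized predictive residuals, form an F ratio), not the chapter's actual \(\hat{F}\) statistic, which also involves MA terms fitted by ELSE and different degrees of freedom; the function name and defaults are hypothetical.

```python
import numpy as np

def tsay_f_test(y, p=1, d=1, m=None):
    """Arranged-autoregression F test for threshold nonlinearity
    (Tsay-1989 style sketch; AR part only -- a simplification of the
    chapter's TARMA setting).  Returns the F ratio; large values
    suggest a threshold effect."""
    n = len(y)
    start = max(p, d)
    # regression rows X_t = (1, y_{t-1}, ..., y_{t-p}) and targets y_t
    X = np.column_stack([np.ones(n - start)] +
                        [y[start - i:n - i] for i in range(1, p + 1)])
    target = y[start:]
    z = y[start - d:n - d]            # threshold variable y_{t-d}
    order = np.argsort(z)             # arrange cases by y_{t-d}
    Xs, ys = X[order], target[order]
    m = m or (p + 1) * 5              # size of the initial subsample
    e = []                            # standardized predictive residuals
    for t in range(m, len(ys)):
        beta, *_ = np.linalg.lstsq(Xs[:t], ys[:t], rcond=None)
        resid = ys[:t] - Xs[:t] @ beta
        s = np.sqrt(resid @ resid / (t - p - 1))
        h = Xs[t] @ np.linalg.pinv(Xs[:t].T @ Xs[:t]) @ Xs[t]
        e.append((ys[t] - Xs[t] @ beta) / (s * np.sqrt(1 + h)))
    e = np.array(e)
    # regress predictive residuals on the regressors; under linearity
    # there should be no remaining structure
    Xe = Xs[m:]
    gamma, *_ = np.linalg.lstsq(Xe, e, rcond=None)
    a = e - Xe @ gamma
    num = (e @ e - a @ a) / (p + 1)
    den = (a @ a) / (len(e) - p - 1)
    return num / den                  # ~ F(p+1, len(e)-p-1) under linearity
```

Under a linear AR null the returned ratio is approximately F distributed, so a p-value can be read off the corresponding F distribution; for a true threshold process the ratio tends to be inflated.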


References

  1. Akaike, H. (1974). A new look at statistical model identification. IEEE Transactions on Automatic Control, 19, 716–722.

  2. Billingsley, P. (1961). The Lindeberg–Levy theorem for martingales. Proceedings of the American Mathematical Society, 12, 788–792.

  3. Brockwell, P., Liu, J., & Tweedie, R. L. (1992). On the existence of stationary threshold autoregressive moving-average processes. Journal of Time Series Analysis, 13, 95–107.

  4. Christopeit, N., & Helmes, K. (1980). Strong consistency of least squares estimators in linear regression models. The Annals of Statistics, 4, 778–788.

  5. Chan, K. S. (1990). Testing for threshold autoregression. The Annals of Statistics, 18, 1886–1894.

  6. Chan, K. S., Petruccelli, J. D., Tong, H., & Woolford, S. W. (1985). A multiple-threshold AR(1) model. Journal of Applied Probability, 22, 267–279.

  7. Chan, K. S., & Tong, H. (1986). On estimating thresholds in autoregressive models. Journal of Time Series Analysis, 7, 179–190.

  8. Chen, W. S. C., So, K. P. M., & Liu, F. C. (2011). A review of threshold time series models in finance. Statistics and Its Interface, 4, 167–181.

  9. Cryer, J. D., & Chan, K. S. (2008). Time series analysis with applications in R. Springer texts in statistics (2nd ed.). Berlin: Springer.

  10. de Gooijer, J. G. (1998). On threshold moving-average models. Journal of Time Series Analysis, 19, 1–18.

  11. Ertel, J. E., & Fowlkes, E. B. (1976). Some algorithms for linear spline and piecewise multiple linear regression. Journal of the American Statistical Association, 71, 640–648.

  12. Goodwin, G. C., & Payne, R. L. (1977). Dynamic system identification: Experiment design and data analysis. New York, NY: Academic Press.

  13. Haggan, V., Heravi, S. M., & Priestley, M. B. (1984). A study of the application of state-dependent models in nonlinear time series analysis. Journal of Time Series Analysis, 5, 69–102.

  14. Hannan, E. J., & Rissanen, J. (1982). Recursive estimation of mixed autoregressive-moving average order. Biometrika, 69, 81–94.

  15. Hansen, B. E. (2000). Sample splitting and threshold estimation. Econometrica, 68, 575–603.

  16. Keenan, D. M. (1985). A Tukey nonadditivity-type test for time series nonlinearity. Biometrika, 72, 39–44.

  17. Lai, T. L., & Wei, C. Z. (1982). Least squares estimates in stochastic regression models with applications to identification and control of dynamic systems. The Annals of Statistics, 10, 154–166.

  18. Li, G., & Li, W. K. (2011). Testing a linear time series model against its threshold extension. Biometrika, 98, 243–250.

  19. Li, D., Li, W. K., & Ling, S. (2011). On the least squares estimation of threshold autoregressive and moving-average models. Statistics and Its Interface, 4, 183–196.

  20. Li, D., & Ling, S. (2012). On the least squares estimation of multiple-regime threshold autoregressive models. Journal of Econometrics, 167, 240–253.

  21. Liang, R., Niu, C., Xia, Q., & Zhang, Z. (2015). Nonlinearity testing and modeling for threshold moving average models. Journal of Applied Statistics, 42, 2614–2630.

  22. Ling, S. (1999). On the probabilistic properties of a double threshold ARMA conditional heteroskedastic model. Journal of Applied Probability, 36, 688–705.

  23. Ling, S., & Tong, H. (2005). Testing a linear moving-average model against threshold moving-average models. The Annals of Statistics, 33, 2529–2552.

  24. Ling, S., Tong, H., & Li, D. (2007). The ergodicity and invertibility of threshold moving-average models. Bernoulli, 13, 161–168.

  25. Liu, J., & Susko, E. (1992). On strict stationarity and ergodicity of a nonlinear ARMA model. Journal of Applied Probability, 29, 363–373.

  26. Ljung, L., & Soderstrom, T. (1983). Theory and practice of recursive identification. Cambridge: MIT Press.

  27. Priestley, M. B. (1980). State-dependent models: A general approach to nonlinear time series analysis. Journal of Time Series Analysis, 1, 47–71.

  28. Qian, L. (1998). On maximum likelihood estimators for a threshold autoregression. Journal of Statistical Planning and Inference, 75, 21–46.

  29. Tong, H. (1978). On a threshold model. In C. H. Chen (Ed.), Pattern recognition and signal processing (pp. 101–141). Amsterdam: Sijthoff and Noordhoff.

  30. Tong, H., & Lim, K. S. (1980). Threshold autoregressions, limit cycles, and data. Journal of the Royal Statistical Society B, 42, 245–292.

  31. Tong, H. (1990). Non-linear time series: A dynamical system approach. Oxford: Oxford University Press.

  32. Tsay, R. S. (1986). Nonlinearity tests for time series. Biometrika, 73, 461–466.

  33. Tsay, R. S. (1987). Conditional heteroscedastic time series models. Journal of the American Statistical Association, 82, 590–604.

  34. Tsay, R. S. (1989). Testing and modeling threshold autoregressive process. Journal of the American Statistical Association, 84, 231–240.

  35. Tsay, R. S. (2005). Analysis of financial time series (2nd ed.). London: Wiley.

  36. Wong, C. S., & Li, W. K. (1997). Testing for threshold autoregression with conditional heteroscedasticity. Biometrika, 84, 407–418.

  37. Wong, C. S., & Li, W. K. (2000). Testing for double threshold autoregressive conditional heteroscedastic model. Statistica Sinica, 10, 173–189.


Acknowledgements

We thank the two referees for their criticisms and suggestions, which have led to improvements of the paper. The research of Qiang Xia was supported by the National Social Science Foundation of China (No. 12CTJ019) and the Ministry of Education in China Project of Humanities and Social Sciences (Project No. 11YJCZH195). The research of Heung Wong was supported by a grant of the Research Committee of The Hong Kong Polytechnic University (Code: G-YBCV).

Author information

Correspondence to Heung Wong.

Appendix: Proofs of Theorems

Proof

(Theorem 1) For each regime of model (1.1), under Assumption 4.1, we substitute the fitted residuals \(\{\hat{\varepsilon }^{(j)}_{t-i}, i=1,\ldots ,q_j\}\) for \(\{\varepsilon ^{(j)}_{t-i}, i=1,\ldots ,q_j\}\) using ELSE. Within each regime, model (1.1) then becomes a linear regression model, so we can obtain the least squares estimate \(\hat{\varPhi }^{(j)}\) of the jth regime. Under Assumptions 1–4, the conditions of Theorem 1 of Lai and Wei [17] and Theorem 2 of Liang et al. [21] are fulfilled. Therefore, for given k, d, and threshold values \(r_j\), the least squares estimates \(\{\hat{\varPhi }^{(j)}, j=1,2,\ldots ,l\}\) converge to \(\{\varPhi ^{(j)}, j=1,2,\ldots ,l\}\) almost surely. \(\square \)

Proof

(Theorem 2) Consider the observations \(\{y_t, t=1, 2, \ldots , n\}\) and define

$$X_t = (1,y_{t-1},\ldots ,y_{t-p},\hat{\varepsilon }_{t-1},\ldots ,\hat{\varepsilon }_{t-q}),$$

with \(\hat{\varepsilon }_{t-i}\)’s being the residuals for model (1.1) fitted by the Hannan–Rissanen algorithm or ELSE.
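The residual-substitution step above (first obtain residual proxies, then run a linear regression that includes their lags) can be sketched as follows for a single regime. This is a minimal illustration of the Hannan–Rissanen-style two-stage idea, not the chapter's exact ELSE algorithm; the function name and the long-AR order are illustrative choices.

```python
import numpy as np

def two_stage_arma_fit(y, p=1, q=1, long_p=10):
    """Two-stage (Hannan-Rissanen-style) fit of an ARMA(p, q) model.

    Stage 1: fit a long AR(long_p) by least squares and take its
    residuals as proxies eps_hat for the unobserved innovations.
    Stage 2: regress y_t on (1, y_{t-1..t-p}, eps_hat_{t-1..t-q}),
    i.e. the linear regression used in the proof once the fitted
    residuals replace the true ones."""
    n = len(y)
    # Stage 1: long autoregression to obtain residual proxies
    X1 = np.column_stack([np.ones(n - long_p)] +
                         [y[long_p - i:n - i] for i in range(1, long_p + 1)])
    b1, *_ = np.linalg.lstsq(X1, y[long_p:], rcond=None)
    eps = np.concatenate([np.zeros(long_p), y[long_p:] - X1 @ b1])
    # Stage 2: linear regression on lagged y's and lagged residual proxies
    s = long_p + max(p, q)
    X2 = np.column_stack(
        [np.ones(n - s)]
        + [y[s - i:n - i] for i in range(1, p + 1)]
        + [eps[s - i:n - i] for i in range(1, q + 1)])
    b2, *_ = np.linalg.lstsq(X2, y[s:], rcond=None)
    return b2  # (phi_0, phi_1, ..., phi_p, theta_1, ..., theta_q)
```

Once the residual proxies are in place, the second stage is ordinary least squares, which is what allows the proof to treat each regime as a linear regression.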

Also define \(\varPhi , A_n, V_n\) by

$$\varPhi ' = (\phi _0, \phi _1, \ldots , \phi _p, \theta _1, \ldots , \theta _q),$$
$$A_n = (n-p-q)^{-1}\sum X_t'X_t, \quad V_n = (n-p-q)^{-1}\sum X_t'y_t.$$

Then the least squares estimate of \(\varPhi \) and the residuals are

$$\hat{\varPhi } = {A^{-1}_n}V_n,$$
$$\hat{e}_t = y_t - \hat{y}_t = X_t\varPhi + e_t - X_t\hat{\varPhi } = X_t(\varPhi - \hat{\varPhi }) + e_t.$$

Also define \(\varPsi _n\) and \(\hat{a}_t\) by

$$\varPsi _n = (n-p-q)^{-1}\sum X_t'\hat{e}_t$$
$$= (n-p-q)^{-1}\sum X_t'X_t(\varPhi - \hat{\varPhi })+ (n-p-q)^{-1}\sum X_t'e_t$$
$$= {A_n}(\varPhi - \hat{\varPhi })+ (n-p-q)^{-1}\sum X_t'e_t,$$
$$\hat{a}_t = \hat{e}_t - X_t{A^{-1}_n}\varPsi _n.$$

Hence,

$$\begin{aligned} \begin{array}{llllllll} \big (\sum \hat{e}^2_t-\sum \hat{a}^2_t\big )/(n-p-q) \\ = \big [\sum \hat{e}^2_t - \sum (\hat{e}_t - X_t{A^{-1}_n}\varPsi _n)^2 \big ]/(n-p-q)\\ = \big [\sum \hat{e}_t'\hat{e}_t - \sum (\hat{e}_t - X_t{A^{-1}_n}\varPsi _n)'(\hat{e}_t - X_t{A^{-1}_n}\varPsi _n)\big ]/(n-p-q)\\ = \big [2\varPsi _n'{A^{-1}_n}\sum X_t'\hat{e}_t - (n-p-q) \varPsi _n'{A^{-1}_n}\varPsi _n\big ]/(n-p-q)\\ =\varPsi _n'{A^{-1}_n}\varPsi _n\\ = \big [{A_n}(\varPhi - \hat{\varPhi }) + (n-p-q)^{-1}\sum X_t'e_t\big ]'{A^{-1}_n}\big [{A_n}(\varPhi - \hat{\varPhi }) + (n-p-q)^{-1}\sum X_t'e_t\big ] \end{array} \end{aligned}$$
(3.6)

Because \(X_t\) depends only on \(\{y_{t-k}, \hat{\varepsilon }_{t-l}, k = 1,\ldots ,p, l = 1,\ldots ,q\}\), it is independent of \(e_t\). Hence \((n-p-q)^{-\frac{1}{2}}\sum X_t'e_t\) forms a stationary and ergodic martingale difference sequence, and its asymptotic normality follows from a multivariate version of the martingale central limit theorem [2]. By Theorem 2.1, \(\hat{\varPhi } - \varPhi \rightarrow 0\) almost surely. Since \(X_t\) is \((p+q+1)\)-dimensional, (4) follows approximately an F distribution with \(p + q + 1\) and \(n-d-b-p-q-h\) degrees of freedom. From another point of view, \(\sum \hat{a}^2_t/\sigma ^2\) is approximately a chi-square random variable with \(n-d-b-h-p-q\) degrees of freedom, and the numerator and denominator of (4) have the same asymptotic variance \(\sigma ^2\). Hence \((p + q + 1)\hat{F}(p, q, d)\) is asymptotically a chi-square random variable with \(p + q + 1\) degrees of freedom, which is a straightforward generalization of Corollary 3.1 of Keenan [16] or Tsay [32]. This proves Theorem 2.2. \(\square \)
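The chain of equalities in (3.6) rests on the algebraic identity \(\sum \hat{e}^2_t-\sum \hat{a}^2_t=(n-p-q)\,\varPsi _n'{A^{-1}_n}\varPsi _n\), which holds for any residual vector once \(\hat{a}_t\), \(\varPsi _n\), and \(A_n\) are defined as above. It can be checked numerically on synthetic data; all names below are illustrative, and \(N\) stands for \(n-p-q\).

```python
import numpy as np

# Numerical check of the identity behind (3.6):
#   sum(e_hat^2) - sum(a_hat^2) = N * Psi_n' A_n^{-1} Psi_n,
# where a_hat_t = e_hat_t - X_t A_n^{-1} Psi_n,
#       Psi_n   = N^{-1} sum X_t' e_hat_t,
#       A_n     = N^{-1} sum X_t' X_t.
rng = np.random.default_rng(1)
N, k = 200, 4                        # k plays the role of p + q + 1
X = rng.normal(size=(N, k))          # rows play the role of X_t
e_hat = rng.normal(size=N)           # any residual-like vector works
A = X.T @ X / N
Psi = X.T @ e_hat / N
a_hat = e_hat - X @ np.linalg.solve(A, Psi)
lhs = e_hat @ e_hat - a_hat @ a_hat
rhs = N * Psi @ np.linalg.solve(A, Psi)
assert np.isclose(lhs, rhs)          # the identity holds exactly
```

The identity follows by expanding \(\sum \hat{a}^2_t\) and using \(\sum X_t'\hat{e}_t = N\varPsi _n\) and \(\sum X_t'X_t = NA_n\), exactly as in the third and fourth lines of (3.6).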

Rights and permissions

Reprints and permissions

Copyright information

© 2016 Springer Science+Business Media New York

About this chapter

Cite this chapter

Xia, Q., Wong, H. (2016). Identification of Threshold Autoregressive Moving Average Models. In: Li, W., Stanford, D., Yu, H. (eds) Advances in Time Series Methods and Applications. Fields Institute Communications, vol 78. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-6568-7_9

