Abstract
In this paper, we introduce a first-order random coefficient integer-valued threshold autoregressive process based on binomial thinning. Basic probabilistic and statistical properties of this model are discussed. Conditional least squares and conditional maximum likelihood estimators are derived for both the case where the threshold variable is known and the case where it is unknown, and the asymptotic properties of these estimators are established. Moreover, the forecasting problem is addressed. Finally, some numerical results on the estimators and a real data example are presented.
References
Al-Osh, M.A., Alzaid, A.A.: First-order integer-valued autoregressive (INAR(1)) process. J. Time Ser. Anal. 8, 261–275 (1987)
Al-Osh, M.A., Alzaid, A.A.: Binomial autoregressive moving average models. Commun. Stat. Stoch. Models 7, 261–282 (1991)
Al-Osh, M.A., Alzaid, A.A.: First order autoregressive time series with negative binomial and geometric marginals. Commun. Stat. Theor. Methods 21, 2483–2492 (1992)
Alzaid, A.A., Al-Osh, M.A.: First-order integer-valued autoregressive (INAR(1)) process: distributional and regression properties. Statistica Neerlandica 42, 53–61 (1988)
Billingsley, P.: Statistical Inference for Markov Processes. The University of Chicago Press, Chicago (1961)
Du, J.G., Li, Y.: The integer-valued autoregressive (INAR(p)) model. J. Time Ser. Anal. 12, 129–142 (1991)
Franke, J., Seligmann, T.: Conditional maximum likelihood estimates for INAR(1) processes and their application to modeling epileptic seizure counts. In: Subba Rao, T. (ed.) Developments in Time Series Analysis, pp. 310–330. Chapman and Hall, London (1993)
Freeland, R.K., McCabe, B.P.M.: Forecasting discrete valued low count time series. Int. J. Forecast. 20, 427–434 (2004)
Joe, H.: Time series models with univariate margins in the convolution-closed infinitely divisible class. J. Appl. Probab. 33, 664–677 (1996)
Jung, R.C., Ronning, G., Tremayne, A.R.: Estimation in conditional first order autoregression with discrete support. Stat. Papers 46, 195–224 (2005)
Klimko, L.A., Nelson, P.I.: On conditional least squares estimation for stochastic processes. Ann. Stat. 6, 629–642 (1978)
Li, D., Ling, S.: On the least squares estimation of multiple-regime threshold autoregressive models. J. Econom. 167, 240–253 (2012)
Li, D., Tong, H.: Nested sub-sample search algorithm for estimation of threshold models. Statistica Sinica 26, 1543–1554 (2016)
McKenzie, E.: Autoregressive moving-average processes with negative-binomial and geometric marginal distributions. Adv. Appl. Probab. 18, 679–705 (1986)
Monteiro, M., Scotto, M.G., Pereira, I.: Integer-valued self-exciting threshold autoregressive processes. Commun. Stat. Theor. Methods 41, 2717–2737 (2012)
Möller, T.A.: Self-exciting threshold models for time series of counts with a finite range. Stoch. Models 32, 77–98 (2016)
Möller, T.A., Silva, M.E., Weiß, C.H., et al.: Self-exciting threshold binomial autoregressive processes. AStA Adv. Stat. Anal. 100, 369–400 (2016)
Robert-Koch-Institut: SurvStat@RKI. http://www3.rki.de/SurvStat. Accessed 2014-07-02 (2014)
Scotto, M.G., Weiß, C.H., Gouveia, S.: Thinning-based models in the analysis of integer-valued time series: a review. Stat. Model. 15, 590–618 (2015)
Steutel, F., Van Harn, K.: Discrete analogues of self-decomposability and stability. Ann. Probab. 7, 893–899 (1979)
Tong, H.: On a Threshold Model. In: Chen, C.H. (ed.) Pattern Recognition and Signal Processing, pp. 575–586. Sijthoff and Noordhoff, Amsterdam (1978)
Tong, H., Lim, K.S.: Threshold autoregressive, limit cycles and cyclical data. J. R. Stat. Soc. Ser. B 42, 245–292 (1980)
Tong, H.: Threshold models in time series analysis—30 years on. Stat. Interface 4, 107–118 (2011)
Thyregod, P., Carstensen, J., Madsen, H., Arnbjerg-Nielsen, K.: Integer valued autoregressive models for tipping bucket rainfall measurements. Environmetrics 10, 395–411 (1999)
Tsay, R.S.: Testing and modeling threshold autoregressive processes. J. Am. Stat. Assoc. 84, 231–240 (1989)
Weiß, C.H.: Thinning operations for modeling time series of counts—a survey. AStA Adv. Stat. Anal. 92, 319–343 (2008)
Weiß, C.H.: The INARCH(1) model for overdispersed time series of counts. Commun. Stat. Simul. Comput. 39, 1269–1291 (2010)
Wang, C., Liu, H., Yao, J., Davis, R.A., Li, W.K.: Self-excited threshold Poisson autoregression. J. Am. Stat. Assoc. 109, 776–787 (2014)
Yang, K., Wang, D., Jia, B., Li, H.: An integer-valued threshold autoregressive process based on negative binomial thinning. Stat. Papers (2017). doi:10.1007/s00362-016-0808-1
Yu, P.: Likelihood estimation and inference in threshold regression. J. Econom. 167, 274–294 (2012)
Zheng, H., Basawa, I.V., Datta, S.: Inference for \(p\)th-order random coefficient integer-valued autoregressive processes. J. Time Ser. Anal. 27, 411–440 (2006)
Zheng, H., Basawa, I.V., Datta, S.: First-order random coefficient integer-valued autoregressive process. J. Stat. Plan. Inference 137, 212–229 (2007)
Acknowledgements
We gratefully acknowledge the anonymous reviewers for their careful work and thoughtful suggestions, which have helped improve this paper substantially. We also acknowledge the financial support of the National Natural Science Foundation of China (Nos. 11271155, 11371168, J1310022, 11571138, 11501241, 11571051, 11301137, 11671168), the National Social Science Foundation of China (16BTJ020), the Science and Technology Research Program of the Education Department of Jilin Province for the 12th Five-Year Plan (440020031139), the Jilin Province Natural Science Foundation (20150520053JH) and the Science and Technology Developing Plan of Jilin Province (20170101061JC).
Appendix
Proof of Lemma 2.1
By the definition of the thinning operator “\(\circ \)” defined in (1.2), we have
where \(f(\cdot )\) denotes the density function of \(\phi _t\), \(\varGamma (\cdot )\) is the Gamma function, and
It is easy to verify that (i) \(0<g(m,q)<1\) and (ii) for fixed m, g(m, q) is monotonically decreasing in q, which implies that \(P(\phi _t\circ X=0)\) is a monotonically decreasing function of q for \(0<q<k\). This completes the proof. \(\square \)
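Assuming \(\phi _t\sim Beta(q,k-q)\), g(m, q) takes the closed form \(g(m,q)=\Gamma (k)\Gamma (k-q+m)/(\Gamma (k-q)\Gamma (k+m))\); the following numerical sketch (our own check, not part of the proof) verifies properties (i) and (ii) for a few parameter values:

```python
import math

def g(m, q, k):
    """g(m, q) = E[(1 - phi)^m] for phi ~ Beta(q, k - q): the probability
    that binomial thinning of X = m leaves zero survivors, computed via
    log-gamma for numerical stability."""
    return math.exp(math.lgamma(k) + math.lgamma(k - q + m)
                    - math.lgamma(k - q) - math.lgamma(k + m))

k = 5.0
for m in (1, 3, 10):
    vals = [g(m, q, k) for q in (0.5, 1.0, 2.0, 3.0, 4.0)]
    assert all(0.0 < v < 1.0 for v in vals)            # property (i)
    assert all(a > b for a, b in zip(vals, vals[1:]))  # property (ii): decreasing in q
```

For \(m=1\) the formula reduces to \(g(1,q)=(k-q)/k\), which makes the monotone decrease in q transparent.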
Proof of Proposition 2.1
It is easy to see that \(\{X_t\}_{t \in \mathbb {Z}}\) is a Markov chain with state space \(\mathbb {N}_0\) and transition probabilities:
where
From the expression above, it follows that the chain is irreducible and aperiodic. Furthermore, since \(\{X_t\}_{t \in \mathbb {Z}}\) is irreducible, to show that it is positive recurrent it suffices to prove that \(\sum _{t=1}^{\infty }P^t(0,0)=+\infty \), where \(P^t(x,y):=P(X_t=y|X_0=x)\). For convenience, we denote
Then (2.1) can be rewritten as
By iterating (7.3) \(t-1\) times, we have
This allows us to write
Denote \(q_{\max }=\max \{q_1,q_2\}\) and let \(\phi _{L,t}\sim Beta(q_{\max },k-q_{\max })\). By Lemma 2.1 and the properties of the binomial distribution, we have
By the proof of Proposition 2.2 in Zheng et al. (2007), we know that \(\lim _{t\rightarrow \infty }L_t\ne 0\), which implies that \(\lim _{t\rightarrow \infty }P^t(0,0)\ne 0\). Therefore, \(\sum _{t=1}^{\infty }P^t(0,0)=+\infty \). This proves that \(\{X_t\}\) is a positive recurrent (and hence ergodic) Markov chain, which ensures the existence of a strictly stationary distribution of (2.1). \(\square \)
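To illustrate the recurrence in practice, a minimal simulation sketch of a chain of the type in (2.1) is given below. The regime rule (thinning coefficient drawn from \(Beta(q_1,k-q_1)\) when \(X_{t-1}\le r\) and from \(Beta(q_2,k-q_2)\) otherwise, with Poisson(\(\lambda \)) innovations) and all parameter values are our assumptions and may differ from the paper's exact specification:

```python
import numpy as np

def simulate_rctinar1(n, q1, q2, k, r, lam, seed=0):
    """Simulate a path of a first-order random coefficient threshold
    INAR-type chain: X_t = phi_t o X_{t-1} + Z_t, where the random
    thinning coefficient phi_t ~ Beta(q_i, k - q_i), the regime i is
    chosen by comparing X_{t-1} to the (assumed) threshold r, and
    Z_t ~ Poisson(lam)."""
    rng = np.random.default_rng(seed)
    x = 0
    path = np.empty(n, dtype=np.int64)
    for t in range(n):
        q = q1 if x <= r else q2                      # assumed regime indicator
        phi = rng.beta(q, k - q)                      # random coefficient
        x = rng.binomial(x, phi) + rng.poisson(lam)   # binomial thinning + innovation
        path[t] = x
    return path

path = simulate_rctinar1(5000, q1=1.0, q2=2.0, k=5.0, r=4, lam=1.5)
# over a long run the chain keeps revisiting small states, in line with positive recurrence
```

Any regime-dependent Beta parameterisation can be plugged into the same loop; only the line choosing q would change.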
Proof of Proposition 2.2
A direct computation shows that, under the stationary distribution,
Similarly, we have
Some similar but tedious calculations show that \(E(X_t^3) < \infty \) and \(E(X_t^4) < \infty \). Combining (7.4) and (7.5), one can see that \(E(X_t^k) < \infty \) for \(k = 1, 2, 3, 4\). \(\square \)
Proof of Proposition 2.3
Results (i)–(iii) are straightforward to verify, so we prove only (iv) and (v). (iv) The variance of \(X_t\) is given by
A direct calculation shows
Similarly, we have
and
Then (iv) follows by substituting (7.7), (7.8) and (7.9) into (7.6) and simplifying.
(v) By the law of total covariance, we have
Thus, the autocorrelation function is \(\rho (h)=\mathrm{Corr}(X_{t},X_{t-h})=(\phi _{1}p_1+\phi _{2}p_2)^h\). \(\square \)
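The recursion behind (v) can be sketched heuristically as follows (the autocovariance notation \(\gamma (h)\) is ours, and we assume the regime probabilities \(p_1,p_2\) enter through the conditional mean as sketched here). For \(h\ge 1\), \(X_{t-h}\) is \(\mathcal {F}_{t-1}\)-measurable, so by the law of total covariance

\[\gamma (h)=E\bigl[\mathrm{Cov}(X_{t},X_{t-h}\mid \mathcal {F}_{t-1})\bigr]+\mathrm{Cov}\bigl(E[X_{t}\mid \mathcal {F}_{t-1}],X_{t-h}\bigr)=0+(\phi _1 p_1+\phi _2 p_2)\,\gamma (h-1).\]

Iterating \(h\) times yields \(\gamma (h)=(\phi _1 p_1+\phi _2 p_2)^{h}\gamma (0)\), and dividing by \(\gamma (0)\) gives the stated autocorrelation.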
Proof of Theorem 3.1
This theorem follows from Theorem 3.1 in Klimko and Nelson (1978). \(\square \)
Proof of Theorems 3.2 and 3.3
Theorems 3.2 and 3.3 are special cases of Theorems 2.1 and 2.2 in Billingsley (1961). As discussed in Franke and Seligmann (1993), we only have to check that the following conditions (C1)–(C6) hold, since they imply the regularity conditions of Theorems 2.1 and 2.2 in Billingsley (1961).
(C1) The set \(\{m:P(Z_t=m)=f(m,\lambda )=\frac{\lambda ^m}{m!}e^{-\lambda }>0\}\) does not depend on \(\lambda \);
(C2) \(E[Z_t^3]=\lambda ^3+3\lambda ^2+\lambda <\infty \);
(C3) \(P(Z_t=m)\) is three times continuously differentiable with respect to \(\lambda \);
(C4) For any \(\lambda '\in B\), where B is an open subset of \(\mathbb {R}\), there exists a neighborhood U of \(\lambda '\) such that:
1. \(\sum _{k=0}^\infty \sup _{\lambda \in U}f(k,\lambda )<\infty \),
2. \(\sum _{k=0}^\infty \sup _{\lambda \in U}|\frac{\partial f(k,\lambda )}{\partial \lambda }|<\infty \),
3. \(\sum _{k=0}^\infty \sup _{\lambda \in U}|\frac{\partial ^2 f(k,\lambda )}{\partial \lambda ^2}|<\infty \);
(C5) For any \(\lambda '\in B\), there exist a neighborhood U of \(\lambda '\) and sequences \(\psi _1(n)=const1 \cdot n\), \(\psi _{11}(n)=const2 \cdot n^2\) and \(\psi _{111}(n)=const3 \cdot n^3\), with suitable constants const1, const2, const3, such that for all \(\lambda \in U\) and all \(m\le n\) with nonvanishing \(f(m,\lambda )\),
and with respect to the stationary distribution of the process \(\{X_t\}\),
(C6) Let \(\varvec{I}(\varvec{\theta })=(\sigma _{ij})_{3\times 3}\) denote the Fisher information matrix, i.e.,
\(\varvec{I}(\varvec{\theta })\) is nonsingular, where \(P(X_1,X_2)\) denotes the transition probability given in (7.1).
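As a quick numerical sanity check of the Poisson third-moment identity asserted in (C2), one can compare the closed form against a direct truncated summation (the routine below is our own illustration):

```python
import math

def poisson_third_moment(lam, tol=1e-12):
    """E[Z^3] for Z ~ Poisson(lam), by direct truncated summation of
    m^3 * P(Z = m)."""
    total, m, term = 0.0, 0, math.exp(-lam)   # term = P(Z = m)
    while term >= tol or m <= 10 * lam + 20:  # sum until far past the mode
        total += (m ** 3) * term
        m += 1
        term *= lam / m                       # P(Z = m) = P(Z = m-1) * lam / m
    return total

lam = 2.5
closed_form = lam ** 3 + 3 * lam ** 2 + lam   # identity in (C2)
assert abs(poisson_third_moment(lam) - closed_form) < 1e-6
```

The same recursive-term summation pattern checks any low-order Poisson moment without factorials overflowing.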
Franke and Seligmann (1993) proved (see also Monteiro et al. 2012) that conditions (C1)–(C4) all hold. Condition (C5) follows from Proposition 2.2 and the properties of the Poisson distribution. Therefore, for the RCTINAR(1) model it is only necessary to verify that the last condition (analogous to condition (C6) in Franke and Seligmann 1993) also holds. To this end, we check that the following statements are all true.
(S1) \(E\left| \frac{\partial }{\partial q_i}\log P(X_1,X_2)\right| ^2 < \infty ,~i=1,2\);

(S2) \(E\left| \frac{\partial }{\partial \lambda }\log P(X_1,X_2)\right| ^2 < \infty \);

(S3) \(E\left| \frac{\partial }{\partial q_i}\log P(X_1,X_2)\frac{\partial }{\partial \lambda }\log P(X_1,X_2)\right| < \infty ,~i=1,2\).
We shall first prove Statement (S1). Recall that for \(i=1,2\),
where \(h(q_i,0,0)=1\) and
With the convention \(\sum _{j=0}^{-1}=0\), we conclude that for \(i=1,2,\)
yielding
By (7.10), (7.11) and (7.12), an immediate consequence is that
which implies
for some suitable constant C.
Next we will prove Statement (S2). Through direct calculation, we obtain
and therefore,
Lastly, by (7.13), (7.14) and (C5) we can conclude that Statement (S3) holds. Therefore, the Fisher information matrix \(\varvec{I}(\varvec{\theta })\) is well defined. Finally, some elementary but tedious calculations show that (C6) is satisfied, too. \(\square \)
Proof of Theorem 4.1
See the proof of Theorem 2 in Freeland and McCabe (2004). \(\square \)
Cite this article
Li, H., Yang, K., Zhao, S. et al. First-order random coefficients integer-valued threshold autoregressive processes. AStA Adv Stat Anal 102, 305–331 (2018). https://doi.org/10.1007/s10182-017-0306-3