Abstract
In this article, we describe an estimating function (EF) approach for circular time series models. We construct EFs based on conditional trigonometric moments of circular discrete-time stochastic processes and provide closed-form expressions for the optimal EF of the model parameters and its associated Godambe information. When the conditional circular mean and concentration are functions of the same parameters of interest, we show that the combined EF is more informative than its component sine and cosine EFs. We discuss recursive estimation of circular model parameters and illustrate the approach on two well-known circular time series models.
References
Artes, R., Paula, G.A. and Ranvaud, R. (2000). Analysis of circular longitudinal data based on generalized estimating equations. Aust. N. Z. J. Stat. 42, 347–358.
Downs, T.D. and Mardia, K.V. (2002). Circular regression. Biometrika 89, 683–698.
Fisher, N.I. and Lee, A.J. (1992). Regression models for an angular response. Biometrics 48, 665–677.
Fisher, N.I. (1993). Statistical Analysis of Circular Data. Cambridge University Press, Cambridge.
Fisher, N.I. and Lee, A.J. (1994). Time series of circular data. J. R. Stat. Soc. Series B 56, 327–339.
Gatto, R. and Jammalamadaka, S.R. (2003). Inference for wrapped symmetric α-stable circular models. Sankhyā 65, 333–355.
Godambe, V.P. (1985). The foundations of finite sample estimation in stochastic process. Biometrika 72, 319–328.
Holzmann, H., Munk, A., Suster, M. and Zucchini, W. (2006). Hidden Markov models for circular and linear-circular time series. Environ. Ecol. Stat. 13, 325–347.
Hughes, G. (2007). Multivariate and time series models for circular data with applications to protein conformational angles, PhD thesis University of Leeds, Leeds, UK.
Mardia, K.V. (1975a). Statistics of directional data. J. R. Stat. Soc. Series B 37, 349–393.
Mardia, K.V. (1975b). Characterizations of directional distributions. In Patil, G.P., Kotz, S. and Ord, J.K. (eds.), Reidel, Dordrecht, p. 365–385.
Mardia, K.V. and Jupp, P.E. (2000). Directional Statistics. Wiley, Chichester.
Mardia, K.V., Hughes, G., Taylor, C.C. and Singh, H. (2008). A multivariate von Mises distribution with applications to bioinformatics. Can. J. Stat. 36, 99–109.
Nadarajah, S. and Zhang, Y. (2017). Wrapped: an R package for circular data. PLoS ONE 12, e0188512. https://doi.org/10.1371/journal.pone.0188512.
Pewsey, A., Neuhauser, M. and Ruxton, G. (2013). Circular Statistics in R. Oxford University Press, Oxford.
Rao, C.R. (1973). Linear Statistical Inference and its Applications, 2nd edn. Wiley, New York.
Rivest, L.-P. (1988). A distribution for dependent unit vectors. Comm. Stat. Theory Methods 17, 461–483.
Samorodnitsky, G. and Taqqu, M.S. (1994). Stable Non-Gaussian Random Processes: Stochastic Models with Infinite Variance. Chapman & Hall, New York.
Singh, H., Hnizdo, V. and Demchuk, E. (2002). Probabilistic model for two dependent circular variables. Biometrika 89, 719–723.
Stienne, G., Reboul, S., Azmani, M., Choquel, J.B. and Benjelloun, M.A. (2014). Multi-temporal multi-sensor circular fusion filter. Inf. Fusion 18, 86–100.
Thavaneswaran, A., Ravishanker, N. and Liang, Y. (2013). Inference for linear and nonlinear stable error processes via estimating functions. J. Stat. Plan. Inference 143, 827–841.
Thavaneswaran, A., Ravishanker, N. and Liang, Y. (2015). Generalized duration models and optimal estimation using estimating functions. Ann. Inst. Stat. Math. 67, 129–156.
Acknowledgements
The authors are grateful to referees for their suggestions and to Prof. N. Balakrishnan for discussions that helped to considerably improve the paper. The first author acknowledges support from an NSERC grant.
Author information
Authors and Affiliations
Additional information
Appendix A: The Godambe EF Approach for Stochastic Processes
Let \(\mathbf {y}_{n}=(y_{1},{\ldots } , y_{n})^{\prime }\) denote the observations, let \(({\Omega }, \mathcal {F}, P_{\boldsymbol {\theta }})\) denote the underlying probability space, and let \(\mathcal {F}_{t}\) be the σ-field generated by {y1,…,yt}, t ≥ 1. Let ht(yt, 𝜃), 1 ≤ t ≤ n, be specified q-dimensional martingale differences (MDs). Let \({\mathscr{M}}\) denote the class of zero mean, square integrable k-dimensional martingale EFs of the form
\[ \mathbf {g}(\mathbf {y}_{n},\boldsymbol {\theta }) = \sum\limits_{t=1}^{n} \mathbf {a}_{t-1}(\boldsymbol {\theta })\, \mathbf {h}_{t}(y_{t},\boldsymbol {\theta }), \]
where at− 1(𝜃) are k × q matrices depending on 𝜃 and on \(\mathbf {y}_{t-1}\), 1 ≤ t ≤ n. It is assumed that the EFs g(yn, 𝜃) are almost surely differentiable with respect to the components of 𝜃, that \(E \!\left (\left .\!\! \frac {\partial \mathbf {g}(\mathbf {y}_{n},\boldsymbol {\theta })}{\partial \boldsymbol {\theta }} \!\right |\mathcal {F}_{n-1} \!\!\right )\) is nonsingular, and that \(E (\mathbf {g}(\mathbf {y}_{n},\boldsymbol {\theta }) \mathbf {g} (\mathbf {y}_{n},\boldsymbol {\theta })^{\prime }|\mathcal {F}_{n-1}) \) is positive definite (p.d.) for all 𝜃 and for each n ≥ 1. Expectations are always taken with respect to P𝜃. Estimators of 𝜃 are obtained by solving the estimating equation (EE) g(yn, 𝜃) = 0. To simplify notation in what follows, we write g and ht for g(yn, 𝜃) and ht(yt, 𝜃), respectively.
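As a minimal numerical sketch of solving an EE of this form, consider a toy linear AR(1) model (an illustration only, not one of the paper's circular models): with ht = yt − 𝜃yt− 1 and the F t− 1-measurable weight at− 1 = yt− 1, the EE has a closed-form root.

```python
import numpy as np

# Toy AR(1): y_t = theta * y_{t-1} + e_t with constant conditional variance.
# h_t = y_t - theta*y_{t-1} is a martingale difference; with weight
# a_{t-1} = y_{t-1}, the EE g(y_n, theta) = sum_t a_{t-1} h_t = 0
# has an explicit solution.

rng = np.random.default_rng(0)
theta_true, n = 0.6, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = theta_true * y[t - 1] + rng.normal()

# Solve sum_t y_{t-1} (y_t - theta * y_{t-1}) = 0 for theta.
theta_hat = np.sum(y[:-1] * y[1:]) / np.sum(y[:-1] ** 2)
print(round(theta_hat, 2))
```

For this toy model the EE root coincides with the conditional least squares estimator; for the circular models in the paper the MDs are built from conditional trigonometric moments instead.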
Theorem 2.
The optimality result for EFs states that in the class of all zero mean, square integrable martingale EFs \({\mathscr{M}}\), the optimal EF g∗ maximizes, in the partial order of nonnegative definite matrices, the information matrix
\[ \mathbf {I}_{\mathbf {g}}(\boldsymbol {\theta }) = \left ( E\!\left [ \frac {\partial \mathbf {g}}{\partial \boldsymbol {\theta }^{\prime }} \right ] \right )^{\prime } \left ( E\!\left [ \mathbf {g}\, \mathbf {g}^{\prime } \right ] \right )^{-1} E\!\left [ \frac {\partial \mathbf {g}}{\partial \boldsymbol {\theta }^{\prime }} \right ]. \]
The optimal EF and corresponding optimal information are given by
\[ \mathbf {g}^{\ast }(\mathbf {y}_{n},\boldsymbol {\theta }) = \sum\limits_{t=1}^{n} \mathbf {a}^{\ast }_{t-1}(\boldsymbol {\theta })\, \mathbf {h}_{t}, \qquad \mathbf {a}^{\ast }_{t-1}(\boldsymbol {\theta }) = \left ( E\!\left [ \left . \frac {\partial \mathbf {h}_{t}}{\partial \boldsymbol {\theta }^{\prime }} \right | \mathcal {F}_{t-1} \right ] \right )^{\prime } \left ( E\!\left [ \mathbf {h}_{t} \mathbf {h}_{t}^{\prime } \,\big |\, \mathcal {F}_{t-1} \right ] \right )^{-1}, \]
\[ \mathbf {I}_{\mathbf {g}^{\ast }}(\boldsymbol {\theta }) = \sum\limits_{t=1}^{n} \left ( E\!\left [ \left . \frac {\partial \mathbf {h}_{t}}{\partial \boldsymbol {\theta }^{\prime }} \right | \mathcal {F}_{t-1} \right ] \right )^{\prime } \left ( E\!\left [ \mathbf {h}_{t} \mathbf {h}_{t}^{\prime } \,\big |\, \mathcal {F}_{t-1} \right ] \right )^{-1} E\!\left [ \left . \frac {\partial \mathbf {h}_{t}}{\partial \boldsymbol {\theta }^{\prime }} \right | \mathcal {F}_{t-1} \right ]. \]
From Eq. (A.2), the recursive estimate of 𝜃 becomes
for t = 1,…,n, where Ik is the k-dimensional identity matrix. When 𝜃 is a scalar parameter, its recursive estimate is given by
where
\(A_{t} = a^{\ast }_{t-1} (\widehat {\theta }_{t-1}) \frac {\partial h_{t} (\widehat {\theta }_{t-1})}{\partial \theta } +\frac {\partial a^{\ast }_{t-1} (\widehat {\theta }_{t-1})}{\partial \theta } h_{t} (\widehat {\theta }_{t-1})\).
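The scalar recursion can be sketched numerically on the same toy AR(1) model used above (an assumption for illustration, not the paper's circular recursion, whose weights involve trigonometric moments). With ht = yt − 𝜃yt− 1 and a∗t− 1 = yt− 1, the update takes the familiar recursive least squares form.

```python
import numpy as np

# Recursive EF estimation sketch for a toy AR(1) model (illustrative
# assumption, not the paper's circular models).  At each step the MD
# h_t is evaluated at the previous estimate, and a scalar gain K is
# updated before the estimate itself.

rng = np.random.default_rng(1)
theta_true, n = 0.6, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = theta_true * y[t - 1] + rng.normal()

theta_hat, K = 0.0, 1.0                   # initial estimate and gain
for t in range(1, n):
    a = y[t - 1]                          # optimal weight a*_{t-1}
    h = y[t] - theta_hat * y[t - 1]       # MD evaluated at theta_hat_{t-1}
    K = K / (1.0 + K * a * a)             # gain update
    theta_hat = theta_hat + K * a * h     # recursive estimate update

print(round(theta_hat, 2))
```

Each observation updates the estimate in O(1) work, which is the practical appeal of the recursive form over re-solving the full EE at every t.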
When the conditional mean and variance are functions of the same parameter, the following Lemma 1 gives the form of the combined EF based on two non-orthogonal martingale differences, which is more informative than each of its component EFs; see Thavaneswaran et al. (2015) for more details. Consider a discrete time stochastic process {yt, t = 1,2,…} with first two conditional moments given by \(\mu _{t}(\boldsymbol \theta ) = E(y_{t} \,|\, \mathcal {F}_{t-1})\) and \({\sigma _{t}^{2}}(\boldsymbol \theta ) = \text {Var}(y_{t} \,|\, \mathcal {F}_{t-1})\).
To estimate the parameter 𝜃 based on the observations \(\mathbf {y}{_{n}}=(y_{1}, \ldots , y_{n})^{\prime }\), consider two MDs for t = 1,…,n, i.e., mt(𝜃) = yt − μt(𝜃) and \(s_{t} (\boldsymbol \theta ) = {m_{t}^{2}} (\boldsymbol \theta ) - {\sigma _{t}^{2}} (\boldsymbol \theta )\), with quadratic variations and covariation given by 〈m〉t, 〈s〉t and 〈m, s〉t.
Lemma 1.
In the class of all combined EFs of the form \(\mathcal {G}_{C} =\{\mathbf {g}_{C} (\boldsymbol \theta ): \mathbf {g}_{C} (\boldsymbol \theta ) = {\sum }_{t=1}^{n} \left (\mathbf {a}_{t-1} m_{t} + \mathbf {b}_{t-1} s_{t}\right )\}\),
(a) the optimal EF is given by \(\mathbf {g}_{C}^{\ast } (\boldsymbol \theta ) = {\sum }_{t=1}^{n} \left (\mathbf {a}_{t-1}^{*} m_{t} + \mathbf {b}_{t-1}^{*} s_{t}\right )\), where
\[ \mathbf {a}_{t-1}^{\ast } = \left (1 - \frac {\langle m,s \rangle _{t}^{2}}{\langle m \rangle _{t} \langle s \rangle _{t}} \right )^{-1} \left ( -\frac {\partial \mu _{t}}{\partial \boldsymbol \theta } \frac {1}{\langle m \rangle _{t}} + \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta } \frac {\langle m,s \rangle _{t}}{\langle m \rangle _{t} \langle s \rangle _{t}} \right ), \qquad \mathbf {b}_{t-1}^{\ast } = \left (1 - \frac {\langle m,s \rangle _{t}^{2}}{\langle m \rangle _{t} \langle s \rangle _{t}} \right )^{-1} \left ( \frac {\partial \mu _{t}}{\partial \boldsymbol \theta } \frac {\langle m,s \rangle _{t}}{\langle m \rangle _{t} \langle s \rangle _{t}} - \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta } \frac {1}{\langle s \rangle _{t}} \right ); \]
(b) the information \(\mathbf {I}_{\mathbf {g}_{C}^{\ast }} (\boldsymbol \theta )\) is given by
\[ \mathbf {I}_{\mathbf {g}_{C}^{\ast }}(\boldsymbol \theta ) = \sum\limits_{t=1}^{n} \left (1 - \frac {\langle m,s \rangle _{t}^{2}}{\langle m \rangle _{t} \langle s \rangle _{t}} \right )^{-1} \left ( \frac {1}{\langle m \rangle _{t}} \frac {\partial \mu _{t}}{\partial \boldsymbol \theta } \frac {\partial \mu _{t}}{\partial \boldsymbol \theta ^{\prime }} + \frac {1}{\langle s \rangle _{t}} \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta } \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta ^{\prime }} - \frac {\langle m,s \rangle _{t}}{\langle m \rangle _{t} \langle s \rangle _{t}} \left ( \frac {\partial \mu _{t}}{\partial \boldsymbol \theta } \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta ^{\prime }} + \frac {\partial {\sigma _{t}^{2}}}{\partial \boldsymbol \theta } \frac {\partial \mu _{t}}{\partial \boldsymbol \theta ^{\prime }} \right ) \right ). \]
The proof of this lemma is similar to the proof for combining linear and quadratic EFs shown in Thavaneswaran et al. (2015) and is omitted here.
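The message of Lemma 1 can be checked numerically on a toy scalar model (an assumption chosen for simplicity, not one of the paper's circular models): take yt i.i.d. N(𝜃, 𝜃) with 𝜃 > 0, so the conditional mean and variance share the same parameter. Here 〈m〉t = 𝜃, 〈s〉t = 2𝜃2 (normal fourth moment), and 〈m, s〉t = 0.

```python
# Numerical check that the combined EF is at least as informative as its
# components, for the toy model y_t ~ N(theta, theta) i.i.d. (illustrative
# assumption).  With <m> = theta, <s> = 2*theta^2, <m,s> = 0, the combined
# information is the sum of the component informations.

def informations(theta, n):
    dmu, dsig2 = 1.0, 1.0                 # d(mu)/d(theta), d(sigma^2)/d(theta)
    vm, vs, cms = theta, 2 * theta**2, 0.0  # <m>_t, <s>_t, <m,s>_t
    rho2 = cms**2 / (vm * vs)
    I_m = n * dmu**2 / vm                 # information of the mean-based EF
    I_s = n * dsig2**2 / vs               # information of the variance-based EF
    I_c = n / (1 - rho2) * (dmu**2 / vm + dsig2**2 / vs
                            - 2 * cms * dmu * dsig2 / (vm * vs))
    return I_m, I_s, I_c

I_m, I_s, I_c = informations(theta=2.0, n=100)
print(I_m, I_s, I_c)  # → 50.0 12.5 62.5
```

Because 〈m, s〉t = 0 here, the combined information is exactly the sum of the component informations; with non-orthogonal MDs the cross term in Lemma 1(b) also contributes.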
Cite this article
Thavaneswaran, A., Ravishanker, N. Estimating Functions for Circular Time Series Models. Sankhya A 85, 198–213 (2023). https://doi.org/10.1007/s13171-020-00237-w