Inference for continuous-time long memory randomly sampled processes

Abstract

From a continuous-time long memory stochastic process, a discrete-time randomly sampled one is drawn using a renewal sampling process. We establish the existence of the spectral density of the sampled process and give its expression in terms of that of the initial process. We also investigate several aspects of statistical inference for the sampled process. In particular, we obtain asymptotic results for the periodogram, the local Whittle estimator of the memory parameter, and the long-run variance of partial sums. We mainly focus on Gaussian continuous-time processes; the challenge is that the randomly sampled process is no longer jointly Gaussian.
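
To fix ideas, the following minimal Python sketch (assuming NumPy; the covariance \(\sigma _X(t)=(1+|t|)^{-a}\), \(0<a<1\), is an illustrative long-memory choice, positive definite by Pólya's criterion, and is not taken from the paper) simulates the scheme: renewal times are drawn first, and the sampled values \(Y_k=X(T_k)\) are Gaussian conditionally on those times.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigma_X(t, a=0.6):
        # illustrative long-memory covariance: (1 + |t|)^(-a) decays like |t|^(-a);
        # it is convex and decreasing on [0, inf), hence a valid covariance
        # function by Polya's criterion
        return (1.0 + np.abs(t)) ** (-a)

    n = 500
    # renewal sampling times T_1 < ... < T_n (unit-rate Poisson renewal process)
    T = np.cumsum(rng.exponential(scale=1.0, size=n))
    # conditionally on T, (Y_1, ..., Y_n) = (X(T_1), ..., X(T_n)) is a zero-mean
    # Gaussian vector with covariance matrix sigma_X(T_j - T_i)
    C = sigma_X(T[:, None] - T[None, :])
    Y = np.linalg.cholesky(C + 1e-9 * np.eye(n)) @ rng.standard_normal(n)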

References

  • Abadir KM, Distaso W, Giraitis L (2009) Two estimators of the long-run variance: beyond short memory. J Econom 150(1):56–70

  • Axler S, Bourdon P, Ramey W (2000) Harmonic function theory. Graduate texts in mathematics, vol 137. Springer, New York

  • Bardet J-M, Bertrand P (2010) A non-parametric estimator of the spectral density of a continuous-time Gaussian process observed at random times. Scand J Stat 37(3):458–476

  • Beran J, Feng Y, Ghosh S, Kulik R (2013) Long-memory processes: probabilistic properties and statistical methods. Springer, Heidelberg

  • Brillinger DR (1969) The calculation of cumulants via conditioning. Ann Inst Stat Math 21(1):215–218

  • Chambers MJ (1996) The estimation of continuous parameter long-memory time series models. Econom Theory 12(2):374–390

  • Comte F (1996) Simulation and estimation of long memory continuous time models. J Time Ser Anal 17(1):19–36

  • Dacorogna MM (2001) An introduction to high-frequency finance. Academic Press, San Diego

  • Dalla V, Giraitis L, Hidalgo J (2006) Consistent estimation of the memory parameter for nonlinear time series. J Time Ser Anal 27(2):211–251

  • Doukhan P, Oppenheim G, Taqqu M (2003) Theory and applications of long-range dependence. Birkhäuser, Boston

  • Giraitis L, Leipus R, Philippe A (2006) A test for stationarity versus trends and unit roots for a wide class of dependent errors. Econom Theory 22:989–1029

  • Giraitis L, Koul HL, Surgailis D (2012) Large sample inference for long memory processes. Imperial College Press, London

  • Gradshteyn IS, Ryzhik IM (2015) Table of integrals, series, and products, 8th edn. Elsevier/Academic Press, Amsterdam

  • Hurvich CM, Beltrão KI (1993) Asymptotics for the low-frequency ordinates of the periodogram of a long-memory time series. J Time Ser Anal 14(5):455–472

  • Jones RH (1985) Time series analysis with unequally spaced data. Handb Stat 5:157–177

  • Leonenko N, Olenko A (2013) Tauberian and Abelian theorems for long-range dependent random fields. Methodol Comput Appl Probab 15(4):715–742

  • Li Z (2014) Methods for irregularly sampled continuous time processes. PhD thesis, University College London

  • Philippe A, Viano M-C (2010) Random sampling of long-memory stationary processes. J Stat Plan Inference 140(5):1110–1124

  • Philippe A, Robet C, Viano M-C (2021) Random discretization of stationary continuous time processes. Metrika 84(3):375–400

  • Pólya G (1949) Remarks on characteristic functions. In: Proceedings of the Berkeley symposium on mathematical statistics and probability. pp 115–123

  • Thiebaut C, Roques S (2005) Time-scale and time-frequency analyses of irregularly sampled astronomical time series. EURASIP J Adv Signal Process 2005(15):852587

  • Tsai H, Chan KS (2005a) Maximum likelihood estimation of linear continuous time long memory processes with discrete time data. J R Stat Soc Ser B Stat Methodol 67(5):703–716

  • Tsai H, Chan KS (2005b) Quasi-maximum likelihood estimation for a class of continuous-time long-memory processes. J Time Ser Anal 26(5):691–713

Author information


Correspondence to Anne Philippe.


Appendix: Proof of Lemma 2

Proof

The proof is essentially based on Corollary 2 and a well-known cumulant formula.

Without loss of generality, we can assume that the Poisson rate is 1. The process Y is fourth-order stationary, as the conditional joint distribution of \((Y_k,Y_{k+h},Y_{k+r},Y_{k+s})\) given \((T_1,\dots ,T_{k+\max (h,r,s)})\) is multivariate normal with variance–covariance matrix \(M(T_k,T_{k+h},T_{k+r},T_{k+s})\) given by

$$\begin{aligned} M(T_k,T_{k+h},T_{k+r},T_{k+s}) := \left( \begin{matrix} \sigma _X(0) & \sigma _X(T_{k+h}-T_k) & \sigma _X(T_{k+r}-T_k) & \sigma _X(T_{k+s}-T_k) \\ \sigma _X(T_{k+h}-T_k) & \sigma _X(0) & \sigma _X(T_{k+r}-T_{k+h}) & \sigma _X(T_{k+s}-T_{k+h}) \\ \sigma _X(T_{k+r}-T_k) & \sigma _X(T_{k+r}-T_{k+h}) & \sigma _X(0) & \sigma _X(T_{k+s}-T_{k+r}) \\ \sigma _X(T_{k+s}-T_k) & \sigma _X(T_{k+s}-T_{k+h}) & \sigma _X(T_{k+s}-T_{k+r}) & \sigma _X(0) \end{matrix} \right) \end{aligned}$$
(41)

which does not depend on \(k\). Hence it suffices to establish the lemma for \(k=0\). We apply the law of total cumulance (Brillinger 1969), which, for the sake of clarity, we recall here: for all random vectors \(Z=(Z_1,\ldots ,Z_n)'\) and \(W\), we have

$$\begin{aligned} \textrm{cum}(Z)=\sum _{\pi }\textrm{cum}\left[ \textrm{cum}(Z_{\pi _1}\vert W),\ldots ,\textrm{cum}(Z_{\pi _b}\vert W)\right] \end{aligned}$$
(42)

where \(Z_{\pi _j}=(Z_i,\; i\in \pi _j)\), and \(\pi _1,\ldots ,\pi _b\) (\(b=1,\ldots ,n\)) are the blocks of the partition \(\pi \), and the sum runs over all partitions \(\pi \) of the set \(\{1,2,\ldots ,n\}\).
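
For instance, for \(n=2\) the partitions of \(\{1,2\}\) are \(\{\{1\},\{2\}\}\) and \(\{\{1,2\}\}\), and (42) reduces to the familiar law of total covariance:

$$\begin{aligned} \textrm{Cov}(Z_1,Z_2)=\textrm{Cov}\left[ \mathbb {E}(Z_1\vert W),\mathbb {E}(Z_2\vert W)\right] +\mathbb {E}\left[ \textrm{Cov}(Z_1,Z_2\vert W)\right] . \end{aligned}$$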

But conditionally on \(T\), the process \(Y\) is jointly zero-mean Gaussian, and therefore \(\mathbb {E}(Y_t\vert T)=0\) as well as \(\textrm{cum}(Y_i,Y_j,Y_k,Y_\ell \vert T)=\textrm{cum}(Y_i,Y_j,Y_k\vert T)=0\) for all \(i,j,k,\ell \). Hence, applying (42) to \(Y\) with \(W=T\), only the pair partitions of \(\{0,h,r,s\}\) survive, and since \(\textrm{cum}(U,V)=\textrm{Cov}(U,V)\), we get from (41)

$$\begin{aligned} \textrm{cum}(Y_0,Y_{h},Y_{r},Y_{s})=\,&\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))+\textrm{Cov}(\sigma _X(T_r),\sigma _X(T_h-T_s))\nonumber \\&+\textrm{Cov}(\sigma _X(T_s),\sigma _X(T_r-T_h)). \end{aligned}$$
(43)
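
The identity (43) can also be checked numerically. The following sketch (reusing the hypothetical \(\sigma _X\) and the NumPy generator from the snippet above; the function name and all parameters are illustrative, not from the paper) evaluates the right-hand side of (43) by Monte Carlo for unit-rate Poisson sampling:

    def cum4_rhs(h, r, s, n_mc=100_000):
        # Monte Carlo evaluation of the right-hand side of (43) for h, r, s >= 1,
        # using unit-rate Poisson renewal times T_1 < T_2 < ...
        E = rng.exponential(size=(n_mc, max(h, r, s)))
        T = np.cumsum(E, axis=1)
        Th, Tr, Ts = T[:, h - 1], T[:, r - 1], T[:, s - 1]
        cov = lambda a, b: np.cov(a, b)[0, 1]
        return (cov(sigma_X(Th), sigma_X(Tr - Ts))
                + cov(sigma_X(Tr), sigma_X(Th - Ts))
                + cov(sigma_X(Ts), sigma_X(Tr - Th)))

An empirical fourth cumulant of \((Y_0,Y_h,Y_r,Y_s)\), estimated from repeated simulations of the sampled process, can then be compared with this value.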

Note that for \( h< \min (r,s)\), the variables \(T_h\) and \(T_r-T_s\) involve disjoint sets of renewal increments and are therefore independent, so \(\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))=0\). Moreover

$$\begin{aligned}&\sum _{1\le r \le h \le s \le n} |\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))| \\&\quad \le \sum _{1\le r \le h \le s \le n} \textrm{Var}(\sigma _X(T_h))^{1/2}\,\textrm{Var}(\sigma _X(T_s-T_r))^{1/2} \\&\quad \le \sum _{1\le r \le h \le s \le n} h^{-\alpha /2}(1+s-r)^{-\alpha /2} \\&\quad \le h^{-\alpha /2} \sum _{1\le r \le h}\sum _{t=1}^n t^{-\alpha /2} \\&\quad \le h^{1-\alpha /2} {\left\{ \begin{array}{ll} n^{1-\alpha /2} = n^{2d-1/2} & \text {if } d<1/4 \\ \log (n) & \text {if } d\ge 1/4 \end{array}\right. } \\&\quad \le {\left\{ \begin{array}{ll} n^{4d-1} & \text {if } d<1/4 \\ \log (n) & \text {if } d\ge 1/4 \end{array}\right. } \\&\quad \le Cn^{2d} \quad \text {for all } 0<d<1/2. \end{aligned}$$

The last configuration is

$$\begin{aligned} \sum _{r,s=1}^h |\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))|&\le \sum _{r,s=1}^h \textrm{Var}(\sigma _X(T_h))^{1/2}\,\textrm{Var}(\sigma _X(T_s-T_r))^{1/2} \\&\le h^{-\alpha /2} \sum _{t=1}^h (h-t)\,t^{-\alpha /2} \\&\le h^{-\alpha /2} {\left\{ \begin{array}{ll} C h^{1-\alpha /2} = Ch^{2d-1/2} & \text {if } d<1/4 \\ \log (h) & \text {if } d\ge 1/4 \end{array}\right. } \\&\le Cn^{2d} \quad \text {for all } 0<d<1/2. \end{aligned}$$

Therefore uniformly in h we have

$$\begin{aligned} \sum _{r,s=1}^n | \textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))| \le Cn^{2d}. \end{aligned}$$

For the remaining two terms on the right-hand side of (43) we have, for fixed h,

$$\begin{aligned} \sum _{r,s=1}^n |\textrm{Cov}(\sigma _X(T_r),\sigma _X(T_h-T_s))|&= \sum _{r,s=1}^n |\textrm{Cov}(\sigma _X(T_s),\sigma _X(T_h-T_r))| \\&\le \sum _{r,s=1}^n \textrm{Var}(\sigma _X(T_s))^{1/2}\,\textrm{Var}(\sigma _X(T_h-T_r))^{1/2} \\&\le \sum _{r,s=1}^n s^{-\alpha /2}(1+|h-r|)^{-\alpha /2} \\&\le C {\left\{ \begin{array}{ll} n^{2-\alpha } = n^{4d-1} & \text {if } d<1/4 \\ \log (n)^2 & \text {if } d\ge 1/4 \end{array}\right. } \\&\le Cn^{2d} \quad \text {for all } 0<d<1/2. \end{aligned}$$

This concludes the proof of (37).

Let us now prove (38). Note that

$$\begin{aligned} \sum _{h,r,s=0}^n \textrm{cum}(Y_0,Y_{h},Y_{r},Y_{s})&= 3 \sum _{h,r,s=1}^n \textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s)) \nonumber \\&= 6 \sum _{h=1}^n \sum _{1\le r<s\le n} \textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s)). \end{aligned}$$
(44)

Moreover, we have

$$\begin{aligned} \sum _{h,r,s=1}^n |\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))|&\le C \sum _{h,r,s=1}^n h^{-\alpha /2}(1+|r-s|)^{-\alpha /2} \\&\le C \sum _{h=1}^n h^{-\alpha /2} \sum _{t=1}^n (n-t)\,t^{-\alpha /2} \\&\le C {\left\{ \begin{array}{ll} n\,n^{2-\alpha } = n^{4d} & \text {if } d<1/4 \\ n \log (n)^2 & \text {if } d>1/4 \end{array}\right. } \\&\le Cn^{4d}. \end{aligned}$$

In the particular case \(d=1/4\) (where we still have \(\alpha =2\)), a supplementary \(\log (n)\) factor is needed in the bound. Indeed, we split the sum on the right-hand side of (44) into three configurations. When \(1 \le h \le r< s \le n\), the covariance \(\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))\) is zero. When the sum is over \(1 \le r< h \le s \le n\), we get

$$\begin{aligned} \sum _{1 \le r< h \le s \le n} |\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s)) |&\le C \sum _{s=1}^n \sum _{h=1}^s h^{-1 } \sum _{r=1}^{h-1} (s-h + h-r)^{-1} \\&\sim C \sum _{s=1}^n \sum _{h=1}^s h^{-1 } \left( \log (s) - \log (s-h) \right) \\&= -C \sum _{s=1}^n \sum _{h=1}^s (h/s)^{-1 } \log (1-h/s) (1/s) \\&\sim -C \sum _{s=1}^n\left( \int _{0}^1 \frac{ \log (1-x)}{x} \, dx\right) = C\frac{\pi ^2}{6} n. \end{aligned}$$
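
The value of the limiting integral follows from expanding the logarithm into its power series:

$$\begin{aligned} -\int _0^1 \frac{\log (1-x)}{x}\,dx=\sum _{k=1}^\infty \frac{1}{k}\int _0^1 x^{k-1}\,dx=\sum _{k=1}^\infty \frac{1}{k^2}=\frac{\pi ^2}{6}. \end{aligned}$$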

For the last sum over \(1 \le r < s \le h \le n\) (where we will need the \(\log \) term) we have

$$\begin{aligned} \sum _{1 \le r<s \le h \le n} |\textrm{Cov}(\sigma _X(T_h),\sigma _X(T_r-T_s))|&\le C \sum _{h=1}^n h^{-1} \sum _{s=1}^h \sum _{r=1}^{s-1} (s-r)^{-1} \\&= C \sum _{h=1}^n h^{-1} \sum _{t=1}^h (h-t)\,t^{-1} \\&= C \sum _{h=1}^n \sum _{t=1}^h (1-t/h)\,t^{-1} \\&\sim C \sum _{h=1}^n (\log (h)-1) \sim C n \log (n). \end{aligned}$$

This completes the proof of (38) in Lemma 2. \(\square \)

Cite this article

Ould Haye, M., Philippe, A. & Robet, C. Inference for continuous-time long memory randomly sampled processes. Stat Papers (2023). https://doi.org/10.1007/s00362-023-01515-z
