
Subsampling for Non-stationary Time Series with Long Memory and Heavy Tails Using Weak Dependence Condition

  • Conference paper
Cyclostationarity: Theory and Methods III

Part of the book series: Applied Condition Monitoring ((ACM,volume 6))


Abstract

Statistical inference for unknown distributions of statistics or estimators may be based on asymptotic distributions. Unfortunately, in the case of dependent data such procedures are often ineffective. The last three decades have seen intensive development of the so-called resampling methods. Using these methods, it is possible to approximate the unknown distributions of statistics and estimators directly. A problem that needs to be solved in the study of resampling procedures is their consistency. Consistency for independent or stationary observations has been studied extensively. Resampling for time series with a specific non-stationarity, i.e. a periodic or almost periodic strong mixing dependence structure, has also been the subject of research. Recent research on resampling methods focuses mainly on time series with the weak dependence structure defined by Doukhan, Louhichi et al. and, simultaneously, by Bickel and Bühlmann. In this article a time series model with specific features, i.e. long memory, heavy tails (with at least a fourth moment, e.g. GED or Student's t), weak dependence, and a periodic structure, is presented, and the estimator of the mean function of this time series is investigated. The necessary central limit theorems and consistency theorems for the mean function estimator under one of the resampling techniques, subsampling, are proven.


References

  1. Andrews, D. W. K. (1984). Non-strong mixing autoregressive processes. Journal of Applied Probability, 21, 930–934.

  2. Ango-Nze, P., Dupoiron, S., & Rios, R. (2003). Subsampling under weak dependence condition. Technical Report DT2003-42, CREST.

  3. Athreya, K. B. (1987). Bootstrap of the mean in the infinite variance case. The Annals of Statistics, 15, 724–731.

  4. Bardet, J.-M., Doukhan, P., Lang, G., & Ragache, N. (2008). Dependent Lindeberg central limit theorem and some applications. ESAIM: Probability and Statistics, 12, 154–172.

  5. Beran, J., Feng, T., Ghosh, S., & Kulik, R. (2013). Long-memory processes: Probabilistic properties and statistical methods. New York: Springer.

  6. Bickel, P., & Bühlmann, P. (1999). A new mixing notion and functional central limit theorems for a sieve bootstrap in time series. Bernoulli, 5(3), 413–446.

  7. Brockwell, P., & Davis, R. (1991). Time series: Theory and methods. New York: Springer.

  8. Dedecker, J., Doukhan, P., Lang, G., León, J. R., Louhichi, S., & Prieur, C. (2008). Weak dependence: With examples and applications. Lecture notes in statistics 190. Springer.

  9. Doukhan, P., & Louhichi, S. (1999). A new weak dependence condition and applications to moment inequalities. Stochastic Processes and their Applications, 84, 313–342.

  10. Doukhan, P., Prohl, S., & Robert, C. Y. (2011). Subsampling weakly dependent time series and application to extremes. TEST, 20(3), 487–490.

  11. Doukhan, P. (1994). Mixing: Properties and examples. Lecture notes in statistics 85. Springer.

  12. Doukhan, P., & Louhichi, S. (1999). A new weak dependence condition and applications to moment inequalities. Stochastic Processes and their Applications, 84, 313–342.

  13. Dudek, A. E., Leśkow, J., Politis, D., & Paparoditis, E. (2014). A generalized block bootstrap for seasonal time series. Journal of Time Series Analysis, 35, 89–114.

  14. Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7, 1–26.

  15. Ferrara, L., & Guégan, D. (2001). Forecasting with k-factor Gegenbauer processes: Theory and applications. Journal of Forecasting, 20(8), 581–601.

  16. Gajecka-Mirek, E. (2014). Subsampling for weakly dependent and periodically correlated sequences. In Cyclostationarity: Theory and methods. Lecture notes in mechanical engineering. Springer.

  17. Gardner, W., Napolitano, A., & Paura, L. (2006). Cyclostationarity: Half a century of research. Signal Processing, 86.

  18. Gladyshev, E. G. (1961). Periodically correlated random sequences. Soviet Mathematics, 2.

  19. Gray, H. L., Zhang, N.-F., & Woodward, W. A. (1989). On generalized fractional processes. Journal of Time Series Analysis, 10, 233–257.

  20. Gray, H. L., Zhang, N.-F., & Woodward, W. A. (1994). On generalized fractional processes: A correction. Journal of Time Series Analysis, 15(5), 561–562.

  21. Guégan, D., & Ladoucette, S. (2001). Non-mixing properties of long memory sequences. Comptes Rendus de l'Académie des Sciences Paris, 333, 373–376.

  22. Hall, P., Jing, B.-Y., & Lahiri, S. (1998). On the sampling window method for long-range dependent data. Statistica Sinica, 8, 1189–1204.

  23. Hosking, J. R. M. (1981). Fractional differencing. Biometrika, 68, 165–176.

  24. Hui, Y. V., & Li, W. K. (1995). On fractionally differenced periodic processes. The Indian Journal of Statistics, 57, Series B, Pt. 1, 19–31.

  25. Hurd, H. L., & Miamee, A. G. (2007). Periodically correlated random sequences: Spectral theory and practice. Wiley.

  26. Hurd, H., & Leśkow, J. (1992). Strongly consistent and asymptotically normal estimation of the covariance for almost periodically correlated processes. Statistics and Decisions, 10, 201–225.

  27. Jach, A., McElroy, T., & Politis, D. N. (2012). Subsampling inference for the mean of heavy-tailed long memory time series. Journal of Time Series Analysis, 33(1), 96–111.

  28. Lahiri, S. (1993). On the moving block bootstrap under long-range dependence. Statistics and Probability Letters, 18, 405–413.

  29. Leśkow, J., & Synowiecki, R. (2010). On bootstrapping periodic random arrays with increasing period. Metrika, 71, 253–279.

  30. Lunetta, G. (1963). Di una generalizzazione dello schema della curva normale [On a generalization of the normal curve scheme]. Annali della Facoltà di Economia e Commercio di Palermo, 17, 237–244.

  31. Philippe, A., Surgailis, D., & Viano, M. C. (2006). Almost periodically correlated processes with long-memory. Lecture notes in statistics 187. Springer.

  32. Politis, D. N., Romano, J. P., & Wolf, M. (1999). Subsampling. New York: Springer.

  33. Shao, J., & Tu, D. (1995). The jackknife and bootstrap. New York: Springer.

  34. Subbotin, M. T. (1923). On the law of frequency of error. Matematicheskii Sbornik, 31(2), 296–301.

  35. Synowiecki, R. (2007). Consistency and application of moving block bootstrap for non-stationary time series with periodic and almost periodic structure. Bernoulli, 13, 1151–1178.

  36. Varanasi, M. K., & Aazhang, B. (1989). Parametric generalized Gaussian density estimation. Journal of the Acoustical Society of America, 86(4), 1404–1415.



Corresponding author

Correspondence to Elżbieta Gajecka-Mirek.


Appendix


This appendix presents the proofs of some of the results.

Proof

Lemma 5

From direct calculations we obtain the exact formula for the kurtosis of the model \(X_t,\) which is:

$$ \frac{E(X_t-EX_t)^4}{(E(X_t-EX_t)^2)^2}=\frac{E\sigma _t^4 EG_t^4}{(E\sigma _t^2)^2 (EG_t^2)^2}=3\frac{\varGamma ({5/\alpha })\varGamma ({1/\alpha })}{\varGamma ^2({3/\alpha })}. $$

Applying Stirling's formula for the \(\varGamma \) function, we obtain the following approximation:

$$ kurtosis \approx 3\cdot 1.4\cdot 4.3^{1/\alpha }. $$

The kurtosis is more than 3 for all \(\alpha > 0.\)    \(\square \)
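The exact kurtosis formula and its Stirling-based approximation above can be checked numerically (a sketch, not part of the paper; the function names are ours, and only the two formulas from the proof are used):

```python
from math import gamma

def kurtosis_exact(alpha):
    # Exact formula from the proof: 3 * Gamma(5/a) * Gamma(1/a) / Gamma(3/a)^2
    return 3 * gamma(5 / alpha) * gamma(1 / alpha) / gamma(3 / alpha) ** 2

def kurtosis_stirling(alpha):
    # Stirling-based approximation from the proof: 3 * 1.4 * 4.3**(1/alpha)
    return 3 * 1.4 * 4.3 ** (1 / alpha)

for a in (0.75, 1.0, 1.5, 2.0):
    print(f"alpha={a}: exact={kurtosis_exact(a):.3f}, "
          f"approx={kurtosis_stirling(a):.3f}")
    assert kurtosis_exact(a) > 3  # heavier tails than the normal distribution
```

For instance, at \(\alpha = 1\) the exact value is \(3\cdot \varGamma (5)\varGamma (1)/\varGamma ^2(3) = 3\cdot 24/4 = 18,\) while the approximation gives \(3\cdot 1.4\cdot 4.3 \approx 18.06.\)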

Proof

Lemma 6

The mean of \(X_{t}\) is \(\eta _t,\) which is periodic. The variance is periodic as well:

$$ \gamma (t+T,0)= (f_{t+T})^{2}(1+\varphi ^2)\gamma _{G}(t+T,0) =(f_{t+T})^{2}(1+\varphi ^2)\gamma _{G}(0)=(f_{t})^{2}(1+\varphi ^2)\gamma _{G}(t,0)=\gamma (t,0). $$

The autocovariance is periodic as well:

$$ \gamma (t+T,h)= |f_{t+T}f_{t+T+h}|\varphi ^2\gamma _{G}(t+T,h) =|f_{t+T}f_{t+T+h}|\varphi ^2\gamma _{G}(h)=|f_{t}f_{t+h}|\varphi ^2\gamma _{G}(t,h)=\gamma (t,h). $$

The form of the variance and autocovariance follows from the form of the variance of a variable with the GED distribution.    \(\square \)
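The periodicity argument above can be illustrated numerically. This is a toy sketch: the choices of \(f_t\), \(\varphi \), and \(\gamma _{G}\) below are illustrative stand-ins, not the paper's model; only the structure \(\gamma (t,h)=|f_t f_{t+h}|\varphi ^2\gamma _{G}(h)\) with a \(T\)-periodic \(f_t\) and a stationary \(\gamma _{G}\) is taken from the proof.

```python
T = 4        # period (illustrative)
phi = 0.6    # illustrative coefficient

def f(t):
    # Any T-periodic scaling sequence works for the illustration.
    return 1.0 + 0.5 * (t % T)

def gamma_G(h):
    # Stand-in stationary autocovariance, depending only on the lag h.
    return 0.9 ** abs(h)

def gamma(t, h):
    # Autocovariance structure from Lemma 6.
    return abs(f(t) * f(t + h)) * phi ** 2 * gamma_G(h)

# gamma(t + T, h) == gamma(t, h): periodic in t for every lag h.
for t in range(T):
    for h in range(-3, 4):
        assert abs(gamma(t + T, h) - gamma(t, h)) < 1e-12
```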

Proof

Theorem 5

Let us consider a sequence of statistics \(B_{k_N}(s),\) for fixed \(s=1,2,\ldots ,T\) and \(k_N\) as in Theorem 2.

\(L_{k_N}(s)(x)=P(B_{k_N}(s)\le x)\) is the cumulative distribution function of \(B_{k_N}(s)\).

From the assumptions,

$$ \sup _{x\in \mathbb {R}}|L_{k_N}(s)(x)-L(s)(x)|\longrightarrow 0,\ \ k_N\rightarrow \infty . $$

For overlapping samples, the subsamples are

\(Y_{b,q}(s)=(X_{s+qT},X_{s+(q+1)T},\ldots ,X_{s+(q+b-1)T}),\) \(q=0,1,\ldots ,k_N-b,\) and the corresponding subsampling statistics are

\(B_{k_N,b,q}(s) = \sqrt{b}(\hat{\eta }_{k_N,b,q}(s)-\hat{\eta }_{k_N}(s)).\) The number of subsampling statistics is \(k_N-b+1\).

The above statistics are used to approximate the distribution \(L_{k_N}(s)(x)\) by the empirical distribution function: \(L_{k_N,b,q}(s)(x)=\frac{1}{k_N-b+1}\sum _{q=0}^{k_N-b}\mathbb {I}_{\{B_{k_N,b,q}(s)\le x\}}\).
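The construction of the overlapping subsamples, the statistics \(B_{k_N,b,q}(s),\) and the empirical distribution function can be sketched as follows. This is a minimal illustration: the simulated series is a generic heavy-tailed stand-in, not the paper's model, and all names are ours.

```python
import math
import random

random.seed(0)
T, k_N, b = 4, 200, 25     # period, number of cycles, block length (illustrative)
s = 1                      # fixed season (0-based indexing here)

# Stand-in heavy-tailed series, only to exercise the construction.
X = [random.gauss(0, 1) / max(abs(random.gauss(0, 1)), 0.2)
     for _ in range(T * k_N)]

# Season-s subsequence X_{s+qT}, q = 0, ..., k_N - 1.
Xs = [X[s + q * T] for q in range(k_N)]
eta_hat = sum(Xs) / k_N    # full-sample seasonal mean estimate

# Overlapping subsampling statistics B_{k_N,b,q}(s)
# = sqrt(b) * (subsample mean - full-sample mean), q = 0, ..., k_N - b.
B = [math.sqrt(b) * (sum(Xs[q:q + b]) / b - eta_hat)
     for q in range(k_N - b + 1)]
assert len(B) == k_N - b + 1   # exactly k_N - b + 1 statistics

def L(x):
    # Empirical distribution function of the subsampling statistics.
    return sum(1 for v in B if v <= x) / len(B)
```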

Let us define the smoothed subsampling distribution:

$$ U_{k_N,b,q}(s)(x)=\frac{1}{k_N-b+1}\sum _{q=0}^{k_N-b}\varphi \left( \frac{\sqrt{b}(\hat{\eta }_{k_N,b,q}(s)-\eta _{k_N}(s))}{\varepsilon _n}\right) . $$

The sequence \(\varepsilon _n\) decreases to zero, and \(\varphi \) is a non-increasing continuous function with \(\varphi (x) = 1\) for \(x \le 0,\) \(\varphi (x) = 0\) for \(x \ge 1,\) and affine between 0 and 1.
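A minimal sketch of such a smoothing function (the name `phi` is ours; the affine ramp \(1-x\) on \((0,1)\) is the natural choice satisfying the stated conditions):

```python
def phi(x):
    # Continuous, non-increasing: 1 for x <= 0, 0 for x >= 1, affine between.
    return min(1.0, max(0.0, 1.0 - x))

assert phi(-2.0) == 1.0 and phi(0.0) == 1.0   # constant 1 on x <= 0
assert phi(1.0) == 0.0 and phi(2.0) == 0.0    # constant 0 on x >= 1
assert phi(0.25) == 0.75                      # affine ramp on (0, 1)
```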

From [32] it is known that

$$ \forall x \in \mathbb {R}\ |L_{k_N,b,q}(s)(x)-U_{k_N,b,q}(s)(x)|\mathop {\longrightarrow }\limits ^{p}0. $$

By Theorem 3.2.1 in [32], it therefore suffices to investigate the variance of \(U_{k_N,b_s,p}(s),\ \ s=1,\ldots ,T\):

$$ VarU_{k_N,b_s,p}(s)(x)=(k_N-b_s+1)^{-2}(\sum _{|h|<k_N-b_s+1}(k_N-b_s+1-|h|)\gamma (h)) $$

where \(\gamma (h)=Cov(\varphi (B_{k_N,b,p}(s)/\varepsilon _n),\varphi (B_{k_N,b,p+h}(s)/\varepsilon _n)).\)

From the \(\lambda \)-weak dependence assumption, under conditions A1 through A4, and by Lemma 3.1 in [2],

$$ Cov(\varphi (B_{k_N,b,p}(s)/\varepsilon _n),\varphi (B_{k_N,b,p+h}(s)/\varepsilon _n))\le \sqrt{b_s} \lambda _{h-b+1}/\varepsilon _n . $$

This implies that \(Var(U_{k_N,b_s,p}(s)(x))\) tends to zero, which proves point 1 of Theorem 5.

Given that point 1 holds and under the assumptions of model (6), the proofs of points 2 and 3 are the same as the proof of point 3 in Theorem 3.2.1 of [32].    \(\square \)

Proof

Theorem 6

For any vector of constants \(c \in \mathbb {R}^T\) we have, in the GED case, the following equation for the subsampling versions of the characteristic functions of the distributions:

$$ \phi ^*_{B_{k_N,b,q}}(c) = \phi ^*_{c^T B_{k_N,b,q}} (1). $$

Let \(Z_{s+pT} = c_sX_{s+pT},\) where \(p = 0, \ldots ,k_N - 1\) and \(s = 1, \ldots , T.\) The series \(\{Z_t\}\) fulfills the assumptions of Theorem 6, which means that subsampling is consistent for the mean \((\eta _N)_Z.\) By Theorem A in Athreya [3] we have:

In the GED case,

$$ \phi ^*_{c^T B_{k_N,b,q}} (1)\mathop {\rightarrow }\limits ^{p} \phi _{k_N(\eta ,c^T\varSigma c)} (1)=\phi _{k_N(\eta ,\varSigma )} (c). $$

Moreover

$$ P^* (B_{k_N,b,q} \le x)\mathop {\rightarrow }\limits ^{p} F_{k_N(\eta ,\varSigma )}(x), $$

for any \(x \in \mathbb {R}^T,\) where \(F_{k_N(\eta ,\varSigma )}(x)\) is the cumulative distribution function of \(k_N (\eta , \varSigma ).\)

The second point of the thesis of Theorem 6 then follows from Pólya's theorem ([33], p. 447).    \(\square \)


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Gajecka-Mirek, E., Knapik, O. (2017). Subsampling for Non-stationary Time Series with Long Memory and Heavy Tails Using Weak Dependence Condition. In: Chaari, F., Leskow, J., Napolitano, A., Zimroz, R., Wylomanska, A. (eds) Cyclostationarity: Theory and Methods III. Applied Condition Monitoring, vol 6. Springer, Cham. https://doi.org/10.1007/978-3-319-51445-1_2


  • Print ISBN: 978-3-319-51444-4

  • Online ISBN: 978-3-319-51445-1

