Abstract
Statistical inference for unknown distributions of statistics or estimators may be based on asymptotic distributions. Unfortunately, in the case of dependent data such procedures are often ineffective. Over the last three decades, an intensive development of so-called resampling methods can be observed. Using these methods, it is possible to approximate the unknown distributions of statistics and estimators directly. A problem that needs to be solved in the study of resampling procedures is their consistency. Consistency for independent or stationary observations has been extensively studied. Resampling for time series with a specific non-stationarity, i.e., a periodic or almost periodic strong mixing dependence structure, has also been the subject of research. Recent research on resampling methods focuses mainly on time series with a weak dependence structure, defined by Paul Doukhan, Louhichi et al. and simultaneously by Bickel and Bühlmann. In this article a time series model with specific features, i.e., long memory, heavy tails (with at least a fourth moment, e.g., GED or Student's t), weak dependence and a periodic structure, is presented, and the estimator of the mean function of this time series is investigated. The necessary central limit theorems and consistency theorems for the mean function estimator, for one of the resampling techniques, namely subsampling, are proven.
References
Andrews, D. W. K. (1984). Non-strong mixing autoregressive processes. Journal of Applied Probability, 21, 930–934.
Ango-Nze, P., Dupoiron, S., & Rios, R. (2003). Subsampling under weak dependence condition. Technical Report DT2003-42. CREST.
Athreya, K. B. (1987). Bootstrap of the mean in the infinite variance case. The Annals of Statistics, 15, 724–731.
Bardet, J.-M., Doukhan, P., Lang, G., & Ragache, N. (2008). Dependent Lindeberg central limit theorem and some applications. ESAIM: Probability and Statistics, 12, 154–172.
Beran, J., Feng, T., Ghosh, S., & Kulik, R. (2013). Long-memory processes. Probabilistic properties and statistical methods. New York: Springer.
Bickel, P., & Bühlmann, P. (1999). A new mixing notion and functional central limit theorems for a sieve bootstrap in time series. Bernoulli, 5(3), 413–446.
Brockwell, P., & Davis, R. (1991). Time series: Theory and methods. New York: Springer.
Dedecker, J., Doukhan, P., Lang, G., León, J. R., Louhichi, S., & Prieur, C. (2008). Weak dependence: With examples and applications. Lecture notes in statistics 190. Springer.
Doukhan, P., & Louhichi, S. (1999). A new weak dependence condition and applications to moment inequalities. Stochastic Processes and their Applications, 84, 313–342.
Doukhan, P., Prohl, S., & Robert, C. Y. (2011). Subsampling weakly dependent time series and application to extremes. TEST, 20(3), 487–490.
Doukhan, P. (1994). Mixing: Properties and examples. Lecture notes in statistics 85. Springer.
Doukhan, P., & Louhichi, S. (1999). A new weak dependence condition and applications to moment inequalities. Stochastic Processes and their Applications, 84, 313–342.
Dudek, A. E., Leśkow, J., Politis, D., & Paparoditis, E. (2014). A generalized block bootstrap for seasonal time series. Journal of Time Series Analysis, 35, 89–114.
Efron, B. (1979). Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7, 1–26.
Ferrara, L., & Guégan, D. (2001). Forecasting with k-factor Gegenbauer processes: theory and applications. Journal of Forecasting, 20(8), 581–601.
Gajecka-Mirek, E. (2014). Subsampling for weakly dependent and periodically correlated sequences. In Cyclostationarity: Theory and methods. Lecture notes in mechanical engineering. Springer.
Gardner, W., Napolitano, A., & Paura, L. (2006). Cyclostationarity: Half a century of research. Signal Processing, 86.
Gladyshev, E. G. (1961). Periodically correlated random sequences. Soviet Mathematics, 2.
Gray, H. L., Zhang, N.-F., & Woodward, W. A. (1989). On generalized fractional processes. Journal of Time Series Analysis, 10, 233–257.
Gray, H. L., Zhang, N.-F., & Woodward, W. A. (1994). On generalized fractional processes: A correction. Journal of Time Series Analysis, 15(5), 561–562.
Guégan, D., & Ladoucette, S. (2001). Non-mixing properties of long memory sequences. Comptes Rendus de l'Académie des Sciences Paris, 333, 373–376.
Hall, P., Jing, B.-Y., & Lahiri, S. (1998). On the sampling window method for long-range dependent data. Statistica Sinica, 8, 1189–1204.
Hosking, J. R. M. (1981). Fractional differencing. Biometrika, 68, 165–176.
Hui, Y. V., & Li, W. K. (1995). On fractionally differenced periodic processes. Sankhyā: The Indian Journal of Statistics, Series B, 57(1), 19–31.
Hurd, H. L., & Miamee, A. G. (2007). Periodically correlated random sequences: Spectral theory and practice. Wiley.
Hurd, H., & Leśkow, J. (1992). Strongly consistent and asymptotically normal estimation of the covariance for almost periodically correlated processes. Statistics and Decisions, 10, 201–225.
Jach, A., McElroy, T., & Politis, D. N. (2012). Subsampling inference for the mean of heavy-tailed long memory time series. Journal of Time Series Analysis, 33(1), 96–111.
Lahiri, S. (1993). On the moving block bootstrap under long-range dependence. Statistics and Probability Letters, 18, 405–413.
Leśkow, J., & Synowiecki, R. (2010). On bootstrapping periodic random arrays with increasing period. Metrika, 71, 253–279.
Lunetta, G. (1963). Di una generalizzazione dello schema della curva normale. Annali della Facoltá di Economia e Commercio di Palermo, 17, 237–244.
Philippe, A., Surgailis, D., & Viano, M. C. (2006). Almost periodically correlated processes with long-memory. Lecture notes in statistics 187. Springer.
Politis, D. N., Romano, J. P., & Wolf, M. (1999). Subsampling. New York: Springer.
Shao, J., & Tu, D. (1995). The jackknife and bootstrap. New York: Springer.
Subbotin, M. T. (1923). On the law of frequency of error. Matematicheskii Sbornik, 31(2), 296–301.
Synowiecki, R. (2007). Consistency and application of moving block bootstrap for non-stationary time series with periodic and almost periodic structure. Bernoulli, 13, 1151–1178.
Varanasi, M. K., & Aazhang, B. (1989). Parametric generalized Gaussian density estimation. Journal of the Acoustical Society of America, 86(4), 1404–1415.
Appendix
This appendix presents the proofs of some of the results.
Proof (Lemma 5)
Direct calculation yields an exact formula for the kurtosis of the model \(X_t\):
Applying Stirling's formula for the \(\varGamma \) function, we obtain the following approximation:
The kurtosis exceeds 3 for all \(\alpha > 0.\) \(\square \)
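As a numerical illustration of the heavy-tails feature only (not the model's exact kurtosis formula, whose display is omitted above), the kurtosis of a plain GED variable with shape parameter \(\beta\) can be checked directly from gamma functions; a minimal sketch, assuming the standard GED parametrization:

```python
# Non-excess kurtosis of a generalized error distribution (GED) variable
# with shape parameter beta, via the standard identity
#   kurt = Gamma(5/beta) * Gamma(1/beta) / Gamma(3/beta)**2.
# Illustration only; this is not the kurtosis formula of the model X_t.
from math import gamma

def ged_kurtosis(beta: float) -> float:
    """Kurtosis of a GED variable with shape parameter beta."""
    return gamma(5.0 / beta) * gamma(1.0 / beta) / gamma(3.0 / beta) ** 2

# beta = 1 is the Laplace case (kurtosis 6, heavier-tailed than Gaussian);
# beta = 2 recovers the Gaussian kurtosis of 3.
print(ged_kurtosis(1.0))  # 6.0
print(ged_kurtosis(2.0))  # 3.0 (up to floating point)
```

For \(\beta < 2\) the GED is leptokurtic, which is the regime relevant to the heavy-tailed model considered here.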
Proof (Lemma 6)
The mean of \(X_{t}\) is \(\eta _t,\) and hence periodic. The variance is also periodic:
The autocovariance is periodic as well:
The form of the variance and the autocovariance follows from the form of the variance of a variable with the GED distribution. \(\square \)
Proof (Theorem 5)
Let us consider the sequence of statistics \(B_{k_N}(s),\) for fixed \(s=1,2,\ldots,T\) and \(k_N\) as in Theorem 2.
\(L_{k_N}(s)(x)=P(B_{k_N}(s)\le x)\) is the cumulative distribution function of \(B_{k_N}(s)\).
From the assumptions
For overlapping samples, the subsamples are \(Y_{b,q}(s)=(X_{s+qT},X_{s+(q+1)T},\ldots,X_{s+(q+b-1)T}),\) \(q=0,1,\ldots,k_N-b,\) and the number of subsampling statistics \(B_{k_N,b,q}(s) = \sqrt{b}(\hat{\eta }_{k_N,b,q}(s)-\hat{\eta }_{k_N}(s))\) is \(k_N-b+1\).
The above statistics are used to approximate the distribution \(L_{k_N}(s)(x)\) by the empirical distribution functions: \(L_{k_N,b,q}(s)(x)=\frac{1}{k_N-b+1}\sum _{q=0}^{k_N-b}\mathbb {I}_{\{B_{k_N,b,q}(s)\le x\}}\).
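The overlapping subsampling construction above can be sketched in a few lines; the period, sample size, block length, and the heavy-tailed placeholder series below are arbitrary assumptions for illustration, not quantities from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 4        # period (placeholder value)
k_N = 200    # number of observed full periods (placeholder value)
b = 20       # subsample block length (placeholder value)
X = rng.standard_t(df=5, size=k_N * T)  # placeholder heavy-tailed series

s = 1                   # fixed season, s = 1, ..., T
Xs = X[s - 1 :: T]      # seasonal subseries X_{s+qT}, q = 0, ..., k_N - 1
eta_hat = Xs.mean()     # full-sample estimator of the seasonal mean

# Overlapping subsample statistics B_{k_N,b,q}(s), q = 0, ..., k_N - b:
# root-b scaled differences between block means and the full-sample mean.
B = np.array([
    np.sqrt(b) * (Xs[q : q + b].mean() - eta_hat)
    for q in range(k_N - b + 1)
])

def L_hat(x: float) -> float:
    """Empirical subsampling distribution: fraction of B_q not exceeding x."""
    return float(np.mean(B <= x))
```

Evaluating `L_hat` on a grid of points gives the subsampling approximation to the sampling distribution of the seasonal mean estimator.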
Let us define the subsampled distribution:
The sequence \(\varepsilon _n\) decreases to zero, and \(\varphi \) is a non-increasing continuous function such that \(\varphi(x) = 1\) for \(x \le 0\), \(\varphi(x) = 0\) for \(x \ge 1\), and \(\varphi\) is affine between 0 and 1.
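The function \(\varphi\) described above admits the explicit form \(\varphi(x)=\min(1,\max(0,1-x))\). A minimal sketch follows; the argument \((B_q - x)/\varepsilon_n\) passed to \(\varphi\) is our assumption about how the smoothed indicator replaces \(\mathbb{I}_{\{B_q \le x\}}\), since the displayed definition is omitted above:

```python
def phi(x: float) -> float:
    """Non-increasing continuous function: 1 for x <= 0, 0 for x >= 1,
    affine (1 - x) in between."""
    if x <= 0.0:
        return 1.0
    if x >= 1.0:
        return 0.0
    return 1.0 - x

def U_hat(B, x, eps):
    """Smoothed empirical distribution (sketch, under the assumption that
    phi((B_q - x)/eps) replaces the step indicator 1{B_q <= x}).
    As eps -> 0 the step indicator is recovered."""
    return sum(phi((bq - x) / eps) for bq in B) / len(B)
```

The smoothing makes the summands continuous in the data, which is what allows the covariance bounds under weak dependence to be applied.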
From [32] it is known that
By Theorem 3.2.1 of [32], it is therefore enough to investigate the variance of \(U_{k_N,b,p}(s),\) \(s=1,\ldots,T\):
where \(\gamma (h)=Cov(\varphi (B_{k_N,b,p}(s)/\varepsilon _n),\varphi (B_{k_N,b,p+h}(s)/\varepsilon _n)).\)
From the assumption of \(\lambda \)-weak dependence, under conditions A1 through A4, and by Lemma 3.1 in [2],
It follows that \(Var(U_{k_N,b,p}(s)(x))\) tends to zero, which proves point 1 of Theorem 5.
Given point 1 and the assumptions of model (6), the proof of points 2 and 3 is the same as the proof of point 3 in Theorem 3.2.1 of [32]. \(\square \)
Proof (Theorem 6)
For any vector of constants \(c \in \mathbb {R}^T\) we have the following equation for the subsampling version of the characteristic functions of the distributions:
Let \(Z_{s+pT} = c_sX_{s+pT},\) where \(p = 0, \ldots ,k_N - 1\) and \(s = 1, \ldots , T.\) The series \(\{Z_t\}\) fulfills the assumptions of Theorem 5, which means that subsampling is consistent for the mean \((\eta _N)_Z.\) By Theorem A in Athreya [3] we have:
in the GED case,
Moreover
for any \(x \in \mathbb {R}^T,\) where \(F_{\mathcal{N}(\eta ,\varSigma )}(x)\) is the cumulative distribution function of \(\mathcal{N}(\eta , \varSigma ).\)
The second point of the thesis of Theorem 6 then follows from Pólya's theorem ([33], p. 447). \(\square \)
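The Cramér–Wold reduction \(Z_{s+pT} = c_s X_{s+pT}\) used in the proof above can be illustrated numerically; the vector \(c\), the period, and the Gaussian placeholder data below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
T, k_N = 3, 100                   # period and sample size (placeholders)
c = np.array([0.5, -1.0, 2.0])    # arbitrary vector in R^T
X = rng.normal(size=k_N * T)      # placeholder series

# Z_{s+pT} = c_s * X_{s+pT}: scale each season s by c_s.
# Row p of the reshaped array holds (X_{1+pT}, ..., X_{T+pT}).
Z = X.reshape(k_N, T) * c

# The seasonal means of Z are c_s times the seasonal means of X,
# so the T-dimensional mean problem reduces to a scalar one.
assert np.allclose(Z.mean(axis=0), c * X.reshape(k_N, T).mean(axis=0))
```

Because this holds for every \(c\), consistency of subsampling for each scalar series \(\{Z_t\}\) yields the multivariate statement via the characteristic functions.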
© 2017 Springer International Publishing AG
Gajecka-Mirek, E., Knapik, O. (2017). Subsampling for Non-stationary Time Series with Long Memory and Heavy Tails Using Weak Dependence Condition. In: Chaari, F., Leskow, J., Napolitano, A., Zimroz, R., Wylomanska, A. (eds) Cyclostationarity: Theory and Methods III. Applied Condition Monitoring, vol 6. Springer, Cham. https://doi.org/10.1007/978-3-319-51445-1_2
Print ISBN: 978-3-319-51444-4
Online ISBN: 978-3-319-51445-1