Abstract
We present new results on the estimation of the periodic mean function of a periodically correlated time series that exhibits heavy tails and long memory under a weak dependence condition. In our model, which generalizes the work of McElroy and Politis [35, 42], we show that the estimator of the periodic mean function has an asymptotic distribution depending on the degree of heavy tails and the degree of long memory. Such an asymptotic distribution clearly poses a problem when building confidence intervals. Thus the main point of this research is to establish the consistency of one of the resampling methods - the subsampling procedure - in the considered model. We obtain such consistency under relatively mild conditions on the time series at hand. The selection of the block length plays an important role in the resampling methodology. In the article we also discuss one possible way of selecting the length of the subsampling window. We illustrate our results with simulated data as well as with a real data set from the Nord Pool market. For these data we consider the practical issues of constructing a confidence band for the periodic mean function and choosing the subsampling window.
References
Arcones M (1994) Limit theorems for nonlinear functionals of a stationary sequence of vectors. Ann Prob 22:2242–2274
Bardet J-M, Doukhan P, Lang G, Ragache N (2008) Dependent Lindeberg central limit theorem and some applications. ESAIM: Probab Stat 12:154–172
Beran J, Feng Y, Ghosh S, Kulik R (2013) Long-memory processes: probabilistic properties and statistical methods. Springer, Heidelberg
Bertail P (2011) Comments on “Subsampling weakly dependent time series and application to extremes”. TEST 20(3):487–490
Bertail P, Haefke C, Politis DN, White W (2004) Subsampling the distribution of diverging statistics with applications to finance. J Econ 120:295–326
Bickel P, Bühlmann P (1999) A new mixing notion and functional central limit theorems for a sieve bootstrap in time series. Bernoulli 5(3):413–446
Bradley R (2005) Basic properties of strong mixing conditions. Probab Surv 2:107–144
Brockwell P, Davis R (1991) Time series: theory and methods. Springer, New York
Cioch W, Knapik O, Leśkow J (2013) Finding a frequency signature for a cyclostationary signal with applications to wheel bearing diagnostics. Mech Syst Signal Process 38:55–64
Dedecker J, Doukhan P, Lang G, Léon JR, Louhichi S, Prieur C (2008) Weak dependence: with examples and applications. Lecture Notes in Statistics. Springer, New York
Dehay D, Hurd HL (1994) Representation and estimation for periodically and almost periodically correlated random processes. In: IEEE Cyclostationarity in communications and signal processing, New York, pp 295–326
Dehay D, Dudek A, Leśkow J (2014) Subsampling for continuous-time almost periodically correlated processes. J Stat Plan Inference 150:142–158
Dudek A, Leśkow J, Paparoditis E, Politis D (2013) A generalized block bootstrap for seasonal time series. J Time Ser Anal 35:89–114
Doukhan P, Prohl S, Robert CY (2011) Subsampling weakly dependent times series and application to extremes. TEST 20(3):487–490
Doukhan P, Lang G (2002) Rates in the empirical central limit theorem for stationary weakly dependent random fields. Stat Inference Stoch Process 5:199–228
Doukhan P (1994) Mixing: properties and examples, vol 85. Lecture Notes in Statistics. Springer, New York
Doukhan P, Louhichi S (1999) A new weak dependence condition and applications to moment inequalities. Stoch Proc Appl 84:313–342
Evans M, Hastings N, Peacock B (2000) Statistical distributions, 3rd edn. Wiley, New York, pp 74–76
Guégan D, Ladoucette S (2001) Non-mixing properties of long memory sequences. C R Acad Sci Paris 333:373–376
Ferrara L, Guégan D (2001) Forecasting with k-factor Gegenbauer processes: theory and applications. J Forecast 20(8):581–601
Fitzsimmons P, McElroy T (2006) On joint Fourier-Laplace transforms, Mimeo. http://www.math.ucsd.edu/politis/PAPER/FL.pdf
Gajecka-Mirek E (2014) Cyclostationarity: Theory and Methods. Subsampling for weakly dependent and periodically correlated sequences. LNME. Springer, Cham. https://doi.org/10.1007/978-3-319-04187-2_4
Gardner WA, Napolitano A, Paura L (2006) Cyclostationarity: half a century of research. Signal Process. 86:639–697
Granger CWJ, Joyeux R (1980) An introduction to long memory time series models and fractional differencing. J Time Ser Anal 1:15–30
Gray HL, Zhang N-F, Woodward WA (1989) On generalized fractional processes. J Time Ser Anal 10:233–257
Gray HL, Zhang N-F, Woodward WA (1994) On generalized fractional processes - a correction. J Time Ser Anal 15(5):561–562
Harmantzis F, Hatzinakos D (2001) Network traffic modeling and simulation using stable FARIMA processes. Technical report. Presented at INFORMS Annual Meeting, Miami
Hosking JRM (1981) Fractional differencing. Biometrika 68:165–176
Hosking JRM (1982) Some models of persistence in time series. In: Anderson OD (ed) Time series analysis: theory and practice, vol 1. North-Holland, Amsterdam
Hui YV, Li WK (1995) On Fractionally differenced periodic processes. Indian J Stat Ser B 57(1):19–31
Hurd HL, Miamee AG (2007) Periodically correlated random sequences: spectral theory and practice. Wiley, Hoboken
Hurd H, Makagon A, Miamee AG (2002) On AR(1) models with periodic and almost periodic coefficients. Stoch Process Appl 100:167–185
Hall P, Jing B, Lahiri S (1998) On the sampling window method for long-range dependent data. Stat Sin 8:1189–1204
Ibragimov IA, Rozanov YA (1978) Gaussian random processes. Springer, New York
Jach A, McElroy T, Politis DN (2012) Subsampling inference for the mean of heavy-tailed long memory time series. J Time Ser Anal 33(1):96–111
Kolmogorov AN, Rozanov YA (1960) On strong mixing conditions for stationary Gaussian processes. Theory Prob Appl 5:204–208
Künsch H (1989) The jackknife and the bootstrap for general stationary observations. Ann Statist 17:1217–1241
Lahiri S (1993) On the moving block bootstrap under long-range dependence. Stat Probab Lett 18:405–413
Lawrance AJ, Kottegoda NT (1977) Stochastic modelling of riverflow time series. J R Stat Soc Ser A 140:1–47
Logan BF, Mallows CL, Rice SO, Shepp LA (1973) Limit distributions of self-normalized sums. Ann Probab 1:788–809
Marinucci D (2005) The empirical process for bivariate sequences with long memory. Statist Inf Stoch Proc 8(2):205–223
McElroy T, Politis DN (2007) Self-normalization for heavy-tailed time series with long memory. Stat Sin 17(1):199–220
McLeod AI, Hipel KW (1978) Preservation of the rescaled adjusted range, I a reassessment of the Hurst phenomenon. Water Resour Res 14:491–508
NordPool (2016) Nord pool spot. http://www.nordpoolspot.com/. Accessed 15 July 2017
Philippe A, Surgailis D, Viano MC (2006) Almost periodically correlated processes with long-memory, vol 187. Lecture Notes in Statistics. Springer, New York
Politis DN, Romano JP, Wolf M (1999) Subsampling. Springer, New York
Proietti T, Haldrup N, Knapik O (2017) Spikes and memory in (Nord Pool) electricity price spot prices, CREATES Research Paper 2017-39
Rosenblatt M (1956) A central limit theorem and a strong mixing condition. Proc Nat Acad Sci USA 42:43–47
Samorodnitsky G, Taqqu MS (1994) Stable non-Gaussian random processes: stochastic models with infinite variance. Chapman and Hall, New York
van der Vaart AW (1998) Asymptotic Statistics. Cambridge University Press, Cambridge
Acknowledgment
Both authors would like to express their enormous gratitude to Professor Paul Doukhan for his invaluable help in preparing this manuscript. The author Leśkow is supported by grant no. 2013/10/M/ST1/00096 from the Polish National Science Centre (NCN).
Appendix
In this appendix we present the proofs of some of the results.
Proof of Fact 1
For each \(s=1,\ldots ,T,\) the process \(\{GG_{s+pT}\}_{p\in {Z}}\) is second-order stationary.
Assume that the process \(\{GG_{s+pT}\}\) satisfies:
- the \(\alpha -\)mixing condition,
- the completely regular condition, and
- the completely linear regular condition.
For Gaussian processes the relationships between the above coefficients are as follows [36]: \(\rho (k)=r(k)\) and \(\alpha (k)\le r(k)\le 2\pi \alpha (k).\) It follows that the Gaussian Gegenbauer process cannot be strong mixing [19]. Since \(\{\varepsilon _{t}\}_{t\in {Z}}\) in Eq. (1) is a Gaussian process, the long memory stationary k-factor Gegenbauer process (1) is not completely regular and hence is not strong mixing. From [15] we know that the Gaussian Gegenbauer process has weak dependence properties. \(\square \)
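To experiment with processes of the kind discussed above, a Gaussian Gegenbauer process can be simulated from its moving-average representation, whose coefficients are the Gegenbauer polynomials \(C_j^{(d)}(u)\). The sketch below is illustrative only: it assumes a single factor (k = 1), a finite truncation length `m`, and arbitrary example values of `d` and `u`; it is not the authors' code.

```python
import numpy as np

def gegenbauer_coeffs(d, u, m):
    """First m MA coefficients of the one-factor Gegenbauer filter
    (1 - 2uB + B^2)^(-d), i.e. the Gegenbauer polynomials C_j^{(d)}(u),
    computed with the standard three-term recursion."""
    c = np.zeros(m)
    c[0] = 1.0
    if m > 1:
        c[1] = 2.0 * d * u
    for j in range(2, m):
        c[j] = (2.0 * u * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

def simulate_gegenbauer(n, d=0.2, u=0.9, m=2000, rng=None):
    """Truncated moving-average simulation of a Gaussian Gegenbauer
    process: GG_t = sum_{j<m} c_j eps_{t-j} with i.i.d. N(0,1) noise."""
    rng = np.random.default_rng(rng)
    c = gegenbauer_coeffs(d, u, m)
    eps = rng.standard_normal(n + m)
    # slice of the full convolution where the whole kernel overlaps
    return np.convolve(eps, c, mode="full")[m - 1 : m - 1 + n]

gg = simulate_gegenbauer(5000, d=0.2, u=0.9, rng=0)
```

The truncation length `m` controls how well the long-memory tail of the filter is captured; larger `m` gives a more faithful (and slower) simulation.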
Proof of Theorem 3
The full proof of Theorem 3 is given in [22]; here only the main points are repeated.
Let: \(N(\hat{\eta }_{N}(s)-\eta (s) )=\sum _{p=0}^{N-1}Y_{s+pT},\ \ s = 1,2,...,T\) where \(Y_{t}=X_{t}-\eta (t) =\sigma _{t}GG_{t}.\)
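In practice the estimator \(\hat{\eta }_{N}(s)\) is simply the average of the season-\(s\) subseries. A minimal numpy sketch, assuming the observations are stored as one array covering \(N\) full periods of length \(T\) (the helper name is hypothetical):

```python
import numpy as np

def seasonal_means(x, T):
    """Estimate the periodic mean eta(s), s = 1..T, by averaging
    the observations x[s + pT] over the N full periods in x."""
    N = len(x) // T
    return x[: N * T].reshape(N, T).mean(axis=0)

# toy check: a noiseless periodic signal is recovered exactly
T = 4
eta = np.array([1.0, -2.0, 0.5, 3.0])
x = np.tile(eta, 50)          # N = 50 periods
eta_hat = seasonal_means(x, T)
```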
First we need to show that \(N^{-\zeta }\sum _{p=0}^{N-1}Y_{s+pT}\) converges weakly to some random variable.
Let \(\mathscr {E}\) be the \(\sigma -\)field \(\mathscr {E} = \sigma (\varepsilon ) = \sigma (\varepsilon _{t}, t \in {Z})\) and let \(\mathscr {G}\) be the \(\sigma -\)field \(\mathscr {G} = \sigma (GG) = \sigma (GG_{t}, t \in {Z}).\) By assumption A in the definition of the model (2), the \(\sigma -\)fields \(\mathscr {E}\) and \(\mathscr {G}\) are independent with respect to the probability measure P. The proof uses the properties of the characteristic function of a sum of normal variables.
where \(\nu \) is any real number and \(s = 1,2,...,T.\)
The inner conditional characteristic function, by the properties of the Gaussian characteristic function, is
This double sum is divided into the diagonal and off-diagonal terms:
In the case \(1/\alpha > (\beta + 1)/2\) the off-diagonal part of (9) is \(O_P(N^{1-2/\alpha }N^{\beta }),\) which tends to zero as \(N \rightarrow \infty .\) The characteristic function of the diagonal part of (9) is that of an \(S\alpha S\) variable with scale \(\sqrt{\gamma _{G}(0)/2}=|f_{s}|\sqrt{\gamma _{N}(0)/2},\) for \(s=1,2,...,T\) (see [42]).
In the case \(1/\alpha < (\beta + 1)/2\) the formula (9) becomes
The first term is \(O_{P} (N^{2/\alpha -(\beta +1)})\) and tends to zero as \(N\rightarrow \infty .\)
The limiting characteristic function of the second part is the characteristic function of a mean-zero Gaussian variable with variance \(\tilde{C}(s)=f_{s}(C(s)-\gamma _{G}(0){I}_{\{\beta =0\}}).\)
The case \(1/\alpha = (\beta +1)/2\) combines the two cases above. From Slutsky's Theorem we get weak convergence of the sum of two independent random variables.
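The three regimes above can be summarized schematically as follows, with \(S_s\) an \(S\alpha S\) variable and \(G_s\) a mean-zero Gaussian variable, independent of each other; this is only a restatement of the cases just discussed, not an additional result:

```latex
N^{-\zeta}\sum_{p=0}^{N-1} Y_{s+pT}
\;\xrightarrow{\ d\ }\;
\begin{cases}
S_s, & 1/\alpha > (\beta+1)/2,\\[2pt]
G_s, & 1/\alpha < (\beta+1)/2,\\[2pt]
S_s + G_s, & 1/\alpha = (\beta+1)/2.
\end{cases}
```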
The next step is to show joint convergence of the first and second sample moments of the model (2), for each \(s=1,\ldots ,T\) and \(p=0,\ldots ,N-1.\)
In the proof the joint Fourier/Laplace Transform of the first and second sample moments [21] is considered.
For any real \(\theta \) and \(\phi > 0,\)
The sequence of random variables \(W_{s}\) is i.i.d. standard normal, and is independent of the \(Y_{s}\) series. The information about \(W_{s}\) is denoted by \(\mathscr {W}.\) The double sum in the Fourier/Laplace Transform is broken into diagonal and off-diagonal terms.
The off-diagonal term is
In the case \(2/\alpha > \beta + 1,\) by the Markov inequality the off-diagonal term tends to zero in probability as \(N \rightarrow \infty .\)
In the case \(2/\alpha < \beta + 1,\) (11) tends to a constant (see the proof of Theorem 1 in [42]).
In the case \(2/\alpha = \beta + 1,\) the off-diagonal part, for fixed \(s=1,2,...,T,\) tends to a constant.
The diagonal term is examined separately (using the Dominated Convergence Theorem and the above fact). Let \(V_{s+pT}=\theta +\sqrt{2\phi }W_{s+pT}.\)
When \(2/\alpha < \beta + 1,\) the sum \(N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }{\mathop {\longrightarrow }\limits ^{p}} 0.\)
When \(2/\alpha \ge \beta + 1,\) by the Law of Large Numbers, \(N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }{\mathop {\longrightarrow }\limits ^{p}}E|V|^{\alpha }.\) By the Dominated Convergence Theorem, the limit as \(N \longrightarrow \infty \) can be taken inside the expectation, so that \(E[\exp \{-(\gamma _{G}(0)/2)^{\alpha /2}N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }\}]\rightarrow \exp \{-(\gamma _{G}(0)/2)^{\alpha /2}E|\theta +\sqrt{2\phi }W|^{\alpha }1_{2/\alpha \ge \beta +1}\},\) where \(W\) is standard normal. Using the Fourier/Laplace transform and the argumentation in [42] we obtain this part of the proof of Theorem 3.
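The Law of Large Numbers step above can be checked numerically. The sketch below is an illustrative assumption on my part (it takes \(\alpha \zeta = 1\) so the sum is an ordinary sample mean, and \(\theta = 0\) so that \(E|V|^{\alpha }\) has the closed form \((2\phi )^{\alpha /2}\,2^{\alpha /2}\Gamma ((\alpha +1)/2)/\sqrt{\pi }\)); it is not part of the proof.

```python
import numpy as np
from math import gamma, sqrt, pi

def mc_mean_abs_alpha(theta, phi, alpha, N, rng=None):
    """Monte Carlo average N^{-1} sum_p |theta + sqrt(2 phi) W_p|^alpha
    for i.i.d. standard normal W_p."""
    rng = np.random.default_rng(rng)
    W = rng.standard_normal(N)
    return np.mean(np.abs(theta + np.sqrt(2.0 * phi) * W) ** alpha)

def exact_mean_abs_alpha(phi, alpha):
    """Closed form of E|sqrt(2 phi) W|^alpha for standard normal W,
    using E|W|^alpha = 2^{alpha/2} Gamma((alpha+1)/2) / sqrt(pi)."""
    return (2.0 * phi) ** (alpha / 2) * 2 ** (alpha / 2) \
        * gamma((alpha + 1) / 2) / sqrt(pi)

approx = mc_mean_abs_alpha(0.0, 1.0, 1.5, 200_000, rng=1)
exact = exact_mean_abs_alpha(1.0, 1.5)
```

With 200,000 draws the Monte Carlo average agrees with the closed form to well under one percent, illustrating the convergence used in the proof.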
Using the argumentation of [42] and [35] we obtain, for \(s=1,\ldots ,T,\)
Finally, from Slutsky's Theorem we get the convergence (5).
The second convergence in (5) follows from the continuous mapping theorem (the denominators are different from zero). \(\square \)
Proof of Theorem 4
Let us consider a sequence of statistics \(P_{N}(s),\) for fixed \(s=1,2,...,T\) and \(N=1,2,...\).
Let \(L_{N}(s)(x)=P(P_{N}(s)\le x)\) denote the cumulative distribution function of \(P_{N}(s).\)
There exist
For overlapping samples the subsamples are
\(Y_{b_s,p}(s)=(X_{N,s+pT},X_{N,s+(p+1)T},\ldots ,X_{N,s+(p+b_s-1)T}),\) \(p=0,1,\ldots ,N-b_s,\) and the number of subsampling statistics
\(P_{N,b_s,p}(s) = \sqrt{b_s}(\hat{\eta }_{N,b_s,p}(s)-\hat{\eta }_{N}(s))/\hat{\sigma }_{N,b_s,p}(s) \) is \(N-b_s+1.\)
The above statistics are used to approximate the distributions \(L_{N}(s)(x)\) by empirical distribution functions: \(L_{N,b_s,p}(s)(x)=\frac{1}{N-b_s+1}\sum _{p=0}^{N-b_s}{I}_{\{P_{N,b_s,p}(s)\le x\}}.\)
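The overlapping subsampling scheme just described can be sketched in code. This is a hypothetical illustration, not the authors' implementation: the function names are invented, and the block sample standard deviation is used as one plausible choice for the self-normalizer \(\hat{\sigma }_{N,b_s,p}(s)\).

```python
import numpy as np

def subsampling_stats(x, T, s, b):
    """All N - b + 1 overlapping subsampling statistics
    sqrt(b) (eta_hat_{b,p}(s) - eta_hat_N(s)) / sigma_hat_{b,p}(s)
    for the season-s subseries of a series with period T."""
    season = x[s - 1 :: T]          # observations X_{s+pT}, p = 0..N-1
    N = len(season)
    eta_full = season.mean()        # full-sample seasonal mean
    stats = []
    for p in range(N - b + 1):
        block = season[p : p + b]
        sigma_b = block.std(ddof=1)  # illustrative self-normalizer
        stats.append(np.sqrt(b) * (block.mean() - eta_full) / sigma_b)
    return np.array(stats)

def subsampling_quantile(stats, q):
    """q-quantile of the subsampling empirical distribution,
    from which equal-tailed confidence bounds can be read off."""
    return np.quantile(stats, q)
```

Quantiles of the empirical distribution of these statistics then yield the confidence band for \(\eta (s)\), season by season.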
For \(A_{N}(s)\) let us define the subsampled statistics:
The sequence \(\varepsilon _n\) decreases to zero and \(\varphi \) is a non-increasing continuous function such that \(\varphi (x) = 1\) for \(x \le 0\), \(\varphi (x) = 0\) for \(x \ge 1\), and \(\varphi \) is affine between 0 and 1.
It is enough to investigate the variance of \(U_{N,b_s,p}(s),\ \ s=1,...,T\) (Theorem 11.3.1 [46]).
where \(\gamma (h)=Cov(\varphi (A_{N}(s)),\varphi (A_{N+h}(s))).\)
From the assumption of \(\lambda -\)weak dependence we have
For \(\varepsilon _N = (N^{2(1-\beta )b})^{1/4}\) the above covariance converges to zero. By a Cesàro mean argument, \(Var\, U_{N,b_s,p}(s)(x)\) also tends to zero.
Under conditions A1 through A4 and by Theorem 3.1 of [50] we have:
This implies that \(Var(V_{N,b_s,p}(s)(x))\) tends to zero, which proves part 1 of Theorem 4.
To prove part 2 of Theorem 4 we also use Theorem 2 in [14].
in probability. The proof of part 3, provided part 1 holds and under the assumptions of the model (2), is the same as the proof of part 3 in Theorem 11.3.1 of [46]. \(\square \)
Proof of the Lemma 2
The proof of Lemma 2 follows directly from Lemma 2 in [5] and Theorem 2 in [14]. \(\square \)
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Gajecka-Mirek, E., Leśkow, J. (2020). Subsampling for Heavy Tailed, Nonstationary and Weakly Dependent Time Series. In: Chaari, F., Leskow, J., Zimroz, R., Wyłomańska, A., Dudek, A. (eds) Cyclostationarity: Theory and Methods – IV. CSTA 2017. Applied Condition Monitoring, vol 16. Springer, Cham. https://doi.org/10.1007/978-3-030-22529-2_2
DOI: https://doi.org/10.1007/978-3-030-22529-2_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22528-5
Online ISBN: 978-3-030-22529-2