
Subsampling for Heavy Tailed, Nonstationary and Weakly Dependent Time Series

  • Conference paper in Cyclostationarity: Theory and Methods – IV (CSTA 2017)

Part of the book series: Applied Condition Monitoring (ACM, volume 16)

Abstract

We present new results on the estimation of the periodic mean function of a periodically correlated time series that exhibits heavy tails and long memory under a weak dependence condition. In our model, which generalizes the work of McElroy and Politis [35, 42], we show that the estimator of the periodic mean function has an asymptotic distribution that depends on the degree of heavy tails and the degree of long memory. Such an asymptotic distribution clearly poses a problem when one tries to build confidence intervals. Thus the main point of this research is to establish the consistency of one of the resampling methods - the subsampling procedure - in the considered model. We obtain this consistency under relatively mild conditions on the time series at hand. The selection of the block length plays an important role in the resampling methodology. In the article we also discuss one possible way of selecting the length of the subsampling window. We illustrate our results with simulated data as well as with a real data set from the Nord Pool market. For these data we consider the practical issues of constructing a confidence band for the periodic mean function and choosing the subsampling window.
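As a rough illustration of the kind of procedure discussed in the abstract, the following Python sketch (with hypothetical names; it simplifies the self-normalized statistic relative to the paper's definitions) builds an equal-tailed subsampling confidence interval for one seasonal mean from overlapping blocks:

```python
import numpy as np

def seasonal_subsampling_ci(X, T, s, b, level=0.95):
    """Subsampling confidence interval for the periodic mean of
    season s (0-based) of a series X with period T; b is the
    subsampling window length, measured in whole periods."""
    Xs = np.asarray(X)[s::T]            # all observations of season s
    N = len(Xs)
    eta_hat = Xs.mean()                 # full-sample estimator of eta(s)
    # self-normalized subsample statistics over overlapping windows
    stats = []
    for p in range(N - b + 1):
        block = Xs[p:p + b]
        scale = block.std(ddof=1)
        if scale > 0:
            stats.append(np.sqrt(b) * (block.mean() - eta_hat) / scale)
    lo_q, hi_q = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    # invert the approximated distribution of the full-sample statistic
    full_scale = Xs.std(ddof=1)
    return (eta_hat - hi_q * full_scale / np.sqrt(N),
            eta_hat - lo_q * full_scale / np.sqrt(N))
```

The block length `b` here is the tuning parameter whose selection the paper discusses; in practice one would examine the stability of the interval across a range of `b` values.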


References

  1. Arcones M (1994) Limit theorems for nonlinear functionals of a stationary sequence of vectors. Ann Probab 22:2242–2274

  2. Bardet J-M, Doukhan P, Lang G, Ragache N (2008) Dependent Lindeberg central limit theorem and some applications. ESAIM Probab Stat 12:154–172

  3. Beran J, Feng T, Ghosh S, Kulik R (2013) Long-memory processes: probabilistic properties and statistical methods. Springer, Heidelberg

  4. Bertail P (2011) Comments on “Subsampling weakly dependent time series and application to extremes”. TEST 20(3):487–490

  5. Bertail P, Haefke C, Politis DN, White W (2004) Subsampling the distribution of diverging statistics with applications to finance. J Econ 120:295–326

  6. Bickel P, Bühlmann P (1999) A new mixing notion and functional central limit theorems for a sieve bootstrap in time series. Bernoulli 5(3):413–446

  7. Bradley R (2005) Basic properties of strong mixing conditions. Probab Surv 2:107–144

  8. Brockwell P, Davis R (1991) Time series: theory and methods. Springer, New York

  9. Cioch W, Knapik O, Leśkow J (2013) Finding a frequency signature for a cyclostationary signal with applications to wheel bearing diagnostics. Mech Syst Signal Process 38:55–64

  10. Dedecker J, Doukhan P, Lang G, León JR, Louhichi S, Prieur C (2008) Weak dependence: with examples and applications. Lecture Notes in Statistics. Springer, New York

  11. Dehay D, Hurd HL (1994) Representation and estimation for periodically and almost periodically correlated random processes. In: Cyclostationarity in communications and signal processing. IEEE Press, New York, pp 295–326

  12. Dehay D, Dudek A, Leśkow J (2014) Subsampling for continuous-time almost periodically correlated processes. J Stat Plan Inference 150:142–158

  13. Dudek A, Leśkow J, Paparoditis E, Politis D (2013) A generalized block bootstrap for seasonal time series. J Time Ser Anal 35:89–114

  14. Doukhan P, Prohl S, Robert CY (2011) Subsampling weakly dependent time series and application to extremes. TEST 20(3):487–490

  15. Doukhan P, Lang G (2002) Rates in the empirical central limit theorem for stationary weakly dependent random fields. Stat Inference Stoch Process 5:199–228

  16. Doukhan P (1994) Mixing: properties and examples. Lecture Notes in Statistics, vol 85. Springer, New York

  17. Doukhan P, Louhichi S (1999) A new weak dependence condition and applications to moment inequalities. Stoch Process Appl 84:313–342

  18. Evans M, Hastings N, Peacock B (2000) Statistical distributions, 3rd edn. Wiley, New York, pp 74–76

  19. Guégan D, Ladoucette S (2001) Non-mixing properties of long memory sequences. C R Acad Sci Paris 333:373–376

  20. Ferrara L, Guégan D (2001) Forecasting with k-factor Gegenbauer processes: theory and applications. J Forecast 20(8):581–601

  21. Fitzsimmons P, McElroy T (2006) On joint Fourier-Laplace transforms. Mimeo. http://www.math.ucsd.edu/politis/PAPER/FL.pdf

  22. Gajecka-Mirek E (2014) Subsampling for weakly dependent and periodically correlated sequences. In: Cyclostationarity: theory and methods. LNME. Springer, Cham. https://doi.org/10.1007/978-3-319-04187-2_4

  23. Gardner WA, Napolitano A, Paura L (2006) Cyclostationarity: half a century of research. Signal Process 86:639–697

  24. Granger CWJ, Joyeux R (1980) An introduction to long memory time series models and fractional differencing. J Time Ser Anal 1:15–30

  25. Gray HL, Zhang N-F, Woodward WA (1989) On generalized fractional processes. J Time Ser Anal 10:233–257

  26. Gray HL, Zhang N-F, Woodward WA (1994) On generalized fractional processes - a correction. J Time Ser Anal 15(5):561–562

  27. Harmantzis F, Hatzinakos D (2001) Network traffic modeling and simulation using stable FARIMA processes. Technical report, presented at the INFORMS Annual Meeting, Miami

  28. Hosking JRM (1981) Fractional differencing. Biometrika 68:165–176

  29. Hosking JRM (1982) Some models of persistence in time series. In: Anderson OD (ed) Time series analysis: theory and practice, vol 1. North-Holland, Amsterdam

  30. Hui YV, Li WK (1995) On fractionally differenced periodic processes. Indian J Stat Ser B 57(1):19–31

  31. Hurd HL, Miamee AG (2007) Periodically correlated random sequences: spectral theory and practice. Wiley, Hoboken

  32. Hurd H, Makagon A, Miamee AG (2002) On AR(1) models with periodic and almost periodic coefficients. Stoch Process Appl 100:167–185

  33. Hall P, Jing B, Lahiri S (1998) On the sampling window method for long-range dependent data. Stat Sin 8:1189–1204

  34. Ibragimov IA, Rozanov YA (1978) Gaussian random processes. Springer, New York

  35. Jach A, McElroy T, Politis DN (2012) Subsampling inference for the mean of heavy-tailed long memory time series. J Time Ser Anal 33(1):96–111

  36. Kolmogorov AN, Rozanov YA (1960) On strong mixing conditions for stationary Gaussian processes. Theory Probab Appl 5:204–208

  37. Künsch H (1989) The jackknife and the bootstrap for general stationary observations. Ann Stat 17:1217–1241

  38. Lahiri S (1993) On the moving block bootstrap under long-range dependence. Stat Probab Lett 18:405–413

  39. Lawrance AJ, Kottegoda NT (1977) Stochastic modeling of river flow time series. J R Stat Soc Ser A 140:1–47

  40. Logan BF, Mallows CL, Rice SO, Shepp LA (1973) Limit distributions of self-normalized sums. Ann Probab 1:788–809

  41. Marinucci D (2005) The empirical process for bivariate sequences with long memory. Stat Inference Stoch Process 8(2):205–223

  42. McElroy T, Politis DN (2007) Self-normalization for heavy-tailed time series with long memory. Stat Sin 17(1):199–220

  43. McLeod AI, Hipel KW (1978) Preservation of the rescaled adjusted range. I. A reassessment of the Hurst phenomenon. Water Resour Res 14:491–508

  44. NordPool (2016) Nord Pool Spot. http://www.nordpoolspot.com/. Accessed 15 July 2017

  45. Philippe A, Surgailis D, Viano MC (2006) Almost periodically correlated processes with long-memory. Lecture Notes in Statistics, vol 187. Springer, New York

  46. Politis DN, Romano JP, Wolf M (1999) Subsampling. Springer, New York

  47. Proietti T, Haldrup N, Knapik O (2017) Spikes and memory in (Nord Pool) electricity price spot prices. CREATES Research Paper 2017-39

  48. Rosenblatt M (1956) A central limit theorem and a strong mixing condition. Proc Natl Acad Sci USA 42:43–47

  49. Samorodnitsky G, Taqqu MS (1994) Stable non-Gaussian random processes: stochastic models with infinite variance. Chapman and Hall, New York

  50. van der Vaart AW (1998) Asymptotic statistics. Cambridge University Press, Cambridge


Acknowledgment

Both authors would like to express their enormous gratitude to Professor Paul Doukhan for his invaluable help in preparing this manuscript. The author Leśkow is supported by grant no. 2013/10/M/ST1/00096 from the Polish National Science Centre (NCN).

Author information

Corresponding author

Correspondence to Elżbieta Gajecka-Mirek.


Appendix

In this appendix we present the proofs of some of the results.

Proof of Fact 1

For each \(s=1,\ldots ,T\) the process \(\{GG_{s+pT}\}_{p\in {Z}}\) is second-order stationary.

Assume that the process \(\{GG_{s+pT}\}\) satisfies:

  • the \(\alpha -\)mixing condition,

  • the complete regularity condition, and

  • the complete linear regularity condition.

For Gaussian processes the relationship between the above coefficients is as follows [36]: \(\rho (k)=r(k)\) and \(\alpha (k)\le r(k)\le 2\pi \alpha (k).\) By the result of [19], the Gaussian-Gegenbauer process cannot be strong mixing. Since \(\{\varepsilon _{t}\}_{t\in {Z}}\) in Eq. (1) is a Gaussian process, the long memory stationary k-factor Gegenbauer process (1) is not completely regular and hence is not strong mixing. From [15] we know that the Gaussian-Gegenbauer process has weak dependence properties.    \(\square \)

Proof of Theorem 3

The full proof of Theorem 3 is given in [22]; here only the main points are repeated.

Let: \(N(\hat{\eta }_{N}(s)-\eta (s) )=\sum _{p=0}^{N-1}Y_{s+pT},\ \ s = 1,2,...,T\) where \(Y_{t}=X_{t}-\eta (t) =\sigma _{t}GG_{t}.\)

First we need to show that \(N^{-\zeta }\sum _{p=0}^{N-1}Y_{s+pT}\) converges weakly to some random variable.

Let \(\mathscr {E}\) be the \(\sigma -\)field \(\mathscr {E} = \sigma (\varepsilon ) = \sigma (\varepsilon _{t}, t \in {Z})\) and let \(\mathscr {G}\) be the \(\sigma -\)field \(\mathscr {G} = \sigma (GG) = \sigma (GG_{t}, t \in {Z}).\) By assumption A in the definition of the model (2), the \(\sigma -\)fields \(\mathscr {E}\) and \(\mathscr {G}\) are independent with respect to the probability measure P. The proof uses the properties of the characteristic function of a sum of normal variables.

$$Eexp \{i\nu N^{-1/\alpha }\sum _{p=0}^{N-1} Y_{s+pT}\}=E[E[exp\{i\nu N^{-1/\alpha } \sum _{p=0}^{N-1}\sigma _{s+pT}GG_{s+pT}\}|\mathscr {E}]]$$

where \(\nu \) is any real number and \(s = 1,2,...,T.\)

By the properties of the Gaussian characteristic function, the inner conditional characteristic function is

$$\begin{aligned}&E[exp\{ i\nu N^{-1/\alpha } \sum _{p=0}^{N-1}\sigma _{s+pT}GG_{s+pT}\}|\mathscr {E}]\\&\quad = exp\{-\frac{({\nu N^{-1/\alpha })^{2}}}{2}\sum _{p,q = 0}^{N-1}\sigma _{s+pT}\sigma _{s+qT}f_{s+pT}f_{s+qT}\gamma _{N}(T(p-q))\},\ s=1,...,T. \end{aligned}$$

This double sum is divided into the diagonal and off-diagonal terms:

$$\begin{aligned} N^{-\frac{2}{\alpha }} \big (\sum _{p=0}^{N-1}\sigma ^{2}_{s+pT}\gamma _{G}(0)+\sum _{p\not = q}\sigma _{s+pT}\sigma _{s+qT}\gamma _{G}((p-q)T)\big ) \end{aligned}$$
(9)

In the case \(1/\alpha > (\beta + 1)/2\) the off-diagonal part of (9) is \(O_P(N^{1-2/\alpha }N^{\beta })\) which tends to zero as \(N \rightarrow \infty .\) The characteristic function of the diagonal part of (9) is the characteristic function of a \(S\alpha S\) variable with scale \(\sqrt{\gamma _{G}(0)/2}=|f_{s}|\sqrt{\gamma _{N}(0)/2},\) for \(s=1,2,...,T\). (see [42]).

In the case \(1/\alpha < (\beta + 1)/2\) the formula (9) becomes

$$\begin{aligned} N^{-(\beta +1)}\big ( \sum _{p=0}^{N-1}\sigma _{s+pT}^{2}\gamma _{G}(0)+\sum _{q\not =p} \sigma _{s+pT}\sigma _{s+qT}\gamma _{G}((p-q)T) \big ). \end{aligned}$$
(10)

The first term is \(O_{P} (N^{2/\alpha -(\beta +1)})\) and tends to zero as \(N\rightarrow \infty .\)

The limiting characteristic function of the second part is the characteristic function of a mean-zero Gaussian variable with variance \(\tilde{C}(s)=f_{t}(C(s)-\gamma _{G}(0){I}_{\{\beta =0\}}).\)

The case \(1/\alpha = (\beta +1)/2\) is a combination of the two cases above. From Slutsky's Theorem we get weak convergence to the sum of two independent random variables.

The next step is to show the joint convergence of the first and second sample moments of the model (2), for each \(s=1,\ldots ,T\) and \(p=0,\ldots ,N-1.\)

In the proof the joint Fourier/Laplace Transform of the first and second sample moments [21] is considered.

For any real \(\theta \) and \(\phi > 0,\)

$$\begin{aligned}&Eexp\{i\theta N^{-\zeta }\sum _{p=0}^{N-1}Y_{s+pT}-\phi N^{-2\zeta }\sum _{p=0}^{N-1}Y_{s+pT}^{2}\}\\&\quad = E[exp\{-\frac{1}{2}N^{-2\zeta }\sum _{p,q=0}^{N-1} \sigma _{s+pT}\sigma _{s+qT}\gamma _{G}((p-q)T)(\theta +\sqrt{2\phi }W_{s+pT})(\theta +\sqrt{2\phi }W_{s+qT}) \}].\end{aligned}$$

The random variables \(W_{s}\) are i.i.d. standard normal and independent of the \(Y_{s}\) series. The \(\sigma -\)field generated by the \(W_{s}\) is denoted by \(\mathscr {W}.\) The double sum in the Fourier/Laplace Transform is broken into diagonal and off-diagonal terms.

The off-diagonal term is

$$\begin{aligned} N^{-2\zeta }\sum _{|h|>0}^{N-1}\sum _{p=0}^{N-|h|}\sigma _{s+pT}\sigma _{s+pT+hT}(\theta +\sqrt{2\phi }W_{s+pT})(\theta +\sqrt{2\phi }W_{s+pT+hT})\gamma _{G}(hT) \end{aligned}$$
(11)

In the case \(2/\alpha > \beta + 1,\) by the Markov inequality the off-diagonal term tends to zero in probability as \(N \rightarrow \infty .\)

In the case \(2/\alpha < \beta + 1,\) the off-diagonal term (11) tends to a constant (see the proof of Theorem 1 in [42]).

In the case \(2/\alpha = \beta + 1,\) the off-diagonal part, for fixed \(s=1,2,...,T,\) tends to a constant.

The diagonal term is examined separately (by the Dominated Convergence Theorem and the above fact). Let \(V_{s+pT}=\theta +\sqrt{2\phi }W_{s+pT}.\)

$$\begin{aligned}&E[exp\{-\frac{1}{2}\gamma _{G}(0)N^{-2\zeta }\sum _{p=0}^{N-1}\sigma _{s+pT}^{2}V_{s+pT}^{2}\}]\\&\quad = E[exp\{-(\gamma _{G}(0)/2)^{\alpha /2}N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }\}]. \end{aligned}$$

When \(2/\alpha < \beta + 1,\) the sum \(N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }{\mathop {\longrightarrow }\limits ^{p}} 0.\)

When \(2/\alpha \ge \beta + 1,\) by the Law of Large Numbers, \(N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }{\mathop {\longrightarrow }\limits ^{p}}E|V|^{\alpha }.\) By the Dominated Convergence Theorem, the limit as \(N \longrightarrow \infty \) can be taken through the expectation, so that \(E[exp\{-(\gamma _{G}(0)/2)^{\alpha /2}N^{-\alpha \zeta }\sum _{p=0}^{N-1}|V_{s+pT}|^{\alpha }\}]\rightarrow exp\{-(\gamma _{G}(0)/2)^{\alpha /2}E|\theta +\sqrt{2\phi }N|^{\alpha }1_{2/\alpha \ge \beta +1}\}.\) Using the Fourier/Laplace Transform and the argumentation of [42] we obtain this part of Theorem 3.

Using the argumentation of [42] and [35], we obtain, for \(s=1,...,T\),

$$\begin{aligned}&N^{-\beta \rho }(\widehat{LM}(\rho ,s))^{\rho }=o_{P}(1) \\&\quad +\Big |N^{-\beta \rho }\sum _{|h|>0}^{[N^{\rho }]}\frac{1}{N-|h|}\sum _{p=0}^{N-|h|}Y_{s+pT}Y_{s+pT+hT}\Big |{\mathop {\longrightarrow }\limits ^{P}}\mu ^{2}C(s).\end{aligned}$$

Finally, from Slutsky's Theorem we get the first convergence in (5).

The second convergence in (5) follows from the continuous mapping theorem (the denominators are nonzero).    \(\square \)

Proof of Theorem 4

Let us consider a sequence of statistics \(P_{N}(s),\) for fixed \(s=1,2,...,T\) and \(N=1,2,\ldots .\)

\(L_{N}(s)(x)=P(P_{N}(s)\le x)\) is the cumulative distribution function of \(P_{N}(s).\)

We have

$$r_{N}(s)=sup_{x\in {R}}|L_{N}(s)(x)-L(s)(x)|\longrightarrow 0,\ \ N\rightarrow \infty .$$

For overlapping samples the subsamples are

\(Y_{b_s,p}(s)=(X_{N,s+pT},X_{N,s+(p+1)T},\ldots ,X_{N,s+(p+b_s-1)T}),\) \(p=0,1,...,N-b_s,\) and the corresponding subsampling statistics are

\(P_{N,b_s,p}(s) = \sqrt{b_s}(\hat{\eta }_{N,b_s,p}(s)-\hat{\eta }_{N}(s))/\hat{\sigma }_{N,b_s,p}(s);\) there are \(N-b_s+1\) of them.

The above statistics are used to approximate the distribution \(L_{N}(s)(x)\) by the empirical distribution function \(L_{N,b_s,p}(s)(x)=\frac{1}{N-b_s+1}\sum _{p=0}^{N-b_s}{I}_{\{P_{N,b_s,p}(s)\le x\}}.\)
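A minimal numerical sketch of such an empirical distribution function (hypothetical helper names; the self-normalized statistic is simplified relative to the paper's definitions) could read:

```python
import numpy as np

def subsampling_edf(X, T, s, b, x_grid):
    """Empirical distribution function built from overlapping
    subsample statistics for season s (0-based) of a T-periodic series."""
    Xs = np.asarray(X)[s::T]
    N = len(Xs)
    eta_hat = Xs.mean()
    stats = []
    for p in range(N - b + 1):            # N - b + 1 overlapping blocks
        block = Xs[p:p + b]
        sigma = block.std(ddof=1)
        if sigma > 0:
            stats.append(np.sqrt(b) * (block.mean() - eta_hat) / sigma)
    stats = np.asarray(stats)
    # indicator average: fraction of subsample statistics <= x
    return np.array([(stats <= x).mean() for x in x_grid])
```

By construction the result is a nondecreasing step function with values in [0, 1], which is the object whose variance is controlled in the proof below.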

For \(A_{N}(s)\) let us define the subsampled statistic:

$$U_{N,b_s,p}(s)(x)=\frac{1}{N-b_s+1}\sum _{p=0}^{N-b_s}\varphi {(\frac{A_{N}(s)- x}{\varepsilon _N})}. $$

The sequence \(\varepsilon _N\) decreases to zero and \(\varphi \) is a non-increasing continuous function such that \(\varphi (x)=1\) for \(x \le 0,\) \(\varphi (x)=0\) for \(x \ge 1,\) and \(\varphi \) is affine between 0 and 1.
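A minimal sketch of one admissible choice of \(\varphi \) (any function meeting the stated conditions works):

```python
def phi(x):
    """Non-increasing continuous weight: 1 for x <= 0, 0 for x >= 1,
    affine (1 - x) in between."""
    return min(1.0, max(0.0, 1.0 - x))
```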

It is enough to investigate the variance of \(U_{N,b_s,p}(s),\ \ s=1,...,T\) (Theorem 11.3.1 in [46]).

$$Var\, U_{N,b_s,p}(s)(x)=(N-b_s+1)^{-2}\sum _{|h|<N-b_s+1}(N-b_s+1-|h|)\gamma (h),$$

where \(\gamma (h)=Cov(\varphi (A_{N,p}(s)),\varphi (A_{N,p+h}(s))).\)

From the assumption of \(\lambda -\)weak dependence,

$$Cov(\varphi (A_{N,p}(s)),\varphi (A_{N,p+h}(s)))\le \sqrt{b_s}\lambda _{h-b_s+1}/\varepsilon _N.$$

For \(\varepsilon _N = (N^{2(1-\beta )b_s})^{1/4}\) the above covariance converges to zero. By a Cesàro mean argument, \(Var\, U_{N,b_s,p}(s)(x)\) also tends to zero.

Define

$$V_{N,b_s,p}(s)(x)=\frac{1}{N-b_s+1}\sum _{p=0}^{N-b_s}\varphi {(\frac{A_{N}(s)P_{N,b_s,p}(s)- x}{\varepsilon _N})}. $$

Under conditions A1 through A4 and by Theorem 3.1 in [50], we have:

$$lim_{N\rightarrow \infty }|E[V_{N,b_s,p}(s)(x)-E[V_{N,b_s,p}(s)(x)]]^{2}|=0.$$

This implies that \(Var(V_{N,b_s,p}(s)(x))\) tends to zero, which proves point 1 of Theorem 4.

To prove point 2 of Theorem 4 we also use Theorem 2 in [14]:

$$lim_{N\rightarrow \infty }sup_{x\in {R}}|U_{N,b_s,p}(s)(x)-L(s)(x)|=0,$$

in probability. If point 1 holds, then under the assumptions of the model (2) the proof of point 3 is the same as the proof of part 3 of Theorem 11.3.1 in [46].   \(\square \)

Proof of Lemma 2

The proof of Lemma 2 follows directly from Lemma 2 in [5] and Theorem 2 in [14].    \(\square \)


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Gajecka-Mirek, E., Leśkow, J. (2020). Subsampling for Heavy Tailed, Nonstationary and Weakly Dependent Time Series. In: Chaari, F., Leskow, J., Zimroz, R., Wyłomańska, A., Dudek, A. (eds) Cyclostationarity: Theory and Methods – IV. CSTA 2017. Applied Condition Monitoring, vol 16. Springer, Cham. https://doi.org/10.1007/978-3-030-22529-2_2


  • DOI: https://doi.org/10.1007/978-3-030-22529-2_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22528-5

  • Online ISBN: 978-3-030-22529-2

  • eBook Packages: Engineering (R0)
