Long-Range Dependence

  • S. N. Lahiri
Part of the Springer Series in Statistics book series (SSS)

Abstract

The models considered so far in this book dealt with the case where the data can be modeled as realizations of a weakly dependent process. In this chapter, we consider a class of random processes that exhibit long-range dependence. The condition of long-range dependence in the data may be described in more than one way (cf. Beran (1994), Hall (1997)). For this book, an operational definition of long-range dependence for a second-order stationary process is that the sum of the (lag) autocovariances of the process diverges. In particular, this implies that the variance of the sample mean based on a sample of size n from a long-range dependent process decays at a rate slower than O(n^{-1}) as n → ∞. As a result, the scaling factor for the centered sample mean under long-range dependence is of smaller order than the usual scaling factor n^{1/2} used in the independent or weakly dependent cases. Furthermore, the limit distribution of the normalized sample mean can be nonnormal. In Section 10.2, we describe the basic framework and review some relevant properties of the sample mean under long-range dependence. In Section 10.3, we investigate properties of the MBB approximation. Here the MBB provides a valid approximation if and only if the limit law of the normalized sample mean is normal. In Section 10.4, we consider properties of the subsampling method under long-range dependence. We show that, unlike the MBB, the subsampling method provides valid approximations to the distributions of normalized and studentized versions of the sample mean for both normal and nonnormal limit cases. In Section 10.5, we report the results from a small simulation study on the finite sample performance of the subsampling method.
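The slower-than-O(n^{-1}) decay of the variance of the sample mean can be checked numerically for fractional Gaussian noise (FGN), a standard example of a long-range dependent stationary Gaussian process when its Hurst index H exceeds 1/2. The sketch below is illustrative only and is not taken from the chapter; the function names are ours. It computes the exact variance of the sample mean from the FGN autocovariances via Var(X̄_n) = n^{-2} Σ_{i,j} γ(i-j) = n^{-2} Σ_k (n-|k|) γ(k), which for FGN equals n^{2H-2}.

```python
def fgn_autocov(k, H):
    """Lag-k autocovariance of unit-variance fractional Gaussian noise
    with Hurst index H.  For H > 1/2 these autocovariances are not
    summable, i.e., the process is long-range dependent."""
    k = abs(k)
    return 0.5 * (abs(k + 1) ** (2 * H) - 2 * k ** (2 * H) + abs(k - 1) ** (2 * H))


def var_sample_mean(n, H):
    """Exact Var(sample mean of X_1, ..., X_n) computed from the
    autocovariances: n^{-2} * sum_{k=-(n-1)}^{n-1} (n - |k|) * gamma(k).
    For FGN this telescopes to n^{2H - 2}."""
    total = sum((n - abs(k)) * fgn_autocov(k, H) for k in range(-(n - 1), n))
    return total / n ** 2


if __name__ == "__main__":
    H = 0.8  # H > 1/2: long-range dependent case
    for n in (100, 400, 1600):
        v = var_sample_mean(n, H)
        # n * Var grows like n^{2H - 1}, so Var decays slower than O(1/n)
        print(n, v, n * v)
```

For H = 1/2 the process reduces to white noise and n·Var(X̄_n) stays constant, while for H > 1/2 it diverges, which is exactly why the usual n^{1/2} scaling fails under long-range dependence.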

Keywords

Coverage Probability · Valid Approximation · Autocovariance Function · Stationary Gaussian Process · Finite Sample Performance
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer Science+Business Media New York 2003

Authors and Affiliations

  • S. N. Lahiri
  1. Department of Statistics, Iowa State University, Ames, USA
