Volume 22, Issue 4, pp 432-444

Evaluating rescaled range analysis for time series

Abstract

Rescaled range analysis is a means of characterizing a time series or a one-dimensional (1-D) spatial signal that provides simultaneously a measure of variance and of the long-term correlation or "memory." The trend-corrected method is based on the statistical self-similarity in the signal: in the standard approach one measures the ratio R/S of the range R of the sum of the deviations from the local mean divided by the standard deviation S from the mean. For fractal signals R/S is a power law function of the length τ of each segment of the set of segments into which the data set has been divided. Over a wide range of τ's the relationship is R/S = kτ^H, where k is a scalar and H is the Hurst exponent. (For a 1-D signal f(t), the exponent H = 2 − D, with D being the fractal dimension.) The method has been tested extensively on fractional Brownian signals of known H to determine its accuracy, bias, and limitations. R/S tends to give biased estimates of H: too low for H > 0.72, and too high for H < 0.72. Hurst analysis without trend correction differs by finding the range R of accumulated differences from the global mean over the total period of data accumulation, rather than from the mean over each segment of length τ. The trend-corrected method gives better estimates of H on Brownian fractal signals of known H when H ≥ 0.5, that is, for signals with positive correlations between neighboring elements. Rescaled range analysis has poor convergence properties, requiring about 2,000 points for 5% accuracy and 200 for 10% accuracy. Empirical corrections to the estimates of H can be made by graphical interpolation to remove bias in the estimates. Hurst's 1951 conclusion that many natural phenomena exhibit not random but correlated time series is strongly affirmed.
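The trend-corrected procedure described in the abstract can be sketched in code: divide the signal into non-overlapping segments of length τ, accumulate each segment's deviations from its local mean, take the range R of that accumulation, divide by the segment's standard deviation S, and estimate H as the slope of log(R/S) versus log(τ). The following is a minimal NumPy sketch of this idea, not the authors' implementation; the function names, the choice of τ values, and the ordinary least-squares fit are all illustrative assumptions.

```python
import numpy as np

def rescaled_range(x, tau):
    """Mean R/S over all non-overlapping segments of length tau.

    For each segment: R is the range (max - min) of the cumulative sum of
    deviations from the segment's own (local) mean -- this is the trend
    correction -- and S is the segment's standard deviation.
    """
    n_segments = len(x) // tau
    ratios = []
    for i in range(n_segments):
        seg = x[i * tau:(i + 1) * tau]
        dev = seg - seg.mean()       # deviations from the local mean
        accum = np.cumsum(dev)       # accumulated deviations
        r = accum.max() - accum.min()  # range R
        s = seg.std()                  # standard deviation S
        if s > 0:
            ratios.append(r / s)
    return float(np.mean(ratios))

def hurst_exponent(x, taus=None):
    """Estimate H as the slope of log(R/S) against log(tau).

    The default tau grid (logarithmically spaced, illustrative choice)
    runs from 8 up to half the series length.
    """
    x = np.asarray(x, dtype=float)
    if taus is None:
        taus = np.unique(
            np.logspace(3, np.log2(len(x) // 2), 10, base=2).astype(int)
        )
    log_tau = [np.log(t) for t in taus]
    log_rs = [np.log(rescaled_range(x, t)) for t in taus]
    slope, _intercept = np.polyfit(log_tau, log_rs, 1)
    return slope  # the Hurst exponent H
```

For an uncorrelated (white-noise) signal the estimate should come out near H = 0.5, though, as the abstract notes, R/S converges slowly and the estimate carries a bias unless many thousands of points are used.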