Resampling time series using missing values techniques

Abstract

Several techniques for resampling dependent data have already been proposed. In this paper we use missing values techniques to modify the moving blocks jackknife and bootstrap. More specifically, we treat the blocks of deleted observations in the blockwise jackknife as missing data, which are recovered by missing values estimates that incorporate the dependence structure of the observations. Thus, we estimate the variance of a statistic as a weighted sample variance of the statistic evaluated on a “complete” series. Consistency of the variance and distribution estimators of the sample mean is established. We also apply the missing values approach to the blockwise bootstrap by including some missing observations between two consecutive blocks, and we demonstrate the consistency of the variance and distribution estimators of the sample mean. Finally, we present the results of an extensive Monte Carlo study evaluating the performance of these methods for finite sample sizes, which shows that our proposal provides variance estimates for several time series statistics with smaller mean squared error than previous procedures.
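
The variance estimator outlined in the abstract can be illustrated with a short sketch: each moving block is deleted, treated as missing, reconstructed by an interpolation that uses the estimated dependence structure, and the statistic is recomputed on the completed series. The Python sketch below is an illustration under explicit assumptions, not the paper's algorithm: the imputation uses the conditional mean under a Gaussian model with truncated sample autocovariances, the function names (autocovariances, conditional_mean_impute, mbj_mv_variance) are ours, and the final scaling factor is the ordinary delete-block jackknife weight rather than the weights derived in the paper.

```python
import numpy as np


def autocovariances(x, max_lag):
    # Biased sample autocovariances gamma_hat(h), h = 0, ..., max_lag.
    n = len(x)
    xc = x - x.mean()
    return np.array([xc[:n - h] @ xc[h:] for h in range(max_lag + 1)]) / n


def conditional_mean_impute(x, miss_idx, gamma):
    # Replace x[miss_idx] by the best linear predictor of the missing block
    # given the observed values: mu + Sigma_mo Sigma_oo^{-1} (x_obs - mu),
    # with a Toeplitz covariance built from the estimated autocovariances
    # (set to zero beyond max_lag; a small ridge keeps the solve stable).
    n = len(x)
    obs_idx = np.setdiff1d(np.arange(n), miss_idx)
    acov = np.zeros(n)
    acov[:len(gamma)] = gamma
    sigma = acov[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]
    mu = x[obs_idx].mean()
    s_oo = sigma[np.ix_(obs_idx, obs_idx)] + 1e-8 * np.eye(len(obs_idx))
    s_mo = sigma[np.ix_(miss_idx, obs_idx)]
    x_hat = x.copy()
    x_hat[miss_idx] = mu + s_mo @ np.linalg.solve(s_oo, x[obs_idx] - mu)
    return x_hat


def mbj_mv_variance(x, block_len, statistic=np.mean, max_lag=20):
    # Missing-values moving-blocks jackknife (sketch): every block of length
    # block_len is treated as missing, reconstructed, and the statistic is
    # recomputed on the completed series.  The scaling below is the ordinary
    # delete-block jackknife weight (n - l)^2 / (n l), used here only as a
    # stand-in for the weights derived in the paper.
    x = np.asarray(x, dtype=float)
    n, l = len(x), block_len
    gamma = autocovariances(x, max_lag)
    reps = np.array([
        statistic(conditional_mean_impute(x, np.arange(s, s + l), gamma))
        for s in range(n - l + 1)
    ])
    return (n - l) ** 2 / (n * l) * reps.var()


# Example: variance of the sample mean of a simulated AR(1) series.
rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()
print(mbj_mv_variance(y, block_len=10))
```

The same imputation step could be reused for the bootstrap variant described above, in which a few positions between consecutive resampled blocks are treated as missing and filled in before the statistic is computed; the weighting and block-selection details are given in the paper.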

Cite this article

Alonso, A.M., Peña, D. & Romo, J. Resampling time series using missing values techniques. Ann Inst Stat Math 55, 765–796 (2003). https://doi.org/10.1007/BF02523392
