A Gini-based time series analysis and test for reversibility

A Publisher Correction to this article was published on 12 September 2019

This article has been updated

Abstract

Time reversibility is a fundamental hypothesis in time series. In this paper, Gini-based equivalents of standard time series concepts are developed, enabling the construction of a Gini-based test for time reversibility under first-order moment assumptions only. The key idea is that the Gini measures the relationship between two variables (via Gini autocorrelations and partial autocorrelations) in two directions, which are not necessarily equal. This provides a built-in capability to discriminate between the forward and backward directions of a time series. The difference between the two directional Gini autocorrelations (and partial autocorrelations) may assist in identifying models with underlying heavy-tailed and non-normal innovations. A Gini-based test and Gini-based correlograms, which serve as visual tools for examining departures from the symmetry assumption, are constructed. Simulations illustrate the suggested Gini-based framework and validate the statistical test. An application to a real data set is presented.


Change history

  • 12 September 2019

    Unfortunately, due to a technical error, the articles published in issues 60:2 and 60:3 received incorrect pagination. Please find here the corrected Tables of Contents. We apologize to the authors of the articles and the readers.

References

  1. Andrews B, Calder M, Davis RA (2009) Maximum likelihood estimation for \(\alpha \)-stable autoregressive processes. Ann Stat 37(4):1946–1982

  2. Andrews B, Davis RA (2013) Model identification for infinite variance autoregressive processes. J Econom 172(2):222–234

  3. Box GEP, Jenkins GM, Reinsel GC (1994) Time series analysis: forecasting and control, 3rd edn. Prentice-Hall, Englewood Cliffs

  4. Brockwell PJ, Davis RA (1991) Time series: theory and methods, 2nd edn. Springer, New York

  5. Carcea M (2014) Contributions to time series modeling under first order moment assumptions. PhD Dissertation, University of Texas at Dallas

  6. Carcea M, Serfling R (2015) A Gini autocovariance function for time series modeling. J Time Ser Anal 36(6):817–838

  7. Chen YT, Chou RY, Kuan CM (2000) Testing time reversibility without moment restrictions. J Econom 95(1):199–218

  8. Davis RA, Resnick S (1985) More limit theory for the sample correlation function of moving averages. Stoch Process Their Appl 20(2):257–279

  9. Davis RA, Resnick S (1986) Limit theory for the sample covariance and correlation functions of moving averages. Ann Stat 14:533–558

  10. Feigin PD, Resnick SI (1999) Pitfalls of fitting autoregressive models for heavy-tailed time series. Extremes 1(4):391–422

  11. Franses PH, Van Dijk D (2000) Non-linear time series models in empirical finance. Cambridge University Press, Cambridge

  12. Gini C (1914) On the measurement of concentration and variability of characters. Reprinted in Metron 63(2005):3–38

  13. Kunsch HR (1989) The jackknife and the bootstrap for general stationary observations. Ann Stat 17(3):1217–1241

  14. Lerman RI, Yitzhaki S (1984) A note on the calculation and interpretation of the Gini index. Econ Lett 15(3):363–368

  15. Liu RY, Singh K (1992) Moving blocks jackknife and bootstrap capture weak dependence. In: Lepage R, Billard L (eds) Exploring the limits of bootstrap. Wiley, New York, pp 225–248

  16. Olkin I, Yitzhaki S (1992) Gini regression analysis. Int Stat Rev 60:185–196

  17. Paulauskas V, Rachev ST (2003) Maximum likelihood estimators in regression models with infinite variance innovations. Stat Pap 44(1):47–65

  18. Psaradakis Z (2008) Assessing time-reversibility under minimal assumptions. J Time Ser Anal 29(5):881–905

  19. Ramsey JB, Rothman P (1996) Time irreversibility and business cycle asymmetry. J Money Credit Bank 28(1):1–21

  20. Rubinstein ME (1973) The fundamental theorem of parameter-preference security valuation. J Financ Quant Anal 8:61–69

  21. Schechtman E, Yitzhaki S (1987) A measure of association based on Gini mean difference. Commun Stat Theory Methods 16(1):207–231

  22. Schechtman E, Yitzhaki S (1999) On the proper bounds of the Gini correlation. Econ Lett 63(2):133–138

  23. Schechtman E, Yitzhaki S, Artsev Y (2008) Who does not respond in the household expenditure survey: an exercise in extended Gini regressions. J Bus Econ Stat 26(3):329–344

  24. Serfling R (2010) Fitting autoregressive models via Yule–Walker equations allowing heavy tail innovations (preprint)

  25. Serfling R, Xiao P (2007) A contribution to multivariate L-moments: L-comoment matrices. J Multivar Anal 98(9):1765–1781

  26. Shelef A (2013) Statistical analyses based on Gini for time series data. PhD Dissertation, Ben-Gurion University of the Negev

  27. Shelef A (2014) A Gini-based unit root test. Comput Stat Data Anal. doi:10.1016/j.csda.2014.08.012

  28. Shelef A, Schechtman E (2011) A Gini-based methodology for identifying and analyzing time series with non-normal innovations (preprint)

  29. Stuart A, Ord JK (1987) Kendall's advanced theory of statistics, 5th edn, vol 1. Oxford University Press, New York

  30. Wei WWS (1993) Time series analysis. Addison-Wesley, New York

  31. Weiss G (1975) Time-reversibility of linear stochastic processes. J Appl Probab 12:831–836

  32. Wodon Q, Yitzhaki S (2006) Convergence forward and backward? Econ Lett 92(1):47–51

  33. Yitzhaki S (2003) Gini's mean difference: a superior measure of variability for non-normal distributions. Metron 61(2):285–316

  34. Yitzhaki S, Schechtman E (2013) The Gini methodology. Springer, New York

Acknowledgements

The authors thank Shlomo Yitzhaki, Robert Serfling, Yisrael Parmet, Jeff Hart and Gideon Schechtman for their helpful comments on a previous version of this paper. The authors would also like to thank the anonymous reviewers for their thorough review and highly appreciate their comments and suggestions, which significantly contributed to improving the quality of the paper.

Corresponding author

Corresponding author

Correspondence to Amit Shelef.

Appendices

Appendix 1: The asymptotic distribution of the Gini-ACF estimators

Let

$$\begin{aligned} \hat{{\gamma }}_{(s=0)}^{G} =\mathrm{cov}\left( Y_{t} ,\,R\left( Y_{t} \right) \right) ={\sum _{t=1}^T {\left( Y_{t} -\bar{{Y}}\right) \left( R\left( Y_{t} \right) -\bar{{R}}\left( Y_{1{\text {:}}\,T} \right) \right) } }/{(T-1)}, \end{aligned}$$
(25)

be an estimator of Gini autocovariance of lag 0. This estimator can also be presented as

$$\begin{aligned} \hat{{\gamma }}_{(s=0)}^{G} =\frac{2\sum \nolimits _{t=1}^T {(2t-T-1)Y_{( {t{\text {:}}\,T} )} } }{T(T-1)}, \end{aligned}$$
(26)

where \(Y_{( {t{\text {:}}\,T} )} ,\,t=1,\ldots ,T\) are the ordered (i.e., sorted by size) observations so that \(Y_{(1{\text {:}}\,T)} \le Y_{(2{\text {:}}\,T)} \cdots \le Y_{(T{\text {:}}\,T)} \) (Schechtman and Yitzhaki 1987). Another possibility is to use

$$\begin{aligned} \hat{{\gamma }}^{{\prime }G}_{(s=0)} =\frac{2\sum \nolimits _{t=1}^T {(2t-T)Y_{( {t{\text {:}}\,T} )} } }{T^{2}}\, (\mathrm{Carcea\,and\,Serfling}\,2015). \end{aligned}$$
(27)

Carcea (2014) shows that the limiting distribution of the estimator of (27) is normal. We use this result to obtain the limiting distribution of the estimator in (26). In order to do so, we show that the difference between the two estimators [Eqs. (26) and (27)] converges to 0 as \(T\rightarrow \infty . \)

$$\begin{aligned} diff= & {} \hat{{\gamma }}_{(s=0)}^{G} -\hat{{\gamma }}^{\prime {G}}_{(s=0)} =\frac{2\sum \nolimits _{t=1}^T {(2t-T-1)Y_{( {t{\text {:}}\,T})} } }{T(T-1)}-\frac{{2\sum \nolimits _{t = 1}^T {(2t - T){Y_{(t{\text {:}}\,T)}}} }}{{{T^2}}}\\= & {} \frac{2T\sum \nolimits _{t=1}^T {(2t-T-1)Y_{( {t{\text {:}}\,T} )} -2(T-1)\sum \nolimits _{t=1}^T {(2t-T)Y_{( {t{\text {:}}\,T})} } } }{T^{2}(T-1)} \\= & {} \frac{-2T\sum \nolimits _{t=1}^T {Y_{( {t{\text {:}}\,T})} } +2\sum \nolimits _{t=1}^T {(2t-T)Y_{( {t{\text {:}}\,T} )} } }{T^{2}(T-1)}\\= & {} \frac{-4T\sum \nolimits _{t=1}^T {Y_{( {t{\text {:}}\,T} )} } +4\sum \nolimits _{t=1}^T {(t)Y_{( {t{\text {:}}\,T})} } }{T^{2}(T-1)} =\frac{4\sum \nolimits _{t=1}^T {Y_{( {t{\text {:}}\,T} )} (t-T)} }{T^{2}(T-1)}. \end{aligned}$$

By the triangle inequality, \(|diff|\le \frac{4\sum \nolimits _{t=1}^T {| {Y_{( {t{\text {:}}\,T})} }|| {t-T}|} }{T^{2}(T-1)}.\)

Note that \(Y_{t} \) is a bounded sequence. Replace \(| {Y_{( {t{\text {:}}\,T})} }|\) by \(\mathop {\max }\nolimits _{1\le t\le T} | {Y_{( {t{\text {:}}\,T} )} }|\) to get that

$$\begin{aligned} |{ diff}|\le & {} \frac{{4\mathop {\max }\nolimits _{1 \le t \le T} | {{Y_{( {t{\text {:}}\,T})}}} |\sum \nolimits _{t = 1}^T {| {t - T} |} }}{{T^2(T - 1)}} = \frac{{4\mathop {\max }\nolimits _{1 \le t \le T}| {{Y_{( {t{\text {:}}\,T} )}}} |\sum \nolimits _{t = 1}^T {(T - t)} }}{{T^2(T - 1)}}\\= & {} \frac{{4\mathop {\max }\nolimits _{1 \le t \le T} | {{Y_{( {t{\text {:}}\,T})}}}|\frac{{T(T - 1)}}{2}}}{{T^2(T - 1)}} = \frac{{2\mathop {\max }\nolimits _{1 \le t \le T} | {{Y_{( {t{\text {:}}\,T} )}}} |}}{T}. \end{aligned}$$

Therefore, the difference between the two estimators converges to 0 as \(T\rightarrow \infty .\)
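This convergence argument can be checked numerically. The following sketch (our own; the function names are hypothetical) implements Eqs. (26) and (27) and verifies the \(2\max |Y|/T\) bound derived above on bounded simulated data:

```python
import numpy as np

def gamma0(y):
    """Eq. (26): lag-0 Gini autocovariance via order statistics, divisor T(T-1)."""
    T = len(y)
    t = np.arange(1, T + 1)
    return 2.0 * np.sum((2 * t - T - 1) * np.sort(y)) / (T * (T - 1))

def gamma0_cs(y):
    """Eq. (27) (Carcea and Serfling 2015): weights shifted by one, divisor T^2."""
    T = len(y)
    t = np.arange(1, T + 1)
    return 2.0 * np.sum((2 * t - T) * np.sort(y)) / T**2

rng = np.random.default_rng(0)
for T in (50, 500, 5000):
    y = rng.uniform(-1.0, 1.0, size=T)       # bounded, as the proof assumes
    diff = gamma0(y) - gamma0_cs(y)
    bound = 2.0 * np.max(np.abs(y)) / T      # the bound derived above
    assert abs(diff) <= bound + 1e-12
    print(T, abs(diff))                      # shrinks roughly like 1/T
```

The inequality holds exactly for every sample, not only in the limit, so the assertion never fails for bounded data.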

More generally, let

$$\begin{aligned} \hat{{\gamma }}_{(s)}^{G_{1} }= & {} \mathrm{cov}\left( Y_{t} ,\,R\left( Y_{t-s} \right) \right) =\sum _{t=1}^{T-s} \left( Y_{t+s} -\bar{{Y}}_{(s+1){\text {:}}\,T}\right) \nonumber \\&\left( R\left( Y_{t} \right) -\bar{{R}}\left( Y_{1{\text {:}}\,(T-s)}\right) \right) /{(T-s-1}), \end{aligned}$$
(28)

be an estimator of the Gini autocovariance of lag s where \(\bar{{Y}}_{(s+1){\text {:}}\,T} ={\sum \nolimits _{t=s+1}^T {Y_{t} } }/{( {T-s})}\) (mean of the last \(T-s\) observations). This estimator can also be presented as

$$\begin{aligned} \hat{{\gamma }}_{(s)}^{G_{1} } =\frac{2\sum \nolimits _{t=1}^{T-s} {(2t-(T-s)-1)Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)(T-s-1)}, \end{aligned}$$
(29)

where \(Y_{(t{\text {:}}\,T-s),1,2} \) is the first component’s value that is concomitant to the tth ordered (i.e., sorted by size) second component’s value, relative to the bivariate pairs of observations in \(S_{1} =\{ {( {Y_{s+1} ,\,Y_{1} }),\,( {Y_{s+2} ,\,Y_{2} } ),\ldots ,( {Y_{T} ,\,Y_{T-s} })} \}.\) In other words, the observations in \(S_{1} \) are ordered by the second component. Another possibility is to use

$$\begin{aligned} \hat{{\gamma }}^{\prime {G_{1}}}_{(s)}=\frac{2\sum \nolimits _{t=1}^{T-s} {(2t-(T-s))Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)^2}\, (\mathrm{Carcea}\,2014). \end{aligned}$$
(30)

Next, we show that the difference between the two estimators [Eqs. (29) and (30)] converges to 0 as \(T\rightarrow \infty .\)

$$\begin{aligned}&{ diff}=\hat{{\gamma }}_{(s)}^{G_{1} } -\hat{{\gamma }}^{\prime {G_{1}}}_{(s)}\\&\quad =\frac{2\sum \nolimits _{t=1}^{T-s} {(2t-(T-s)-1)Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)(T-s-1)}-\frac{2\sum \nolimits _{t=1}^{T-s} {(2t-(T-s))Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)^{2}} \\&\quad =\frac{2(T-s)\sum \nolimits _{t=1}^{T-s} {(2t-(T-s)-1)Y_{(t{\text {:}}\,T-s),1,2} } -2(T-s-1)\sum \nolimits _{t=1}^{T-s} {(2t-(T-s))Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)^{2}(T-s-1)} \\&\quad =\frac{4\sum \nolimits _{t=1}^{T-s} {tY_{(t{\text {:}}\,T-s),1,2} } -4(T-s)\sum \nolimits _{t=1}^{T-s} {Y_{(t{\text {:}}\,T-s),1,2} } }{(T-s)^{2}(T-s-1)}=\frac{4\sum \nolimits _{t=1}^{T-s} {Y_{(t{\text {:}}\,T-s),1,2} (t-(T-s))} }{(T-s)^{2}(T-s-1)}. \end{aligned}$$

By the triangle inequality, \(|{ diff}|\le \frac{4\sum \nolimits _{t=1}^{T-s} {| {Y_{(t{\text {:}}\,T-s),1,2} } || {t-(T-s)}|} }{(T-s)^{2}(T-s-1)}.\) Similarly to the procedure we used for the difference between the two estimates of the Gini autocovariance of lag 0 here we get that \(|{ diff}| \le \frac{{2\mathop {\max }\nolimits _{1 \le t \le T} | {{Y_{(t{\text {:}}\,T - s),1,2}}} |}}{{T - s}}.\) Therefore, the difference between the two estimators converges to 0 as \(T\rightarrow \infty .\)
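The concomitant construction and the lag-\(s\) bound can also be verified numerically. In this sketch (our own, hypothetical function names; ties broken by original order), the pairs in \(S_1\) are sorted by the lagged component and the bound \(2\max |Y|/(T-s)\) is checked:

```python
import numpy as np

def concomitants(y, s):
    """Concomitants Y_{(t:T-s),1,2}: pairs (Y_{t+s}, Y_t) sorted by the second
    (lagged) component; returns the first components in that order."""
    lead, lag = y[s:], y[:len(y) - s]
    return lead[np.argsort(lag, kind="stable")]

def gamma_s(y, s):
    """Eq. (29): lag-s Gini autocovariance, divisor (T-s)(T-s-1)."""
    n = len(y) - s
    t = np.arange(1, n + 1)
    return 2.0 * np.sum((2 * t - n - 1) * concomitants(y, s)) / (n * (n - 1))

def gamma_s_carcea(y, s):
    """Eq. (30) (Carcea 2014): weights shifted by one, divisor (T-s)^2."""
    n = len(y) - s
    t = np.arange(1, n + 1)
    return 2.0 * np.sum((2 * t - n) * concomitants(y, s)) / n**2

rng = np.random.default_rng(2)
for T in (100, 1000, 10000):
    y = rng.uniform(-1.0, 1.0, size=T)
    diff = gamma_s(y, 2) - gamma_s_carcea(y, 2)
    assert abs(diff) <= 2.0 * np.max(np.abs(y)) / (T - 2) + 1e-12
```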

Carcea (2014) shows that the estimators in Eqs. (27) and (30) are asymptotically normal. In addition, as shown above, the differences between these estimators and the estimators in Eqs. (25) and (28) (\(\hat{{\gamma }}_{(s=0)}^{G} \) and \(\hat{{\gamma }}_{(s)}^{G_{1} } )\) converge to 0 as \(T\rightarrow \infty . \) Therefore, \(\hat{{\gamma }}_{(s=0)}^{G} \) and \(\hat{{\gamma }}_{(s)}^{G_{1} } \) are also asymptotically normal.

Following the proof of Theorem 7.2.1 in Brockwell and Davis (1991) and replacing each autocovariance estimator by the relevant Gini autocovariance estimator, we get that the Gini autocorrelation estimator, \(\frac{\hat{{\gamma }}_{(s)}^{G_{1} } }{\hat{{\gamma }}_{(s=0)}^{G} }\) [Eq. (28) divided by Eq. (25)] is asymptotically normal. Specifically, let \(g(\cdot )\) be the function from \(\mathfrak {R}^{s+1}\) into \(\mathfrak {R}^{s}\) defined by

$$\begin{aligned} g\left( \left[ x_{0} ,\,x_{1} ,\ldots ,x_{s}\right] ^{\prime }\right) =\left[ x_{1} /x_{0} ,\ldots ,x_{s} /x_{0}\right] ^{\prime },\quad x_{0} \ne 0. \end{aligned}$$

Then by Proposition 6.4.3 in Brockwell and Davis (1991) and because \(\hat{{\gamma }}_{(s=0)}^{G} \) and \(\hat{{\gamma }}_{(s)}^{G_{1} } \) are asymptotically normal, \(\frac{\hat{{\gamma }}_{(s)}^{G_{1} } }{\hat{{\gamma }}_{(s=0)}^{G} }=g([\hat{{\gamma }}_{(s=0)}^{G} ,\,\hat{{\gamma }}_{(1)}^{G_{1} }, \ldots ,\hat{{\gamma }}_{(s)}^{G_{1} } ]^{\prime })\) is asymptotically normal.

As a result, the first Gini-ACF estimator presented in Eq. (8) can be expressed as \(\hat{{\rho }}_{(s)}^{G_{1} } =\frac{\hat{{\gamma }}_{(s)}^{G_{1} } }{\hat{{\gamma }}_{(s=0)}^{G} }\left( {\frac{T-s-1}{T-1}} \right) ,\) where \(\frac{T-s-1}{T-1}\mathop {\rightarrow }\limits ^{T\rightarrow \infty }1.\) Therefore, the first Gini-ACF estimator [Eq. (8)] is also asymptotically normal. The asymptotic normality of the second Gini-ACF [Eq. (9)] can be obtained in a similar manner.
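Putting the pieces together, the forward and backward Gini-ACF estimators can be sketched as follows. This is a hypothetical implementation, not the authors' code: we assume the backward version is obtained by simply exchanging the roles of the two components in the concomitant construction.

```python
import numpy as np

def gini_acf(y, s, forward=True):
    """Sample Gini-ACF at lag s: the lag-s Gini autocovariance (Eq. (29),
    concomitant form) over the lag-0 one (Eq. (26)), times (T-s-1)/(T-1).
    forward=False exchanges the roles of the two components."""
    T = len(y)
    lead, lag = y[s:], y[:T - s]
    if not forward:
        lead, lag = lag, lead
    n = T - s
    t = np.arange(1, n + 1)
    conc = lead[np.argsort(lag, kind="stable")]   # concomitants
    num = 2.0 * np.sum((2 * t - n - 1) * conc) / (n * (n - 1))
    tt = np.arange(1, T + 1)
    den = 2.0 * np.sum((2 * tt - T - 1) * np.sort(y)) / (T * (T - 1))
    return (num / den) * (T - s - 1) / (T - 1)

# For an AR(1) with symmetric (Normal) innovations, the two directions
# should agree up to sampling noise:
rng = np.random.default_rng(1)
e = rng.standard_normal(2000)
y = np.empty_like(e)
y[0] = e[0]
for i in range(1, len(e)):
    y[i] = 0.6 * y[i - 1] + e[i]
print(gini_acf(y, 1), gini_acf(y, 1, forward=False))
```

With heavy-tailed innovations (e.g., Log-normal or \(\alpha \)-stable), the two directional estimates would tend to diverge, which is the asymmetry the test exploits.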

Appendix 2: Testing procedure for the equality of Gini regression coefficients in AR(1) models

The procedure for testing the hypothesis \(H_{02}{\text {:}}\,\phi _{1}^{G_{1} } =\phi _{1}^{G_{2} } \) (i.e., \(\theta _{\phi _{1} } =\phi _{1}^{G_{1} } -\phi _{1}^{G_{2} } =0\)) versus a two-sided alternative is as follows:

Step 1 estimate the models' coefficients and the difference between them.

Use the Gini regression to estimate the models’ coefficients, \(\hat{{\phi }}_{1}^{G_{1} } =\frac{\mathrm{cov}(Y_{t} ,\,R(Y_{t-1} ))}{\mathrm{cov}(Y_{t-1} ,\,R(Y_{t-1} ))}\) and \(\hat{{\phi }}_{1}^{G_{2} } =\frac{\mathrm{cov}(Y_{t-1} ,\,R(Y_{t} ))}{\mathrm{cov}(Y_{t} ,\,R(Y_{t} ))}\) [Eqs. (16) and (17)]. The estimator of

$$\begin{aligned} \theta _{\phi _{1} } =\phi _{1}^{G_{1} } -\phi _{1}^{G_{2} }\,\mathrm{is}\,\hat{{\theta }}_{\phi _{1} } =\hat{{\phi }}_{1}^{G_{1} } -\hat{{\phi }}_{1}^{G_{2} } . \end{aligned}$$
(31)

Step 2 perform the MBB resampling procedure on the original series, as in the case of testing for the equality of Gini-ACFs (step 2 in Sect. 4). Denote the bootstrapped series as \(Y_{t}^{*}. \) The bootstrap difference estimator is

$$\begin{aligned} \hat{{\theta }}_{\phi _{1}}^{*} =\hat{{\phi }}_{1}^{{G_{1}}^{*}} -\hat{{\phi }}_{1}^{{G_{2}}^{*}} , \end{aligned}$$
(32)

where \(\hat{{\phi }}_{1}^{{G_{1}}^{*}} \) and \(\hat{{\phi }}_{1}^{{G_{2}}^{*}} \) are the Gini regression estimators calculated on the bootstrapped series.

Step 3 calculate the critical values and perform the test.

Repeat step 2 M times. The procedure for calculating the critical values and performing the test is as in steps 3–4 in Sect. 4, using the difference estimators given in Eq. (31) and the bootstrap difference estimators given in Eq. (32). Note that the critical value is the percentile, defined as the value of the bootstrap statistic such that \((1-\alpha )\cdot 100\% \) of the M calculated statistics are smaller than or equal to it.
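The three steps can be sketched as follows. This is a minimal illustration under our own assumptions (the recentred two-sided statistic and all function names are ours, not the authors'):

```python
import numpy as np

def rank(x):
    # ranks 1..n (ties broken by position), via double argsort
    return np.argsort(np.argsort(x)) + 1.0

def gini_ar1_coefs(y):
    """Step 1: forward and backward Gini AR(1) regression coefficients,
    phi1_G1 = cov(Y_t, R(Y_{t-1})) / cov(Y_{t-1}, R(Y_{t-1})), and phi1_G2
    with the roles of Y_t and Y_{t-1} exchanged [Eqs. (16)-(17)]."""
    y1, y0 = y[1:], y[:-1]                   # pairs (Y_t, Y_{t-1})
    c = lambda a, b: np.cov(a, b)[0, 1]
    return c(y1, rank(y0)) / c(y0, rank(y0)), c(y0, rank(y1)) / c(y1, rank(y1))

def mbb_resample(y, block, rng):
    """Step 2: one moving-block-bootstrap replicate -- concatenate randomly
    chosen overlapping blocks and truncate to the original length."""
    T = len(y)
    starts = rng.integers(0, T - block + 1, size=int(np.ceil(T / block)))
    return np.concatenate([y[s:s + block] for s in starts])[:T]

def reversibility_test(y, block=10, M=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    g1, g2 = gini_ar1_coefs(y)
    theta = g1 - g2                          # Eq. (31)
    boot = np.array([np.subtract(*gini_ar1_coefs(mbb_resample(y, block, rng)))
                     for _ in range(M)])     # Eq. (32), M replicates
    # Step 3 (two-sided): reject when |theta| exceeds the (1 - alpha)
    # percentile of the recentred bootstrap statistics |theta* - theta|
    crit = np.quantile(np.abs(boot - theta), 1 - alpha)
    return theta, crit, abs(theta) > crit
```

In practice the block length should grow with the range of serial dependence in the series; the simulations in Appendix 3 report results for several block sizes.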

Appendix 3: Simulation results for the time reversibility hypotheses of equal Gini-ACFs and Gini-PACFs

Tables 4, 5 and 6 present the rejection probabilities of the hypothesis of time reversibility using the Gini-based test in:

Table 4: AR(1) models, with \(T=200\) and 400, Normal and Log-normal innovations.

Table 5: AR(1) models, with \(T=200\) and 400, for selected block sizes, with \({\alpha }\)-stable and Pareto innovations.

Table 6: (a) MA(1) model, (b) ARMA(1, 1) model and (c) ARMA(3, 3) model, with \(T=200\) and 400, for selected block sizes, with Normal, Log-normal, \({\alpha }\)-stable and Pareto innovations.



Cite this article

Shelef, A., Schechtman, E. A Gini-based time series analysis and test for reversibility. Stat Papers 60, 687–716 (2019). https://doi.org/10.1007/s00362-016-0845-9

Keywords

  • Autocorrelation
  • Autoregression
  • Gini correlation
  • Gini regression
  • Moving block bootstrap
  • Time reversibility

Mathematics Subject Classification

  • 62-07
  • 62G08