
Estimation of the regression slope by means of Gini’s cograduation index

Published in: Decisions in Economics and Finance

Abstract

The simple linear model \(Y_i = \alpha + \beta \, x_i + \epsilon _i\) \((i=1,2, \ldots ,N \ge 2)\) is considered, where the \(x_i\)’s are given constants and \(\epsilon _1, \epsilon _2 , \ldots , \epsilon _N\) are independent identically distributed (iid) with continuous distribution function F. An estimator \(\tilde{\beta }\) of the slope parameter is proposed, based on a stochastic process which makes use of Gini’s cograduation index. The properties of \(\tilde{\beta }\) and of the related confidence interval are studied. Some comparisons are given, in terms of asymptotic relative efficiency, with other estimators of \(\beta \) including that obtained with the method of least squares.


Notes

  1. As later shown, \(G(\underline{y};b)\) is actually a non-increasing step function; hence, the stated condition does not imply that \(\tilde{\beta }\) is a root of the equation \(G(\underline{y};b)=0,\) which might not admit any root.
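The behaviour described in this note can be illustrated numerically. The sketch below is a minimal illustration, not the paper's exact procedure (the index \(G(\underline{y};b)\) is defined in eq. (2), which is not reproduced in this excerpt): it assumes the classical form of Gini's cograduation index between the ranks of the residuals \(y_i - b\,x_i\) and the ranks of the \(x_i\)'s, with denominator \(N^2/2\) (\(N\) even) or \((N^2-1)/2\) (\(N\) odd), and locates the slope where the non-increasing step function changes sign by a grid scan.

```python
import numpy as np

def gini_cograduation(r, s):
    """Gini's cograduation index between two 1-based rank vectors r, s.

    The denominator d is the maximum attainable value of the numerator,
    reached under perfect cograduation (r = s)."""
    n = len(r)
    d = n * n // 2 if n % 2 == 0 else (n * n - 1) // 2
    return float(np.sum(np.abs(n + 1 - r - s) - np.abs(r - s))) / d

def ranks(v):
    """Ranks 1..n of a vector with distinct entries."""
    return np.argsort(np.argsort(v)) + 1

def slope_estimate(x, y, grid):
    """Hypothetical grid-scan sketch of the slope estimator: G(y - b*x ; x)
    is a non-increasing step function of b, so return the midpoint of the
    sign-change interval (illustration only, not the paper's exact rule)."""
    rx = ranks(x)
    g = np.array([gini_cograduation(ranks(y - b * x), rx) for b in grid])
    lo = grid[g > 0].max() if np.any(g > 0) else grid[0]
    hi = grid[g < 0].min() if np.any(g < 0) else grid[-1]
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
x = np.arange(1.0, 31.0)                  # fixed design x_1 < ... < x_N
y = 1.0 + 2.0 * x + rng.normal(size=30)   # true slope beta = 2
beta_tilde = slope_estimate(x, y, np.linspace(1.0, 3.0, 401))
print(beta_tilde)                          # close to 2
```

For \(b\) far below the true slope the residuals increase with \(x\), so the index is near \(+1\); far above, it is near \(-1\); the estimate is taken where the step function crosses zero, which is exactly why no exact root need exist.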

References

  • Adichie, J.N.: Estimators of regression parameters based on rank tests. Ann. Math. Stat. 38, 894–904 (1967)

  • Beran, R.J.: On distribution-free statistical inference with upper and lower probabilities. Ann. Math. Stat. 42, 157–158 (1971)

  • Cifarelli, D.M., Regazzini, E.: On a distribution-free test of independence based on Gini’s rank association coefficient. In: Barra, J.R., Brodeau, F., Romier, G., Van Cutsem, B. (eds.) Recent Developments in Statistics. Proceedings of the European Meeting of Statisticians (Grenoble, 6–11 September 1976), pp. 375–385. North-Holland, Amsterdam (1977)

  • Eicker, F.: Asymptotic normality and consistency of least squares estimators for families of linear regressions. Ann. Math. Stat. 34, 447–456 (1963)

  • Gini, C.: Sul criterio di concordanza tra due caratteri. Atti del Reale Istituto Veneto di Scienze, Lettere ed Arti 309–331 (1915–1916)

  • Hájek, J.: Asymptotically most powerful rank-order tests. Ann. Math. Stat. 33, 1124–1147 (1962)

  • Hájek, J., Šidák, Z.: Theory of Rank Tests. Academic Press, London (1967)

  • Herzel, A.: Sulla distribuzione campionaria dell’indice di cograduazione del Gini. Metron 30, 137–153 (1972)

  • Hodges, J.L., Lehmann, E.L.: Estimates of location based on rank tests. Ann. Math. Stat. 34, 598–611 (1963)

  • Hoeffding, W.: A non parametric test of independence. Ann. Math. Stat. 19, 546–557 (1948)

  • Hogg, R.V., Randles, R.H.: Adaptive distribution free regression methods and their applications. Technometrics 17, 399–407 (1975)

  • Koul, H.L.: Asymptotic behavior of Wilcoxon type confidence regions in multiple linear regression. Ann. Math. Stat. 40, 1950–1979 (1969)

  • Michetti, B., Dall’Aglio, G.: La differenza semplice media. Statistica 17, 159–255 (1957)

  • Mood, A.M.: An Introduction to the Theory of Statistics. McGraw-Hill, New York (1950)

  • Muliere, P.: Una nota intorno al coefficiente di correlazione tra l’indice G di cograduazione di Gini e l’indice \(\tau \) di Kendall. Giornale degli Economisti e Annali di Economia 9/10, 627–633 (1976)

  • Sen, P.K.: Estimates of regression coefficient based on Kendall’s tau. J. Am. Stat. Assoc. 63, 1379–1389 (1968)

  • Theil, H.: A rank-invariant method of linear and polynomial regression analysis, I, II, III. Nederl. Akad. Wetensch. Proc. 53, 386–392, 521–525, 1397–1412 (1950)

Author information

Corresponding author

Correspondence to D. Michele Cifarelli.

Additional information

Translation from Italian of the paper: Cifarelli, D.M. (1978). “La stima del coefficiente di regressione mediante l’indice di cograduazione di Gini”, Rivista di matematica per le scienze economiche e sociali (now: Decisions in economics and finance), 1, 7–38.

Appendix

To prove Theorem 2 of Sect. 7, some preliminary results will be considered. Let f be a probability density function with support in \(\mathfrak {R}\) and define the two probability measures

$$\begin{aligned} P_N(A) = \int _A \, \prod _{1 \le i \le N} f(y_i) \, \hbox {d}y_i \quad \text{ and } \quad Q_N(A) = \int _A \, \prod _{1 \le i \le N} f \left( y_i + \frac{b}{T} (x_i-\bar{x}) \right) \, \hbox {d}y_i \end{aligned}$$

where \(x_1< x_2< \cdots < x_N\) as usual, \(b \ne 0\) is finite, A is any event and

$$\begin{aligned} T^2 = \sum _{1 \le i \le N} (x_i - \bar{x})^2 \qquad M = \max _{1 \le i \le N} (x_i - \bar{x})^2 . \end{aligned}$$

Lemma 1

[Hájek and Šidák (1967, p. 208); Hájek (1962, p. 1134)] If the vector

$$\begin{aligned} \left( \sqrt{N} \, G(\underline{Y};0) \, , \, \log \frac{\displaystyle {\prod _{1 \le i \le N}} \, f \left( Y_i + \frac{b}{T} (x_i - \bar{x}) \right) }{\displaystyle {\prod _{1 \le i \le N}} f(Y_i)} \right) \end{aligned}$$

converges in distribution, with the measure \(P_N\), to the bivariate Gaussian distribution with parameters

$$\begin{aligned} \left( \mu \, , \, - \frac{1}{2} \sigma _2^2 \, , \, \sigma _1^2 \, , \, \sigma _2^2 \, , \, \sigma _{12} \right) \end{aligned}$$

then the variable

$$\begin{aligned} \sqrt{N} \, G(\underline{Y};0) \end{aligned}$$

converges, with the measure \(Q_N\), to the Gaussian distribution with mean \(\mu + \sigma _{12}\) and variance \(\sigma _1^2.\)

Lemma 2

[Hájek and Šidák (1967, p. 213); Hájek (1962, p. 1136)]. If \(\displaystyle I(f) = \int \left( \frac{f'}{f} \right) ^2 f < \infty \) and if \(\displaystyle {\frac{T^2}{M} \rightarrow \infty }\) (Noether's condition; it holds, e.g., for \(x_i = i,\) since then \(T^2/M \approx N/3\)), then

$$\begin{aligned}&{P_N\lim }_{N \rightarrow \infty } \Bigg ( \log \frac{\displaystyle {\prod _{1 \le i \le N}} f \left( Y_i + \frac{b}{T} (x_i-\bar{x}) \right) }{\displaystyle {\prod _{1 \le i \le N}} \, f(Y_i)} - \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \frac{f'(Y_i)}{f(Y_i)} \\&\quad + \frac{b^2}{2} \, I(f) \Bigg ) =0 \end{aligned}$$

where \(P_N \lim \) denotes the limit in \(P_N\)-probability.

Remark to Lemma 2

According to the measure \(P_N,\) the variable

$$\begin{aligned} \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \, \frac{f'(Y_i)}{f(Y_i)} \end{aligned}$$

has variance

$$\begin{aligned} \text{ Var } \left( \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \, \frac{f'(Y_i)}{f(Y_i)} \right) = b^2 \, I(f) < \infty \end{aligned}$$

and expectation (Hájek 1962, p. 1125)

$$\begin{aligned} \text{ E } \left( \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \, \frac{f'(Y_i)}{f(Y_i)} \right) = 0 . \end{aligned}$$

Moreover, the variable

$$\begin{aligned} \sum _{1 \le i \le N} Z_{2i} = \sum _{1 \le i \le N} \frac{\displaystyle {\frac{b}{T} \, (x_i - \bar{x}) \, \frac{f'(Y_i)}{f(Y_i)}}}{\sqrt{b^2 I (f)}} \end{aligned}$$

satisfies the Lindeberg–Feller condition. Indeed, after defining

$$\begin{aligned} Z_{2i} (\delta ) = Z_{2i} \, \, s \left( |Z_{2i}| - \delta \right) \end{aligned}$$

where \(\delta >0\) and s(x) equals 1 if \(x\ge 0\) and 0 elsewhere, such a condition can be written as

$$\begin{aligned} \sum _{1 \le i \le N} \text{ E } \left( Z_{2i}^2 (\delta ) \right) = \sum _{1 \le i \le N} \int _{|z| \ge \delta } z^2 \, d P_N \left\{ \frac{b}{T} (x_i - \bar{x}) \frac{f'(Y_i)}{f(Y_i)} \le z \, \sqrt{b^2 \, I(f)} \right\} \rightarrow 0 . \end{aligned}$$

However, by putting \(t = z \, \sqrt{b^2 \, I(f)},\) one gets

$$\begin{aligned}&\sum _{1 \le i \le N} \frac{1}{b^2 \, I(f)} \, \int _{|t| \ge \delta \sqrt{b^2 \, I(f)}} \, t^2 \, { d P_N} \left\{ \frac{b}{T}(x_{i}-\bar{x}) \frac{f'(Y_i)}{f(Y_i)} \le t \right\} \\&\quad = \frac{1}{T^2} \, \sum _{1 \le i \le N} \frac{(x_i- \bar{x})^2}{I(f)} \int _{|y| \ge \delta \sqrt{I(f)} \left| \frac{T}{x_i - \bar{x}} \right| } \, y^2 \, d P_N \left\{ \frac{f'(Y_i)}{f(Y_i)} \le y \right\} \rightarrow 0 \end{aligned}$$

because, by hypothesis,

$$\begin{aligned} \frac{T^2}{M} \rightarrow + \infty \quad \Rightarrow \quad \min _{1 \le i \le N} \left| \frac{T}{x_i- \bar{x}} \right| \ge \frac{T}{\sqrt{M}} \rightarrow + \infty . \end{aligned}$$
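The two moment claims in this remark (mean zero and variance \(b^2 I(f)\) of the score sum) can be checked by simulation. The sketch below is an illustration under an assumed standard logistic density, for which \(f'/f = 1-2F\) and \(I(f)=1/3\); the design \(x_i = i\) and the value of \(b\) are arbitrary choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N, b = 200, 1.5
x = np.arange(1, N + 1, dtype=float)
xc = x - x.mean()
T = np.sqrt(np.sum(xc ** 2))

# standard logistic density: f'(y)/f(y) = 1 - 2F(y), Fisher information I(f) = 1/3
m = 20_000
Y = rng.logistic(size=(m, N))
score = 1.0 - 2.0 / (1.0 + np.exp(-Y))       # f'(Y)/f(Y) = 1 - 2F(Y)
S = (b / T) * (score * xc).sum(axis=1)       # the score sum of the remark

print(S.mean(), S.var())   # ≈ 0 and ≈ b^2 * I(f) = 0.75
```

Since \(\mathrm{Var}(1-2F(Y_i)) = 1/3\) for any continuous \(F\) evaluated at its own variable, the variance identity \((b^2/T^2)\sum (x_i-\bar x)^2 \, I(f) = b^2 I(f)\) is exact here, not merely asymptotic.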

Lemma 3

If \(F'=f\) and if

$$\begin{aligned} \hat{G}(\underline{Y};0) = \frac{2N}{D} \sum _{1 \le i \le N} \left( \left| 1 - \frac{i}{N} - F(Y_i) \right| - \left| \frac{i}{N} - F(Y_i) \right| \right) \end{aligned}$$

then

$$\begin{aligned} {P_N \lim }_{N \rightarrow \infty } \, \sqrt{N} \, (G(\underline{Y};0) - \hat{G}(\underline{Y};0)) = 0. \end{aligned}$$

Proof

By using the identity \( |x| = 2x \, s(x) - x \) \((x \in \mathfrak {R}),\) the definition in (2) and the expression of \(\hat{G}(\underline{Y}; 0),\) one gets

$$\begin{aligned} \sqrt{N} (G(\underline{Y};0) - \hat{G} (\underline{Y};0)) = A_N + B_N + C_N + D_N, \end{aligned}$$

where

$$\begin{aligned} A_N&=\frac{4 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \left( 1- \frac{i}{N} - F(Y_i) \right) \left[ s(N+1-i-R(Y_i)) \right. \\&\quad \left. - s(N-i-N\, F(Y_i)) \right] \\ B_N&=- \frac{4 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \left( \frac{i}{N} - F(Y_i) \right) \left[ s(i-R(Y_i)) - s(i-N\, F(Y_i)) \right] \\ C_N&=\frac{4 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \left( \frac{R(Y_i)}{N} - F(Y_i) \right) \left[ s(i - R(Y_i)) - s(N+1-i-R(Y_i)) \right] \\ D_N&=\frac{2 \, N^{1/2}}{D} \, \sum _{1 \le i \le N} s(N+1-i-R(Y_i)) . \end{aligned}$$

It follows that

$$\begin{aligned} \text{ E } \{ |D_N| \}&\le \frac{2 \, N^{3/2}}{D} \, \rightarrow 0 \\ \text{ E } \{ |B_N| \}&\le \frac{4 \, N^{3/2}}{D} \sum _{1 \le i \le N} \text{ E } \left\{ \left| \frac{i}{N} - F(Y_i) \right| \, \, \left| s(i-R(Y_i)) - s(i-NF(Y_i)) \right| \right\} . \end{aligned}$$

By using the joint distribution of \((Y_i, R(Y_i)),\) that is

$$\begin{aligned}&\text{ Pr } \{ R(Y_i) = r \, ; \, y< Y_i < y+\hbox {d}y \} = \frac{1}{N} \, g_{Y_{(r)}} (y) \, \hbox {d}y \\&\quad = \frac{1}{N} \frac{N!}{(N-r)! \, (r-1)!} \, [F(y)]^{r-1} \, [1-F(y)]^{N-r} \, f(y) \, \hbox {d}y \quad r=1,2, \ldots , N; y \in \mathfrak {R}, \end{aligned}$$

one gets

$$\begin{aligned}&\text{ E } \left\{ \left| \frac{i}{N} - F(Y_i) \right| \, \, \left| s(i-R(Y_i)) - s(i-NF(Y_i)) \right| \right\} \\&= \frac{1}{N} \sum _{1 \le r \le N} \int _{- \infty }^{+ \infty } \left| \frac{i}{N} - F(y) \right| \, \, \left| s(i-r) - s(i-NF(y)) \right| \, g_{Y_{(r)}} (y) \, \hbox {d}y \\&= \sum _{1 \le r \le N} \int _0^1 \left| \frac{i}{N} - v \right| \, \, \left| s(i-r) - s(i-Nv) \right| {N-1 \atopwithdelims ()r-1} \, v^{r-1} \, (1-v)^{N-r} \, dv \\&=\int _0^1 \left| \frac{i}{N} - v \right| \left[ \left( 1- s(i-Nv) \right) \, \mathrm{Pr}\{ U_{(i)} > v \} + s(i-Nv) \, \mathrm{Pr}\{ U_{(i)} \le v \} \right] \, dv , \end{aligned}$$

where \(U_{(i)}\) is the i-th order statistic of a random sample of size \(N-1\) drawn from the uniform distribution on (0, 1). By partitioning the integration interval and bounding each piece, after some straightforward passages, one gets

$$\begin{aligned} \text{ E } \{ |B_N| \} \le \frac{2 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \frac{i(N-i)}{N^2(N+1)} \quad \rightarrow 0 . \end{aligned}$$

By following similar steps, one can prove that

$$\begin{aligned} \text{ E } \{ |A_N| \} \rightarrow 0 . \end{aligned}$$

Now let

$$\begin{aligned} S_{i,N} = \left( \frac{R(Y_i)}{N} - F(Y_i) \right) \left[ s ( i - R(Y_i) ) - s(N+1-i-R(Y_i)) \right] \qquad i =1, \ldots , N . \end{aligned}$$

and observe that

$$\begin{aligned} \text{ E } (C_N) = \frac{4 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \text{ E } \left( S_{i,N} \right) =0 . \end{aligned}$$

Moreover,

$$\begin{aligned} \text{ Var } (C_N) = \frac{16 \, N^3}{D^2} \sum _{1 \le i \le N} \text{ E } \left( S_{i,N}^2 \right) + \frac{16 \, N^3}{D^2} \sum _{i \ne j} \, \text{ E } \left( S_{i,N} \, S_{j,N} \right) \end{aligned}$$
(19)

and

$$\begin{aligned} \sum _{1 \le i \le N} \text{ E } (S_{i,N}^2)= & {} \sum _{1 \le i \le N} \frac{1}{N} \sum _{1 \le r \le N} \int _{-\infty }^{+ \infty } \left( \frac{r}{N} - F(y) \right) ^2 \cdot \\&\cdot \, \left[ s(i-r) -s(N+1-i-r) \right] ^2 \, g_{Y_{(r)}} (y) \, \hbox {d}y \\\le & {} \sum _{1 \le r \le N} \int _{-\infty }^{+\infty } \left( \frac{r}{N} - F(y) \right) ^2 \, g_{Y_{(r)}}(y)\,\hbox {d}y \\\le & {} \sum _{1 \le r \le N} \text{ E } \left\{ \left( F(Y_{(r)}) -\frac{r}{N} \right) ^2 \right\} \, \rightarrow \, A < +\infty \end{aligned}$$

because

$$\begin{aligned} \text{ E } \left\{ \left( F(Y_{(r)}) - \frac{r}{N} \right) ^2 \right\} = \frac{r(N-r+1)}{(N+2)\, (N+1)^2} + \frac{r^2}{N^2 \, (N+1)^2} \qquad \forall r = 1,2, \ldots , N . \end{aligned}$$

The first summand in the right-hand side of (19) thus tends to zero. Moreover,

$$\begin{aligned}&\left| \sum _{i \ne j} \text{ E } (S_{i,N} \, S_{j,N}) \right| \\&\quad = \left| \frac{1}{N(N-1)} \, \sum _{i \ne j} \, \sum _{r \ne k} \int _{-\infty }^{+\infty } \int _{-\infty }^{+\infty } \left( \frac{r}{N} - F(x) \right) \, \left( \frac{k}{N} - F(y) \right) \right. \\&\quad \left( s(i-r) - s(N+1-i-r) \right) \, \left( s(j-k) - s (N+1-j-k) \right) \\&g_{Y_{(r)},Y_{(k)}} (x,y) \, \hbox {d}x \, \hbox {d}y \Big | \\&\quad = \left| \frac{1}{N \, (N-1)} \sum _{r \ne k} \left[ \int _{-\infty }^{+\infty } \int _{-\infty }^{+\infty } \left( \frac{r}{N} - F(x) \right) \, \left( \frac{k}{N} - F(y) \right) \, g_{Y_{(r)}, Y_{(k)}} (x,y) \right. \right. \\&\quad \hbox {d}x \, \hbox {d}y \Big ] \sum _{1 \le i \le N} \left( s(i-r) -s(N+1-i-r) \right) \, \left( s(i-k) - s(N+1-i-k) \right) \big | . \end{aligned}$$

Now, as

$$\begin{aligned}&\int _{-\infty }^{+\infty } \int _{-\infty }^{+\infty } \left( \frac{r}{N} - F(x) \right) \, \left( \frac{k}{N} - F(y) \right) \, g_{Y_{(r)}, Y_{(k)}} (x,y) \, \hbox {d}x \, \hbox {d}y \\&\quad = \text{ Cov } \left\{ F(Y_{(r)}) \, , \, F(Y_{(k)}) \right\} + \frac{rk}{N^2 \, (N+1)^2} \\&\quad = \left\{ \begin{array}{l@{\quad }l} \dfrac{r(N+1-k)}{(N+2)\, (N+1)^2} + \dfrac{rk}{N^2 \, (N+1)^2}>0 &{}\quad r<k \\ \dfrac{k(N+1-r)}{(N+2)\, (N+1)^2} + \dfrac{rk}{N^2 \, (N+1)^2}>0 &{}\quad r>k \end{array} \right. \end{aligned}$$

one obtains

$$\begin{aligned}&\left| \sum _{i \ne j} \text{ E } (S_{i,N} \, S_{j,N}) \right| \\&\quad \le \frac{1}{N-1} \sum _{r \ne k} \left\{ \text{ Cov } (F(Y_{(r)}) \, , \, F(Y_{(k)})) + \frac{rk}{N^2 \, (N+1)^2} \right\} \, \rightarrow \, B < + \infty \end{aligned}$$

so that the second summand in (19) tends to zero as well. The proof then follows by a suitable application of Chebyshev’s inequality to the four variables. \(\square \)
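The order-statistic moments used in the proof can be checked numerically: \(F(Y_{(r)})\) is distributed as the \(r\)-th uniform order statistic of an \(N\)-sample, i.e. Beta\((r, N+1-r)\), and the closed form for \(\text{E}\{(F(Y_{(r)})-r/N)^2\}\) is its variance plus the squared bias \((r/(N+1)-r/N)^2\). A quick Monte Carlo sketch (the values of \(N\), \(r\) and the sample size are arbitrary choices):

```python
import numpy as np

def moment_formula(r, N):
    """E{(F(Y_(r)) - r/N)^2} = Var Beta(r, N+1-r) + (r/(N+1) - r/N)^2."""
    return (r * (N - r + 1) / ((N + 2) * (N + 1) ** 2)
            + r ** 2 / (N ** 2 * (N + 1) ** 2))

rng = np.random.default_rng(42)
N, r, m = 10, 3, 200_000
samples = rng.beta(r, N + 1 - r, size=m)   # F(Y_(r)) ~ Beta(r, N+1-r)
mc = np.mean((samples - r / N) ** 2)
print(mc, moment_formula(r, N))            # the two values agree closely
```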

Remark to Lemma 3

Lemma 3 makes it possible to obtain the asymptotic distribution of Gini’s cograduation index under indifference in a way alternative to that of an earlier paper (Cifarelli and Regazzini 1977). Indeed, Lemma 3 ensures that \(\sqrt{N} \, G(\underline{Y};0)\) is asymptotically distributed as \(\sqrt{N} \, \hat{G} (\underline{Y}; 0),\) to which the classical limit theorems apply, since the latter can be regarded as a sum of independent variables. As a matter of fact, the variable

$$\begin{aligned} \sqrt{N} \, \hat{G}(\underline{Y}; 0) = \frac{2 \, N^{3/2}}{D} \, \sum _{1 \le i \le N} \left( \left| 1-\frac{i}{N} - F(Y_i) \right| - \left| \frac{i}{N} - F(Y_i) \right| \right) \end{aligned}$$

has mean and variance

$$\begin{aligned} \text{ E } (\sqrt{N} \, \hat{G}(\underline{Y};0)) = 0 \qquad \text{ Var } (\sqrt{N} \, \hat{G}(\underline{Y};0)) \simeq \frac{2}{3} \end{aligned}$$

and, by letting

$$\begin{aligned} \sum _{1 \le i \le N} \dfrac{\frac{2 \, N^{3/2}}{D} \, \left( | 1 - \frac{i}{N} - F(Y_i) | - | \frac{i}{N} - F(Y_i) | \right) }{\sqrt{\frac{2}{3}}} = \sum _{1 \le i \le N} Z_{1i} \end{aligned}$$

and, for every \(\delta >0,\)

$$\begin{aligned} Z_{1i} (\delta ) = Z_{1i} \, s(|Z_{1i}| - \delta ), \end{aligned}$$

the Lindeberg condition is satisfied, i.e.,

$$\begin{aligned} \sum _{1 \le i \le N} \text{ E } (Z_{1i}^2 (\delta )) \, \rightarrow \, 0 \qquad \forall \delta >0. \end{aligned}$$

Lemma 4

If \(F' = f,\) \(I(f) < +\infty ,\) \(\int |f'| < + \infty \) and \(\frac{T^2}{M} \rightarrow + \infty ,\) then the vector

$$\begin{aligned} \left( \sqrt{N} \, G (\underline{Y};0) \, , \, \log \frac{\displaystyle {\prod _{1 \le i \le N}} \, f \left( Y_i + \frac{b}{T} (x_i - \bar{x}) \right) }{\displaystyle {\prod _{1 \le i \le N}} f(Y_i)} \right) \end{aligned}$$

converges in distribution, with the measure \(P_N,\) to the bivariate Gaussian with parameters

$$\begin{aligned} \left( 0, \quad -\frac{b^2}{2} I(f), \quad \frac{2}{3} , \quad b^2 I(f), \quad \sigma _{12} \right) \end{aligned}$$

where

$$\begin{aligned} \sigma _{12} = 4b \, \int _0^1 \left[ \psi (1-v) - \psi (v) \right] \, \dfrac{f'(F^{-1}(v))}{f(F^{-1}(v))} \, dv . \end{aligned}$$

Proof

By Lemmas 2 and 3, it suffices to show that the vector

$$\begin{aligned} \left( \sqrt{N} \, \hat{G} (\underline{Y};0) \, , \, \frac{b}{T} \sum _{1 \le i \le N} (x_i -\bar{x}) \dfrac{f'(Y_i)}{f(Y_i)} \right) \end{aligned}$$

converges in distribution to the bivariate Gaussian with parameters

$$\begin{aligned} \left( 0, \quad 0, \quad \frac{2}{3} , \quad b^2 I(f), \quad \sigma _{12} \right) . \end{aligned}$$

By the remarks following Lemmas 2 and 3, the limiting distribution indeed has the first four parameters listed above. Moreover, consider that

$$\begin{aligned}&\text{ Cov } \left\{ \sqrt{N} \hat{G} (\underline{Y};0)\, , \, \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \dfrac{f'(Y_i)}{f(Y_i)} \right\} \\&\quad = \frac{2 \, N^{3/2}}{D \, T} \, b \, \int _{-\infty }^{+\infty } f'(y) \, \left[ \sum _{1 \le i \le N} (x_i - \bar{x}) \left( \left| 1 - \frac{i}{N} - F(y) \right| - \left| \frac{i}{N} - F(y) \right| \right) \right] \, \hbox {d}y \\&\quad = 4b \, \int _{- \infty }^{+ \infty } f'(y) \left[ \frac{N^{3/2}}{D \, T} \left( \sum _{i=1}^{[N\, (1-F(y))]} (x_i - \bar{x}) \left( 1-F(y) - \frac{i}{N} \right) \right. \right. \\&\qquad - \left. \left. \sum _{i=1}^{[N\, F(y)]} (x_i - \bar{x}) \left( F(y) - \frac{i}{N} \right) \right) \right] \, \hbox {d}y . \end{aligned}$$

By passing to the limit (with N) under the integral sign, one gets

$$\begin{aligned} \sigma _{12}&= \lim _{N \rightarrow + \infty } \text{ Cov } \left\{ \sqrt{N} \hat{G} (\underline{Y};0)\, , \, \frac{b}{T} \sum _{1 \le i \le N} (x_i - \bar{x}) \dfrac{f'(Y_i)}{f(Y_i)} \right\} \\&= 4b \, \int _{- \infty }^{+ \infty } \left[ \psi (1-F(y)) - \psi (F(y)) \right] \, f'(y) \, \hbox {d}y \\&= 4b \, \int _0^1 \left[ \psi (1-v) - \psi (v) \right] \, \dfrac{f'(F^{-1}(v))}{f(F^{-1}(v))} \, dv. \end{aligned}$$

To prove that the limiting distribution is Gaussian, one can then show that, for every real \(\lambda _1\) and \(\lambda _2,\) the following variable is asymptotically normally distributed:

$$\begin{aligned} \lambda _1 \, \sqrt{N} \, \hat{G} (\underline{Y};0)+ \lambda _2 \, \frac{b}{T} \sum _{1 \le i \le N} (x_i -\bar{x}) \dfrac{f'(Y_i)}{f(Y_i)} . \end{aligned}$$

However, as both the variables

$$\begin{aligned} \lambda _1 \, \sqrt{N} \, \hat{G} (\underline{Y};0) \quad \text{ and } \quad \lambda _2 \, \frac{b}{T} \sum _{1 \le i \le N} (x_i -\bar{x}) \dfrac{f'(Y_i)}{f(Y_i)} \end{aligned}$$

satisfy the Lindeberg condition, the desired result follows as in Hájek and Šidák (1967, p. 218). \(\square \)

The proof of Theorem 2 in Sect. 7 now immediately follows from Lemma 1 and Lemma 4. Indeed, for every real z

$$\begin{aligned} Q_N \left\{ \sqrt{N} \, G(\underline{Y};0) \le z \right\}&= \int _{\left\{ \sqrt{N} \, G(\underline{y} ; 0) \le z \right\} } \prod _{1 \le i \le N} \, f \left( y_i + \frac{b}{T} (x_i - \bar{x}) \right) \, \hbox {d}y_i \\&= \int _{\left\{ \sqrt{N} \, G \left( \underline{y} ; \frac{b}{T} \right) \le z \right\} } \prod _{1 \le i \le N} \, f(y_i) \, \hbox {d}y_i \\&= P_N \left\{ \sqrt{N} \, G \left( \underline{Y} ; \frac{b}{T} \right) \le z \right\} \end{aligned}$$

and, by Lemmas 1 and 4,

$$\begin{aligned} \lim _{N \rightarrow + \infty } Q_N \left\{ \sqrt{N} \, G(\underline{Y};0) \le z \right\} = \Phi \left( \frac{z-\sigma _{12}}{\sqrt{2/3}} \right) . \end{aligned}$$

About this article

Cite this article

Cifarelli, D.M. Estimation of the regression slope by means of Gini’s cograduation index. Decisions Econ Finan 39, 113–142 (2016). https://doi.org/10.1007/s10203-016-0174-4
