
A Review and Some New Proposals for Bandwidth Selection in Nonparametric Density Estimation for Dependent Data

Abstract

When assuming independence, the choice of the smoothing parameter in density estimation has been studied extensively. Under dependence, however, very few papers have dealt with data-driven bandwidth selectors. We first review the state of the art of existing methods for bandwidth selection under dependence. Three new (or adapted) methods are then proposed: (a) an extension to the dependent case of the modified cross-validation of Stute (J Statist Plann Infer 30:293–305, 1992); (b) an adaptation to density estimation of the penalized cross-validation proposed by Estévez-Pérez et al. (J Statist Plann Infer 104:1–30, 2002) for hazard rate estimation; (c) the smoothed version of the so-called moving blocks bootstrap, for which an exact expression of the bootstrap version of the mean integrated squared error under dependence is obtained in this context. This is useful because no Monte Carlo approximation is needed to implement the bootstrap selector. An extensive simulation study is carried out to check and compare the empirical behaviour of six selected bandwidths.

Keywords

  • Bandwidth choice
  • Bootstrap bandwidth
  • Comparison by simulation
  • Cross-validation bandwidth
  • Kernel density estimator
  • Plug-in bandwidth
  • Moving blocks bootstrap
  • Modified cross-validation
  • Penalized cross-validation

DOI: 10.1007/978-3-319-50986-0_10
ISBN: 978-3-319-50986-0
[Figures 10.1–10.8 appear here]

References

  • Barbeito I, Cao R (2016) Smoothed stationary bootstrap bandwidth selection for density estimation with dependent data. Comput Statist & Data Anal 104:130–147
  • Bowman A (1984) An alternative method of cross-validation for the smoothing of density estimates. Biometrika 71:353–360
  • Cao R (1993) Bootstrapping the mean integrated squared error. J Mult Anal 45:137–160
  • Cao R (1999) An overview of bootstrap methods for estimating and predicting in time series. Test 8:95–116
  • Cao R, Quintela del Río A, Vilar Fernández J (1993) Bandwidth selection in nonparametric density estimation under dependence. A simulation study. Comput Statist 8:313–332
  • Cao R, Cuevas A, González Manteiga W (1994) A comparative study of several smoothing methods in density estimation. Comput Statist & Data Anal 17:153–176
  • Chow Y, Geman S, Wu L (1983) Consistent cross-validated density estimation. Ann Statist 11:25–38
  • Cox D, Kim T (1997) A study on bandwidth selection in density estimation under dependence. J Mult Anal 62:190–203
  • Devroye L (1987) A Course in Density Estimation. Birkhäuser, Boston
  • Efron B (1979) Bootstrap methods: another look at the jackknife. Ann Statist 7:1–26
  • Efron B, Tibshirani R (1993) An Introduction to the Bootstrap. Chapman and Hall, New York
  • Estévez-Pérez G, Quintela del Río A, Vieu P (2002) Convergence rate for cross-validatory bandwidth in kernel hazard estimation from dependent samples. J Statist Plann Infer 104:1–30
  • Faraway J, Jhun M (1990) Bootstrap choice of bandwidth for density estimation. J Amer Statist Assoc 85:1119–1122
  • Feluch W, Koronacki J (1992) A note on modified cross-validation in density estimation. Comput Statist & Data Anal 13:143–151
  • Hall P (1983) Large sample optimality of least-squares cross-validation in density estimation. Ann Statist 11:1156–1174
  • Hall P (1990) Using the bootstrap to estimate mean squared error and select smoothing parameter in nonparametric problems. J Mult Anal 32:177–203
  • Hall P, Marron J (1987a) Extent to which least-squares cross-validation minimizes integrated square error in nonparametric density estimation. Probab Theor Relat Fields 74:567–581
  • Hall P, Marron J (1987b) On the amount of noise inherent in bandwidth selection for a kernel density estimator. Ann Statist 15:163–181
  • Hall P, Marron J (1991) Local minima in cross-validation functions. J Roy Statist Soc Ser B 53:245–252
  • Hall P, Lahiri S, Truong Y (1995) On bandwidth choice for density estimation with dependent data. Ann Statist 23:2241–2263
  • Hart J, Vieu P (1990) Data-driven bandwidth choice for density estimation based on dependent data. Ann Statist 18:873–890
  • Jones M, Marron J, Park B (1991) A simple root-n bandwidth selector. Ann Statist 19:1919–1932
  • Jones M, Marron J, Sheather S (1996) A brief survey of bandwidth selection for density estimation. J Amer Statist Assoc 91:401–407
  • Kreiss J, Paparoditis E (2011) Bootstrap methods for dependent data: a review. J Korean Statist Soc 40:357–378
  • Künsch H (1989) The jackknife and the bootstrap for general stationary observations. Ann Statist 17:1217–1241
  • Léger C, Romano J (1990) Bootstrap choice of tuning parameters. Ann Inst Statist Math 42:709–735
  • Liu R, Singh K (1992) Moving blocks jackknife and bootstrap capture weak dependence. In: LePage R, Billard L (eds) Exploring the Limits of Bootstrap, pp 225–248
  • Marron J (1985) An asymptotically efficient solution to the bandwidth problem of kernel density estimation. Ann Statist 13:1011–1023
  • Marron J (1987) A comparison of cross-validation techniques in density estimation. Ann Statist 15:152–162
  • Marron J (1992) Bootstrap bandwidth selection. In: LePage R, Billard L (eds) Exploring the Limits of Bootstrap, pp 249–262
  • Marron J, Wand M (1992) Exact mean integrated squared error. Ann Statist 20:712–736
  • Park B, Marron J (1990) Comparison of data-driven bandwidth selectors. J Amer Statist Assoc 85:66–72
  • Parzen E (1962) On estimation of a probability density function and mode. Ann Math Statist 33:1065–1076
  • Politis D, Romano J (1994) The stationary bootstrap. J Amer Statist Assoc 89:1303–1313
  • Rosenblatt M (1956) Remarks on some nonparametric estimates of a density function. Ann Math Statist 27:832–837
  • Rudemo M (1982) Empirical choice of histograms and kernel density estimators. Scand J Statist 9:65–78
  • Scott D, Terrell G (1987) Biased and unbiased cross-validation in density estimation. J Amer Statist Assoc 82:1131–1146
  • Sheather S, Jones M (1991) A reliable data-based bandwidth selection method for kernel density estimation. J Roy Statist Soc Ser B 53:683–690
  • Silverman B, Young G (1987) The bootstrap: to smooth or not to smooth? Biometrika 74:469–479
  • Silverman BW (1986) Density Estimation for Statistics and Data Analysis. Chapman & Hall, London
  • Stone C (1984) An asymptotically optimal window selection rule for kernel density estimates. Ann Statist 12:1285–1297
  • Stute W (1992) Modified cross-validation in density estimation. J Statist Plann Infer 30:293–305
  • Taylor C (1989) Bootstrap choice of the smoothing parameter in kernel density estimation. Biometrika 76:705–712


Acknowledgements

The authors acknowledge partial support by MINECO grant MTM2014-52876-R and by the Xunta de Galicia (Grupos de Referencia Competitiva ED431C-2016-015 and Centro Singular de Investigación de Galicia ED431G/01), all of them through the ERDF. They would also like to thank an anonymous referee for his/her comments that have helped to improve this chapter.

Author information

Correspondence to Inés Barbeito.

10.5 Appendix

Proof

(Proof of Theorem 1) Consider a random sample \(\left( X_1,\ldots ,X_n\right) \) drawn from a stationary process, and let \(\hat{f}^*_h(x)\) denote the smoothed moving blocks bootstrap version of the kernel density estimator. The bootstrap version of the mean integrated squared error is given by:

$$\begin{aligned} MISE^*(h)= B^*(h) + V^*(h), \end{aligned}$$
(10.6)

where

$$\begin{aligned} B^*(h)= & {} \int \left[ \mathbb {E}^*\left( \hat{f}_h^*(x)\right) - \hat{f}_g(x)\right] ^2 dx\text {, and }\\ V^*(h)= & {} \int \text {Var}^*\left( \hat{f}_h^*(x)\right) dx.\\ \end{aligned}$$

Now, straightforward calculations lead to

$$\begin{aligned} B^*(h)= & {} \int \left[ \mathbb {E}^*\left( \dfrac{1}{n}\sum \limits _{i=1}^n K_h(x-X_i^*) \right) - \hat{f}_g(x)\right] ^2 dx \\= & {} \int \left[ \dfrac{1}{n}\sum \limits _{i=1}^n \int K_h(x-y)\hat{f}_g^{(i)}(y)dy - \hat{f}_g(x)\right] ^2 dx, \end{aligned}$$

where

$$\hat{f}_g^{(i)}(y)= \dfrac{1}{n-b+1} \sum \limits _{j=t_i}^{n-b+t_i} K_g(y-X_j), $$

where \(t_i=[(i-1)\text {mod }b]+1\).
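To make the resampling mechanism concrete, one smoothed moving blocks bootstrap resample can be sketched in a few lines of Python (a minimal sketch, assuming a Gaussian kernel and that n is a multiple of b; the function name `smoothed_mbb_sample` is ours):

```python
import random

def smoothed_mbb_sample(X, b, g, rng=None):
    """One smoothed moving blocks bootstrap resample (sketch).

    Draws n/b blocks of length b uniformly among the n-b+1 overlapping
    blocks of X, concatenates them, and adds kernel noise g*U (Gaussian
    kernel here, so U is standard normal). Assumes len(X) % b == 0.
    """
    rng = rng or random.Random(0)
    n = len(X)
    assert n % b == 0
    out = []
    for _ in range(n // b):
        start = rng.randrange(n - b + 1)  # uniform over the n-b+1 blocks
        for x in X[start:start + b]:
            out.append(x + g * rng.gauss(0.0, 1.0))
    return out
```

Averaging \(K_h(x-X_i^*)\) over many such resamples would approximate \(\hat{f}^*_h(x)\) by Monte Carlo; the point of the theorem is that this step can be avoided entirely.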

Let us now assume that n is an integer multiple of b:

$$\begin{aligned}&\int \left[ \dfrac{1}{n}\sum \limits _{i=1}^n \int K_h(x-y)\hat{f}_g^{(i)}(y)dy - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{n}\sum \limits _{i=1}^b \dfrac{n}{b} \left( K_h *\hat{f}_g^{(i)}\right) (x) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b}\sum \limits _{i=1}^b \left( K_h *\hat{f}_g^{(i)}\right) (x) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b}\sum \limits _{i=1}^b \left( \dfrac{1}{n-b+1} \sum \limits _{j=t_i}^{n-b+t_i} K_h*K_g(\cdot -X_j) \right) (x) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b}\sum \limits _{i=1}^b \left( \dfrac{1}{n-b+1} \sum \limits _{j=t_i}^{n-b+t_i} \int K_h(x-y)K_g(y-X_j)dy \right) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b}\sum \limits _{i=1}^b \left( \dfrac{1}{n-b+1} \sum \limits _{j=t_i}^{n-b+t_i} \int K_h(x-u-X_j)K_g(u)du \right) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b}\sum \limits _{i=1}^b \left( \dfrac{1}{n-b+1} \sum \limits _{j=t_i}^{n-b+t_i} K_h *K_g (x-X_j) \right) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \dfrac{1}{b(n-b+1)}\sum \limits _{i=1}^b \sum \limits _{j=t_i}^{n-b+t_i} K_h *K_g (x-X_j) - \hat{f}_g(x)\right] ^2 dx.\\ \end{aligned}$$

Furthermore, if \(b<n\)

$$\begin{aligned}&\dfrac{1}{b(n-b+1)}\sum \limits _{i=1}^b \sum \limits _{j=t_i}^{n-b+t_i} K_h *K_g (x-X_j)\\= & {} \dfrac{1}{n-b+1} \sum \limits _{j=b}^{n-b+1} K_h *K_g (x-X_j)+\dfrac{1}{b(n-b+1)} \sum \limits _{j=1}^{b-1}j(K_h*K_g) (x-X_j)\\&+\dfrac{1}{b(n-b+1)} \sum \limits _{j=n-b+2}^{n} (n-j+1) (K_h *K_g) (x-X_j)\\= & {} \sum \limits _{j=1}^{n}a_j (K_h*K_g)(x-X_j),\\ \end{aligned}$$

where the weights \(a_j\) are defined in (10.3).

If \(b=n\),

$$\begin{aligned} \dfrac{1}{b(n-b+1)}\sum \limits _{i=1}^b \sum \limits _{j=t_i}^{n-b+t_i} K_h *K_g (x-X_j)= & {} \dfrac{1}{n}\sum \limits _{j=1}^{n} K_h*K_g(x-X_j)\\= & {} \sum \limits _{j=1}^{n} a_j \left( K_h*K_g\right) (x-X_j), \end{aligned}$$

where \(a_j=\dfrac{1}{n}\) when \(b=n\).
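For concreteness, the weights \(a_j\) arising from the case analysis above can be computed directly (a sketch; the function name `mbb_weights` is ours, and the three-range formula assumes \(2b-1\le n\) so that the index ranges are ordered). A quick check confirms that the weights always sum to one:

```python
def mbb_weights(n, b):
    """Weights a_j of the bootstrap bias expansion (sketch).

    For b < n (assuming 2b - 1 <= n):
        a_j = j/(b(n-b+1))        if j < b,
        a_j = 1/(n-b+1)           if b <= j <= n-b+1,
        a_j = (n-j+1)/(b(n-b+1))  if j > n-b+1.
    For b = n: a_j = 1/n for every j.
    """
    if b == n:
        return [1.0 / n] * n
    c = b * (n - b + 1)
    w = []
    for j in range(1, n + 1):
        if j < b:
            w.append(j / c)
        elif j <= n - b + 1:
            w.append(1.0 / (n - b + 1))
        else:
            w.append((n - j + 1) / c)
    return w
```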

Hence, continuing the computation of the integrated squared bootstrap bias (with several changes of variable and using the symmetry of K), we obtain:

$$\begin{aligned} B^*(h)= & {} \int \left[ \sum \limits _{j=1}^{n}a_j \left( K_h *K_g \right) (x-X_j) - \hat{f}_g(x)\right] ^2 dx\\= & {} \int \left[ \sum \limits _{j=1}^{n}a_j \left( K_h *K_g \right) (x-X_j) - \dfrac{1}{n} \sum \limits _{j=1}^{n} K_g(x-X_j)\right] ^2 dx\\= & {} \sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\int \left[ a_j \left( K_h *K_g \right) (x-X_j) - \dfrac{1}{n} K_g(x-X_j)\right] \\&\times \left[ a_k \left( K_h *K_g \right) (x-X_k) - \dfrac{1}{n} K_g(x-X_k)\right] dx\\= & {} \sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\int \left[ a_j a_k \left( K_h *K_g \right) (x-X_j) \left( K_h*K_g\right) (x-X_k) \right. \\&- \left. \dfrac{2a_j}{n} \left( K_h *K_g\right) (x-X_j) K_g(x-X_k) + \dfrac{1}{n^2} K_g(x-X_j) K_g(x-X_k) \right] dx\\= & {} \sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}a_j a_k\int \left[ \left( K_h *K_g \right) (x-X_j) \left( K_h*K_g\right) (x-X_k)\right] dx \\&- \dfrac{2}{n}\sum \limits _{j=1}^{n}a_j \sum \limits _{k=1}^{n} \int \left[ \left( K_h *K_g\right) (x-X_j) K_g(x-X_k) \right] dx\\&+ \dfrac{1}{n^2}\sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\int \left[ K_g(x-X_j) K_g(x-X_k) \right] dx\\= & {} \sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}a_j a_k\int \left[ \left( K_h *K_g \right) (-v) \left( K_h*K_g\right) (X_j-X_k-v)\right] dv \\&- \dfrac{2}{n} \sum \limits _{j=1}^{n}a_j \sum \limits _{k=1}^{n} \int \left[ \left( K_h *K_g\right) (-v) K_g(X_j-X_k-v) \right] dv\\&+ \dfrac{1}{n^2}\sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\int \left[ K_g(-v) K_g(X_j-X_k-v) \right] dv\\= & {} \sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}a_j a_k\left[ \left( K_h *K_g \right) *\left( K_h*K_g\right) \right] (X_j-X_k) \\&- \dfrac{2}{n}\sum \limits _{j=1}^{n}a_j \sum \limits _{k=1}^{n}\left[ \left( K_h *K_g\right) *K_g\right] (X_j-X_k) + \dfrac{1}{n^2}\sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\left[ K_g*K_g\right] (X_j-X_k).\\ \end{aligned}$$

Thus,

$$\begin{aligned} B^*(h)= & {} \sum \limits _{j=1}^{n} a_j \sum \limits _{k=1}^{n}a_k\left[ \left( K_h *K_g \right) *\left( K_h*K_g\right) \right] (X_j-X_k) \\ \nonumber&- \dfrac{2}{n}\sum \limits _{j=1}^{n} a_j \sum \limits _{k=1}^{n} \left[ \left( K_h *K_g\right) *K_g\right] (X_j-X_k) + \dfrac{1}{n^2}\sum \limits _{j=1}^{n}\sum \limits _{k=1}^{n}\left[ K_g*K_g\right] (X_j-X_k) . \end{aligned}$$
(10.7)
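With a Gaussian kernel, every convolution in (10.7) is again a Gaussian density: \(K_h*K_g\) is the \(N(0,h^2+g^2)\) density, so \((K_h*K_g)*(K_h*K_g)\), \((K_h*K_g)*K_g\) and \(K_g*K_g\) are the \(N(0,2h^2+2g^2)\), \(N(0,h^2+2g^2)\) and \(N(0,2g^2)\) densities, all evaluated at \(X_j-X_k\). Hence \(B^*(h)\) can be evaluated exactly, without numerical integration. A sketch (helper names are ours; the weight formula for \(a_j\) is the one derived above, assuming \(2b-1\le n\) or \(b=n\)):

```python
import math

def _norm_pdf(x, var):
    # N(0, var) density at x
    return math.exp(-0.5 * x * x / var) / math.sqrt(2.0 * math.pi * var)

def _mbb_weights(n, b):
    if b == n:
        return [1.0 / n] * n
    c = b * (n - b + 1)
    return [j / c if j < b else
            (1.0 / (n - b + 1) if j <= n - b + 1 else (n - j + 1) / c)
            for j in range(1, n + 1)]

def bootstrap_sq_bias(h, g, b, X):
    """Exact integrated bootstrap squared bias B*(h) of (10.7), Gaussian K."""
    n = len(X)
    a = _mbb_weights(n, b)
    t1 = t2 = t3 = 0.0
    for j in range(n):
        for k in range(n):
            d = X[j] - X[k]
            t1 += a[j] * a[k] * _norm_pdf(d, 2 * h * h + 2 * g * g)
            t2 += a[j] * _norm_pdf(d, h * h + 2 * g * g)
            t3 += _norm_pdf(d, 2 * g * g)
    return t1 - (2.0 / n) * t2 + t3 / (n * n)
```

Since \(B^*(h)\) is the integral of a square, the returned value is nonnegative (up to floating-point error).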

We now focus on the integrated bootstrap variance, which requires a more detailed analysis:

$$\begin{aligned} V^{*}\left( h\right)= & {} \int {Var}^{*}\left( n^{-1}\sum _{i=1}^nK_h\left( \nonumber x-X_i^{*}\right) \right) dx \\ \nonumber= & {} n^{-2}\int \sum \limits _{i=1}^{n}{Var}^{*}\left( K_h\left( x-X_i^{*}\right) \right) dx\\ \nonumber&+\,n^{-2}\sum \limits _{\begin{array}{c} i,j=1 \\ i\ne j \end{array}} ^n\int Cov^{*}\left( K_h\left( x-X_i^{*}\right) ,K_h\left( x-X_j^{*}\right) \right) dx\\\nonumber= & {} n^{-2}\sum \limits _{i=1}^{n}\int \mathbb {E}^{*}\left( K_h\left( x-X_i^{*}\right) ^2\right) dx\\ \nonumber&-\,n^{-2}\sum \limits _{i=1}^{n}\int \left[ \mathbb {E}^{*}\left( K_h\left( x-X_i^{*}\right) \right) \right] ^2 dx \\&+\,n^{-2}\sum \limits _{\begin{array}{c} i,j=1 \\ i\ne j \end{array}} ^n\int Cov^{*}\left( K_h\left( x-X_i^{*}\right) ,K_h\left( x-X_j^{*}\right) \right) dx.\\ \nonumber \end{aligned}$$
(10.8)

The first term in (10.8), after some changes of variable, is given by:

$$\begin{aligned}&n^{-2}\sum \limits _{i=1}^{n}\int \mathbb {E}^{*}\left( K_h\left( x-X_i^{*}\right) \nonumber ^2\right) dx = n^{-2}\sum \limits _{i=1}^{n}\int \left[ \int K_h (x-y)^2 \hat{f}_g^{(i)}(y)dy\right] dx\\ \nonumber= & {} n^{-2}\sum \limits _{i=1}^{n}\int \left[ \int K_h(x-y)^2\nonumber \left[ \dfrac{1}{n-b+1}\sum \limits _{j=t_i}^{n-b+t_i}K_g(y-X_j)\right] dy\right] dx\\ \nonumber= & {} \dfrac{1}{n^2(n-b+1)}\sum \limits _{i=1}^{n}\sum \limits _{j=t_i}^{n-b+t_i}\int K_g(y-X_j) \left[ \int K_h(x-y)^2 dx \right] dy\\\nonumber= & {} \dfrac{1}{n^2(n-b+1)}\sum \limits _{i=1}^{n}\sum \limits _{j=t_i}^{n-b+t_i}\int K_g(y-X_j) \left[ \dfrac{1}{h}\int K\left( z\right) ^2 dz \right] dy\\\nonumber= & {} \dfrac{R(K)}{n^2(n-b+1) h}\sum \limits _{i=1}^{n}\sum \limits _{j=t_i}^{n-b+t_i}\int K_g(y-X_j) dy\\\nonumber= & {} \dfrac{R(K)}{n^2(n-b+1) h}\sum \limits _{i=1}^{n}\sum \limits _{j=t_i}^{n-b+t_i}\int K\left( u\right) du= \dfrac{R(K)}{n h}.\\ \end{aligned}$$
(10.9)

Focusing now on the second term, after several changes of variable and using the symmetry of K, we get:

$$\begin{aligned}&n^{-2}\sum \limits _{i=1}^{n}\int \left[ \mathbb {E}^{*}\left( K_h\left( x-X_i^{*}\right) \right) \right] ^2 dx = n^{-2}\sum \limits _{i=1}^n \int \left[ \int K_h(x-y) \hat{f}_g^{(i)}(y)dy\right] ^2 dx\\= & {} n^{-1}b^{-1} \sum \limits _{i=1}^{b} \int \left[ \left( K_h*\hat{f}_g^{(i)}\right) (x)\right] ^2 dx\\= & {} n^{-1}b^{-1}\sum \limits _{i=1}^{b} \int \left[ \sum \limits _{j=t_i}^{n-b+t_i}\dfrac{1}{n-b+1} (K_h*K_g)(x-X_j)\right] \\&\times \left[ \sum \limits _{k=t_i}^{n-b+t_i}\dfrac{1}{n-b+1} (K_h*K_g)(x-X_k)\right] dx\\= & {} \dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^{b} \sum \limits _{j=t_i}^{n-b+t_i}\sum \limits _{k=t_i}^{n-b+t_i} \int (K_h*K_g)(x-X_j) (K_h*K_g)(x-X_k)dx\\= & {} \dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^{b} \sum \limits _{j=t_i}^{n-b+t_i}\sum \limits _{k=t_i}^{n-b+t_i} \int (K_h*K_g)(v) (K_h*K_g)(X_j-X_k-v)dv\\= & {} \dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^{b} \sum \limits _{j=i}^{n-b+i}\sum \limits _{k=i}^{n-b+i} \left[ (K_h*K_g)*(K_h*K_g)\right] (X_j-X_k). \end{aligned}$$

Let us consider the function \(\psi \) defined in Theorem 1. Whenever \(b<n\), we have:

$$\begin{aligned}&n^{-2}\sum \limits _{i=1}^{n}\int \left[ \mathbb {E}^*\left( K_h(x-X_j^*)\right) \right] ^2 dx\nonumber \\ \nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^{b} \sum \limits _{j=i}^{n-b+i}\sum \limits _{k=i}^{n-b+i} \left[ (K_h*K_g)*(K_h*K_g)\right] (X_j-X_k)\\\nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^{b} \sum \limits _{j=i}^{n-b+i}\sum \limits _{k=i}^{n-b+i} \psi (X_j-X_k)\\\nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\left[ \sum \limits _{i=1}^{b} \sum \limits _{j=i}^{b-1} \sum \limits _{k=i}^{b-1} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{i=1}^{b} \sum \limits _{j=i}^{b-1} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k) + \sum \limits _{i=1}^{b} \sum \limits _{j=i}^{b-1} \sum \limits _{k=n-b+2}^{n-b+i} \psi (X_j-X_k)\right. \\ \nonumber&\left. + \sum \limits _{i=1}^{b} \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=i}^{b-1} \psi (X_j-X_k)+ \sum \limits _{i=1}^{b} \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\right. \\ \nonumber&\left. + \sum \limits _{i=1}^{b} \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=n-b+2}^{n-b+i} \psi (X_j-X_k)+ \sum \limits _{i=1}^{b} \sum \limits _{j=n-b+2}^{n-b+i} \sum \limits _{k=i}^{b-1} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{i=1}^{b} \sum \limits _{j=n-b+2}^{n-b+i} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k) + \sum \limits _{i=1}^{b} \sum \limits _{j=n-b+2}^{n-b+i} \sum \limits _{k=n-b+2}^{n-b+i} \psi (X_j-X_k)\right] \\ \nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\left[ \sum \limits _{j=1}^{b-1} \sum \limits _{k=1}^{b-1}\nonumber \sum \limits _{i=1}^{\min \{j,k\}} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=1}^{b-1} \sum \limits _{k=b}^{n-b+1}\sum \limits _{i=1}^{j} \psi (X_j-X_k) + \sum \limits _{j=1}^{b-1}\sum \limits _{k=n-b+2}^{n} \sum \limits _{i=\max \{(k+b-n),1\}}^{j} \psi (X_j-X_k)\right. \\ \nonumber&\left. 
+ \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=1}^{b-1}\sum \limits _{i=1}^{k} \psi (X_j-X_k)+ \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=b}^{n-b+1} \sum \limits _{i=1}^{b} \psi (X_j-X_k)\right. \\\nonumber&+ \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=n-b+2}^{n}\sum \limits _{i=\max \{(k-n+b),1\}}^{b} \psi (X_j-X_k) \\ \nonumber&+ \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=1}^{b-1} \sum \limits _{i=\max \{(j+b-n),1\}}^{k} \psi (X_j-X_k)\\ \nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=b}^{n-b+1}\sum \limits _{i=\max \{(j-n+b),1\}}^{b} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=n-b+2}^{n} \nonumber \sum \limits _{i=\max \{(j-n+b),(k-n+b)\}}^{b} \psi (X_j-X_k)\right] \\ \nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\left[ \sum \limits _{j=1}^{b-1} \sum \limits _{k=1}^{b-1} \min \{j,k\} \psi (X_j-X_k)\right. \\\nonumber&+ \sum \limits _{j=1}^{b-1}j \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\\\nonumber&+ \sum \limits _{j=1}^{b-1}\sum \limits _{k=n-b+2}^{n} \min \{(n-b+j-k+1),j\} \psi (X_j-X_k) \\ \nonumber&\left. + \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=1}^{b-1}k \psi (X_j-X_k)+ b \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=n-b+2}^{n} \min \{(n-k+1),b\} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=1}^{b-1} \min \{(n-b+k-j+1),k\} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \min \{(n-j+1),b\} \nonumber \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\right. \\ \nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=n-b+2}^{n} \min \{(n-j+1),(n-k+1)\} \psi (X_j-X_k)\right] \\\nonumber= & {} \dfrac{1}{nb(n-b+1)^2}\left[ \sum \limits _{j=1}^{b-1} \sum \limits _{k=1}^{b-1}\nonumber \min \{j,k\} \psi (X_j-X_k)\right. 
\\ \nonumber&+ \sum \limits _{j=1}^{b-1}j \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k) \\ \nonumber&+ \sum \limits _{j=1}^{b-1}\sum \limits _{k=n-b+2}^{n} \min \{(n-b+j-k+1),j\} \psi (X_j-X_k) \\ \nonumber&\left. + \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=1}^{b-1}k \psi (X_j-X_k)+ b \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\right. \\\nonumber&\left. + \sum \limits _{j=b}^{n-b+1} \sum \limits _{k=n-b+2}^{n} \min \{(n-k+1),b\} \psi (X_j-X_k)\right. \\ \nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=1}^{b-1} \min \{(n-b+k-j+1),k\} \psi (X_j-X_k)\right. \\ \nonumber&\left. + \sum \limits _{j=n-b+2}^{n} \min \{(n-j+1),b\} \sum \limits _{k=b}^{n-b+1} \psi (X_j-X_k)\right. \\&\left. + \sum \limits _{j=n-b+2}^{n} \sum \limits _{k=n-b+2}^{n} \left( n+1-\max \{j,k\}\right) \psi (X_j-X_k)\right] . \end{aligned}$$
(10.10)

On the other hand, if \(b=n\):

$$\begin{aligned}&\dfrac{1}{nb(n-b+1)^2}\sum \limits _{i=1}^b\sum \limits _{j=i}^{n-b+i}\sum \limits _{k=i}^{n-b+i}\psi \left( X_j-X_k\right) = \dfrac{1}{n^2}\sum \limits _{i=1}^n \psi (X_i-X_i)= \dfrac{\psi (0)}{n}. \end{aligned}$$

Finally, we examine the covariance term. We introduce the following notation for the \(n/b\) blocks:

$$J_r=\left\{ (r-1)b+1,(r-1)b+2,\ldots ,rb\right\} , r=1,2,\ldots ,n/b.$$
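The block partition can be written down directly (a small sketch; assumes n is a multiple of b, and the function name is ours):

```python
def blocks(n, b):
    # J_r = {(r-1)b+1, ..., rb} for r = 1, ..., n/b (n a multiple of b)
    assert n % b == 0
    return [list(range((r - 1) * b + 1, r * b + 1))
            for r in range(1, n // b + 1)]
```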

Thus, \(X_i^*\) and \(X_j^*\) are independent (in the bootstrap universe) whenever there is no \(r \in \{1,2,\ldots ,n/b\}\) such that \(i,j\in J_r\). In that case, \(X_i^{*}\) and \(X_j^{*}\) do not belong to the same bootstrap block, implying:

$$Cov^*\left( \left. K_h(x-X_i^*),K_h(x-X_j^*)\right. \right) =0.$$

On the other hand, if there exists \(r \in \{1,2,\ldots ,n/b\}\) such that \(i,j\in J_r\), then \(X_i^{*}\) and \(X_j^{*}\) belong to the same bootstrap block, and the bootstrap distribution of the pair \((X_i^*,X_j^*)\) is identical to that of the pair \((X_{t_i}^*,X_{t_j}^*)\), where \(t_i =[(i-1)\text {mod }b]+1\). As a consequence,

$$Cov^*\left( \left. K_h(x-X_i^*),K_h(x-X_j^*)\right. \right) = Cov^*\left( K_h(x-X_{t_i}^*), K_h(x-X_{t_j}^*)\right) .$$

Thus:

$$Cov^*\left( K_h(x-X_i^*), K_h(x-X_j^*)\right) = \left\{ \begin{array}{lll} Cov^*\left( K_h(x-X_{t_i}^*), K_h(x-X_{t_j}^*)\right) , &{} \text {if}\,\, \exists r/ i,j \in J_r\\ \\ 0, &{} \text {otherwise} \end{array}. \right. $$

Notice that \(\mathbb {E}^*\left[ \left. K_h\left( x-X_i^*\right) \right. \right] =\left( K_h*\hat{f}_g^{(i)}\right) (x) \text { and}\) \(\mathbb {E}^*\left[ \left. K_h\left( x-X_j^*\right) \right. \right] =\left( K_h*\hat{f}_g^{(j)}\right) (x). \) Now, consider \(k,\ell \in \{1,2,\ldots ,b\}\) with \(k < \ell \). Continuing the computation of the covariance term and using:

$$\mathbb {P}^*\left( \left( X_k^{*(d)},X_{\ell }^{*(d)}\right) =\left( X_j,X_{j+\ell -k}\right) \right) =\dfrac{1}{n-b+1}, j=k,k+1,\ldots ,n-b+k,$$

leads to:

$$\begin{aligned}&\dfrac{1}{n^2} \sum \limits _{\begin{array}{c} i,j=1 \\ i \ne j \end{array}}^{n} Cov^{*}\left( K_{h}\left( x-X_{i}^{*}\right) ,K_{h}\left( x-X_{j}^{*}\right) \right) \\ \!= & {} \! \dfrac{1}{n^2} \dfrac{n}{b} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k\ne \ell \end{array}}^{b} Cov^{*}\left( K_{h}\left( x-X_{k}^{*}\right) ,K_{h}\left( x-X_{\ell }^{*}\right) \right) \\ \!= & {} \! \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} Cov^{*}\left( K_{h}\left( x-X_{k}^{*}\right) ,K_{h}\left( x-X_{\ell }^{*}\right) \right) \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \left[ \mathbb {E}^{*}\left( K_{h}\left( x-X_{k}^{*}\right) K_{h}\left( x-X_{\ell }^{*}\right) \right) -\mathbb {E}^{*}\left( K_{h}\left( x-X_{k}^{*}\right) \right) \mathbb {E}^{*}\left( K_{h}\left( x-X_{\ell }^{*}\right) \right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \left[ \mathbb {E}^{*}\left[ \mathbb {E}^{*}\left( \left. K_{h}\left( x-X_{k}^{*}\right) K_{h}\left( x-X_{\ell }^{*}\right) \right| _{U_{k}^{*},U_{\ell }^{*}}\right) \right] - \left( K_h *\hat{f}_g^{(k)}\right) \left( K_h *\hat{f}_g^{(\ell )}\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \left[ \mathbb {E}^{*}\left[ \mathbb {E}^{*}\left( \left. K_{h}\left( x-X_{k}^{*(d) }-gU_k^*\right) K_{h}\left( x-X_{\ell }^{*(d)}-gU_{\ell }^*\right) \right| _{U_{k}^{*},U_{\ell }^{*}}\right) \right] \right. \\&\left. - \left( K_h *\hat{f}_g^{(k)}(x)\right) \left( K_h *\hat{f}_g^{(\ell )}(x)\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} \mathbb {E}^{*}\left[ K_{h}\left( x-X_{j}-gU_k^*\right) K_{h}\left( x-X_{j+\ell -k}-gU_{\ell }^*\right) \right] \right. \\&\left. 
- \left( K_h *\hat{f}_g^{(k)}(x)\right) \left( K_h *\hat{f}_g^{(\ell )}(x)\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} \int \int K_h(x-X_j-gu) K_h(x-X_{j+\ell -k}-gv) K(u) K(v) du dv\right. \\&\left. - \left( K_h *\hat{f}_g^{(k)}(x)\right) \left( K_h *\hat{f}_g^{(\ell )}(x)\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} \int \int K_h(x-X_j-s) K_h(x-X_{j+\ell -k}-t) K_g(s) K_g(t) ds dt\right. \\&\left. - \left( K_h *\hat{f}_g^{(k)}(x)\right) \left( K_h *\hat{f}_g^{(\ell )}(x)\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} (K_h*K_g)(x-X_j) (K_h *K_g)(x-X_{j+\ell -k})\right. \\&\left. - \left( K_h *\hat{f}_g^{(k)}(x)\right) \left( K_h *\hat{f}_g^{(\ell )}(x)\right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} (K_h*K_g)(x-X_j) (K_h *K_g)(x-X_{j+\ell -k})\right. \\&\left. - \left( \dfrac{1}{n-b+1}\sum \limits _{i=k}^{n-b+k}\int K_h(x-y) K_g(y-X_i)dy\right) \right. \\&\left. \times \left( \dfrac{1}{n-b+1}\sum \limits _{j=\ell }^{n-b+\ell }\int K_h(x-y) K_g(y-X_j)dy \right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} (K_h*K_g)(x-X_j) (K_h *K_g)(x-X_{j+\ell -k})\right. \\&\left. - \left( \dfrac{1}{n-b+1}\sum \limits _{i=k}^{n-b+k}\int K_h(x-X_i-u) K_g(u)du\right) \right. \\&\left. 
\times \left( \dfrac{1}{n-b+1}\sum \limits _{j=\ell }^{n-b+\ell }\int K_h(x-X_j-u) K_g(u)du \right) \right] \\= & {} \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k < \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} (K_h*K_g)(x-X_j) (K_h *K_g)(x-X_{j+\ell -k})\right. \\&\left. - \dfrac{1}{(n-b+1)^2}\sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } (K_h*K_g)(x-X_i) (K_h*K_g)(x-X_j) \right] . \\ \end{aligned}$$

The integral with respect to x is now computed (using some changes of variable and the symmetry of the kernel K):

$$\begin{aligned}&\int \dfrac{1}{n^2} \sum \limits _{\begin{array}{c} i,j=1 \\ i \ne j \end{array}}^{n} Cov^*\left( K_h\left( x-X_i^*\right) , K_h\left( x-X_j^*\right) \right) dx\\= & {} \int \left[ \dfrac{2}{nb} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} (K_h*K_g)(x-X_j) (K_h *K_g)(x-X_{j+\ell -k})\right. \right. \\&\left. \left. - \dfrac{1}{(n-b+1)^2}\sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } (K_h*K_g)(x-X_i) (K_h*K_g)(x-X_j) \right] \right] dx\\= & {} \dfrac{2}{nb} \left[ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b}\left[ \dfrac{1}{n-b+1}\sum \limits _{j=k}^{n-b+k} \int (K_h*K_g)(X_{j+\ell -k}-X_j-u) (K_h *K_g)(u)du\right. \right. \\&\left. \left. - \dfrac{1}{(n-b+1)^2}\sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } \int (K_h*K_g)(u) (K_h*K_g)(X_i-X_j-u)du \right] \right] \\= & {} \dfrac{2}{nb(n-b+1)} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{j=k}^{n-b+k}\left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+\ell -k}-X_j)\\&- \dfrac{2}{nb(n-b+1)^2}\sum \limits _{\begin{array}{c} k,\ell =1 \\ k < \ell \end{array}}^{b} \sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{i}-X_j). \end{aligned}$$

Notice that, whenever \(b<n\):

$$\begin{aligned}&\sum \limits _{\begin{array}{c} k,\ell =1 \\ k < \ell \end{array}}^{b} \sum \limits _{j=k}^{n-b+k}\left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+\ell -k}-X_j)\\= & {} \sum \limits _{k=1}^{b-1} \sum \limits _{j=k}^{n-b+k}\sum \limits _{\ell =k+1}^{b} \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+\ell -k}-X_j)\\= & {} \sum \limits _{k=1}^{b-1} \sum \limits _{j=k}^{n-b+k}\sum \limits _{s=1}^{b-k} \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+s}-X_j)\\= & {} \sum \limits _{s=1}^{b-1} \sum \limits _{j=1}^{n-s}\sum \limits _{k=\max \{1,j+b-n\}}^{\min \{j,b-s\}} \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+s}-X_j)\\= & {} \sum \limits _{s=1}^{b-1} \sum \limits _{j=1}^{n-s} \left( \min \{j,b-s\}-\max \{1,j+b-n\}+1\right) \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+s}-X_j). \end{aligned}$$
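The index-counting step in the last display (for each lag \(s=\ell -k\) and position j, the number of admissible k is \(\min \{j,b-s\}-\max \{1,j+b-n\}+1\)) can be verified by brute force (a small verification sketch; function names are ours):

```python
from collections import Counter

def lag_counts_bruteforce(n, b):
    """Count, for each (j, s), how often the lag-s term at position j
    occurs in the triple sum over k < l <= b, j = k, ..., n-b+k."""
    c = Counter()
    for k in range(1, b):
        for l in range(k + 1, b + 1):
            for j in range(k, n - b + k + 1):
                c[(j, l - k)] += 1
    return c

def lag_count_closed_form(n, b, j, s):
    # closed form from the last display
    return min(j, b - s) - max(1, j + b - n) + 1
```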

Now, using the function \(\psi \) and considering \(b<n\), the triple sum \(\sum _{k<\ell } \sum _{i=k}^{n-b+k}\sum _{j=\ell }^{n-b+\ell } \psi (X_i-X_j)\) appearing in the second term can be decomposed as follows:

$$\begin{aligned}&n^{-2}\sum \limits _{\begin{array}{c} i,j=1 \\ i\ne j \end{array}}^{n}\int Cov^*\left( K_h(x-X_i^*),K_h(x-X_j^*)\right) dx \nonumber \\= & {} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } \psi (X_i-X_j)\nonumber \\= & {} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j) + \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=b}^{n-b+2} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j) + \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=b}^{n-b+2} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j) + \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=b}^{n-b+2}\psi (X_i-X_j) +\sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j)\nonumber \\= & {} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j) + \sum \limits _{k=1}^{b-1}\sum \limits _{\ell =k+1}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=b}^{n-b+2} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\ell =2}^{b}\sum \limits _{k=1}^{\ell -1} \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=\ell }^{b-1}\psi (X_i-X_j) + \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=b}^{n-b+2}\sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\ell =2}^{b} \sum \limits _{k=1}^{\ell -1}\sum \limits _{i=b-1}^{n-b+1} \sum \limits _{j=n-b+3}^{n-b+\ell }\psi (X_i-X_j) + \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{k=1}^{b-1}\sum \limits _{\ell =k+1}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=b}^{n-b+2}\psi (X_i-X_j)\nonumber \\&+\sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j)\nonumber \\= & {} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j) + \sum \limits _{k=1}^{b-1}(b-k) \sum \limits _{i=k}^{b-2}\sum \limits _{j=b}^{n-b+2} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=k}^{b-2}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\ell =2}^{b}(\ell -1) \sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=\ell }^{b-1}\psi (X_i-X_j) + \dfrac{b(b-1)}{2}\sum \limits _{i=b-1}^{n-b+1}\sum \limits _{j=b}^{n-b+2} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{\ell =2}^{b}(\ell -1) \sum \limits _{i=b-1}^{n-b+1} \sum \limits _{j=n-b+3}^{n-b+\ell }\psi (X_i-X_j) + \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=\ell }^{b-1} \psi (X_i-X_j)\nonumber \\&+ \sum \limits _{k=1}^{b-1}(b-k) \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=b}^{n-b+2}\psi (X_i-X_j)\nonumber \\&+\sum \limits _{\begin{array}{c} k,\ell =1 \\ k < \ell \end{array}}^{b} \sum \limits _{i=n-b+2}^{n-b+k}\sum \limits _{j=n-b+3}^{n-b+\ell } \psi (X_i-X_j). \end{aligned}$$
(10.11)
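The decomposition in (10.11) rests on two purely combinatorial facts: the index rectangle \([k,n-b+k]\times [\ell ,n-b+\ell ]\) splits into nine sub-rectangles, and the \((k,\ell )\)-sums whose inner ranges do not involve \(k\) or \(\ell \) collapse to the counts \(b-k\), \(\ell -1\) and \(b(b-1)/2\). The following snippet, which is not part of the chapter, is a minimal numerical sanity check of these steps for small illustrative values of \(n\) and \(b\), with \(\psi (X_i-X_j)\) replaced by an arbitrary integer matrix:

```python
import random

# Sanity check of the range-splitting and counting steps behind (10.11).
# psi(X_i - X_j) is represented by an arbitrary random integer matrix M
# (exact integer arithmetic, so equality can be tested exactly).
# Indices are 1-based, as in the text; n and b are illustrative choices.
random.seed(1)
n, b = 12, 4
M = [[random.randint(-5, 5) for _ in range(n + 1)] for _ in range(n + 1)]

def S(i_lo, i_hi, j_lo, j_hi):
    """Sum of M[i][j] over i in [i_lo, i_hi] and j in [j_lo, j_hi], inclusive."""
    return sum(M[i][j] for i in range(i_lo, i_hi + 1)
                       for j in range(j_lo, j_hi + 1))

# Left-hand side: the un-split quadruple sum over 1 <= k < l <= b.
lhs = sum(S(k, n - b + k, l, n - b + l)
          for k in range(1, b + 1) for l in range(k + 1, b + 1))

# Right-hand side: the nine terms of the last equality in (10.11),
# with the collapsed counts (b-k), (l-1) and b(b-1)/2.
rhs = (
    sum(S(k, b - 2, l, b - 1)
        for k in range(1, b + 1) for l in range(k + 1, b + 1))
    + sum((b - k) * S(k, b - 2, b, n - b + 2) for k in range(1, b))
    + sum(S(k, b - 2, n - b + 3, n - b + l)
          for k in range(1, b + 1) for l in range(k + 1, b + 1))
    + sum((l - 1) * S(b - 1, n - b + 1, l, b - 1) for l in range(2, b + 1))
    + b * (b - 1) // 2 * S(b - 1, n - b + 1, b, n - b + 2)
    + sum((l - 1) * S(b - 1, n - b + 1, n - b + 3, n - b + l)
          for l in range(2, b + 1))
    + sum(S(n - b + 2, n - b + k, l, b - 1)
          for k in range(1, b + 1) for l in range(k + 1, b + 1))
    + sum((b - k) * S(n - b + 2, n - b + k, b, n - b + 2) for k in range(1, b))
    + sum(S(n - b + 2, n - b + k, n - b + 3, n - b + l)
          for k in range(1, b + 1) for l in range(k + 1, b + 1))
)

print(lhs == rhs)  # True: the nine-term decomposition matches the original sum
```

Empty index ranges (which occur near the boundary cases \(k=b-1\) or \(\ell =2\)) contribute nothing, matching the usual empty-sum convention used in the derivation.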

On the other hand, if \(b=n\), then using the symmetry of the kernel \(K\) we obtain:

$$\begin{aligned}&\dfrac{2}{nb(n-b+1)} \sum \limits _{\begin{array}{c} k,\ell =1 \\ k< \ell \end{array}}^{b} \nonumber \sum \limits _{j=k}^{n-b+k}\left[ (K_h*K_g)*(K_h*K_g)\right] (X_{j+\ell -k}-X_j) \nonumber \\ \nonumber&- \dfrac{2}{nb(n-b+1)^2}\sum \limits _{\begin{array}{c} k,\ell =1 \\ k < \ell \end{array}}^{b} \sum \limits _{i=k}^{n-b+k}\sum \limits _{j=\ell }^{n-b+\ell } \left[ (K_h*K_g)*(K_h*K_g)\right] (X_{i}-X_j)\\= & {} \dfrac{2}{n^2} \sum \limits _{k=1}^{n-1}\sum \limits _{\ell =k+1}^{n} \psi \left( X_\ell -X_k\right) - \dfrac{2}{n^2} \sum \limits _{k=1}^{n-1}\sum \limits _{\ell =k+1}^{n} \psi \left( X_k-X_\ell \right) =0. \end{aligned}$$
(10.12)
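The cancellation in (10.12) uses only that \(\psi \) is symmetric, which it inherits from the symmetric kernel: when \(b=n\) the inner sums reduce to single terms, leaving \(\frac{2}{n^2}\sum _{k<\ell }\psi (X_\ell -X_k)-\frac{2}{n^2}\sum _{k<\ell }\psi (X_k-X_\ell )=0\). The snippet below, not part of the chapter, illustrates this numerically; the Gaussian kernels, bandwidths and data are illustrative choices, using the fact that for Gaussian \(K\) the double convolution \((K_h*K_g)*(K_h*K_g)\) is the \(N(0,2(h^2+g^2))\) density:

```python
import math
import random

# Illustration of (10.12) for b = n: the two sums cancel because psi is
# symmetric. psi is taken as the N(0, 2(h^2+g^2)) density, i.e. the double
# convolution (K_h*K_g)*(K_h*K_g) for Gaussian kernels; h, g and the data
# are arbitrary illustrative choices.
def psi(u, h=0.3, g=0.2):
    s2 = 2.0 * (h * h + g * g)  # variance of the double convolution
    return math.exp(-u * u / (2.0 * s2)) / math.sqrt(2.0 * math.pi * s2)

random.seed(1)
n = 15
b = n                                   # the case b = n of (10.12)
X = [0.0] + [random.gauss(0.0, 1.0) for _ in range(n)]  # X[1..n], 1-based

# First term of (10.12): note that with b = n the j-range is the single
# point j = k, so the argument is X[l] - X[k].
first = (2.0 / (n * b * (n - b + 1))) * sum(
    psi(X[j + l - k] - X[j])
    for k in range(1, b + 1) for l in range(k + 1, b + 1)
    for j in range(k, n - b + k + 1))

# Second term: i = k and j = l are the only index values, giving X[k] - X[l].
second = (2.0 / (n * b * (n - b + 1) ** 2)) * sum(
    psi(X[i] - X[j])
    for k in range(1, b + 1) for l in range(k + 1, b + 1)
    for i in range(k, n - b + k + 1) for j in range(l, n - b + l + 1))

print(abs(first - second) < 1e-12)  # True: the two terms cancel by symmetry
```

Since \(\psi (u)=\psi (-u)\) holds exactly (the argument enters only through \(u^2\)), the two sums agree term by term.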

Using (10.9), (10.10) and (10.11) in (10.8), and then substituting the result, together with (10.7), into (10.6) gives the statement of Theorem 1 for \(b<n\). The case \(b=n\) is even simpler, using (10.12).

Copyright information

© 2017 Springer International Publishing AG

Cite this chapter

Barbeito, I., Cao, R. (2017). A Review and Some New Proposals for Bandwidth Selection in Nonparametric Density Estimation for Dependent Data. In: Ferger, D., González Manteiga, W., Schmidt, T., Wang, JL. (eds) From Statistics to Mathematical Finance. Springer, Cham. https://doi.org/10.1007/978-3-319-50986-0_10