
Asymptotic normality of some conditional nonparametric functional parameters in high-dimensional statistics

  • Original Paper
  • Behaviormetrika

Abstract

This paper deals with the convergence in distribution of estimators of some conditional parameters in the Functional Data Analysis framework. Specifically, we consider models where the input is of functional kind and the output is scalar. We then establish the asymptotic normality of the nonparametric local linear estimators of (1) the conditional distribution function and (2) the successive derivatives of the conditional density. Moreover, as a by-product, we deduce the asymptotic normality of the local linear estimator of the conditional mode. Finally, to illustrate the practical interest of our results, we conducted a computational study, first on simulated data and then on real data concerning forage quality.


Notes

  1. Let \((z_{n})_{n\in{{\mathbb{N}}}}\) be a sequence of real random variables. We say that \((z_n)\) converges almost-completely (a.co.) toward zero if, and only if, for all \(\epsilon > 0\), \(\sum _{n=1}^\infty {I\!\!P}(|z_n|>\epsilon ) < \infty\). Moreover, we say that the rate of the almost-complete convergence of \((z_n)\) to zero is of order \(u_n\) (with \(u_n\rightarrow 0\)) and we write \(z_n = O_{\text {a.co.}}(u_n)\) if, and only if, there exists \(\epsilon > 0\) such that \(\sum _{n=1}^\infty {I\!\!P}(|z_n|>\epsilon u_n) < \infty\). This kind of convergence implies both almost-sure convergence and convergence in probability.
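To make the definition concrete, here is a minimal Python sketch (our own illustration, not taken from the paper): for the sample mean \(z_n\) of \(n\) iid Uniform\((-1,1)\) variables, Hoeffding's inequality gives \({I\!\!P}(|z_n|>\epsilon )\le 2\exp (-n\epsilon ^2/2)\), which is summable in \(n\), so \(z_n\rightarrow 0\) almost-completely; the Monte Carlo estimates of \({I\!\!P}(|z_n|>\epsilon )\) below decay accordingly.

```python
# Monte Carlo illustration (hypothetical example) of almost-complete
# convergence: the exceedance probabilities P(|z_n| > eps) fall off fast
# enough in n to be summable.
import numpy as np

def exceedance_prob(n, eps=0.2, reps=10_000, seed=0):
    """Estimate P(|z_n| > eps) for z_n = mean of n iid Uniform(-1, 1)."""
    rng = np.random.default_rng(seed)
    z = rng.uniform(-1.0, 1.0, size=(reps, n)).mean(axis=1)
    return float(np.mean(np.abs(z) > eps))

probs = [exceedance_prob(n) for n in (50, 200, 800)]
print(probs)  # sharply decreasing in n
```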

References

  • Baìllo A, Grané A (2009) Local linear regression for functional predictor and scalar response. J Multivar Anal 100:102–111

    Article  MathSciNet  MATH  Google Scholar 

  • Barrientos J, Ferraty F, Vieu P (2010) Locally modelled regression and functional data. J Nonparametric Stat 22:617–632

    Article  MathSciNet  MATH  Google Scholar 

  • Benhenni K, Ferraty F, Rachdi M, Vieu P (2007) Local smoothing regression with functional data. Comput Stat 22:353–369

    Article  MathSciNet  MATH  Google Scholar 

  • Berlinet A, Elamine A, Mas A (2011) Local linear regression for functional data. Inst Stat Math 63:1047–1075

    Article  MathSciNet  MATH  Google Scholar 

  • Boj E, Delicado P, Fortiana J (2010) Distance-based local linear regression for functional predictors. Comput Stat Data Anal 54:429–437

    Article  MathSciNet  MATH  Google Scholar 

  • Chen J, Zhu R, Xu R, Zhang W, Shen Y, Zhang Y (2015) Evaluation of Leymus chinensis quality using near-infrared reflectance spectroscopy with three different statistical analyses. Peer J 3:e1416. https://doi.org/10.7717/peerj.1416

  • Delsol L (2009) Advances on asymptotic normality in nonparametric functional time series analysis. Statistics 43:13–33

    Article  MathSciNet  MATH  Google Scholar 

  • Demongeot J, Laksaci A, Madani F, Rachdi M (2013) Functional data: local linear estimation of the conditional density and its application. Statistics 47:26–44

    Article  MathSciNet  MATH  Google Scholar 

  • Demongeot J, Laksaci A, Rachdi M, Rahmani S (2014) On the local linear modelization of the conditional distribution for functional data. Sankhya 76:328–355

    Article  MathSciNet  MATH  Google Scholar 

  • El Methni M, Rachdi M (2011) Local weighted average estimation of the regression operator for functional data. Commun Stat Theory Methods 40:3141–3153

  • Ezzahrioui M, Ould-Saïd E (2008a) Asymptotic normality of a nonparametric estimator of the conditional mode function for functional data. J Nonparametric Stat 20:3–18

    Article  MathSciNet  MATH  Google Scholar 

  • Ezzahrioui M, Ould-Saïd E (2008b) Asymptotic normality of the kernel estimator of conditional quantiles in a normed space. Far East J Theor Stat 25:15–38

    MathSciNet  MATH  Google Scholar 

  • Ezzahrioui M, Ould-Saïd E (2008c) Asymptotic results of a nonparametric conditional quantile estimator for functional time series. Commun Stat Theory Methods 37:2735–2759

    Article  MathSciNet  MATH  Google Scholar 

  • Ferraty F, Mas A, Vieu P (2007) Nonparametric regression on functional data: inference and practical aspects. Aust N Z J Stat 49:267–286

    Article  MathSciNet  MATH  Google Scholar 

  • Ferraty F, Laksaci A, Vieu P (2006a) Estimation of some characteristics of the conditional distribution in nonparametric functional models. Stat Inf Stoch Process 9:47–76

    Article  MATH  Google Scholar 

  • Ferraty F, Vieu P (2006b) Nonparametric functional data analysis: theory and Practice. Springer Series in Statistics, New York

    MATH  Google Scholar 

  • Ferraty F, Laksaci A, Tadj A, Vieu P (2011) Kernel regression with functional response. Electron J Stat 5:159–171

    Article  MathSciNet  MATH  Google Scholar 

  • Kara-Zaïtri L, Laksaci A, Rachdi M, Vieu P (2017a) Uniform in the smoothing parameter consistency results in functional regression. In: Aneiros G, Bongiorno GE, Cao R, Vieu P (eds) Functional statistics and related fields. Contributions to statistics. Springer, Cham. ISBN: 978-3-319-55845-5, pp 161–167

  • Kara-Zaïtri L, Laksaci A, Rachdi M, Vieu P (2017b) Data-driven kNN estimation in nonparametric functional data analysis. J Multivar Anal 153:176–188

    Article  MATH  Google Scholar 

  • Kudraszow NL, Vieu P (2013) Uniform consistency of kNN regressors for functional variables. Stat Probab Lett 83:1863–1870

    Article  MATH  Google Scholar 

  • Laksaci A, Rachdi M, Rahmani S (2013) Spatial modelization: local linear estimation of the conditional distribution for functional data. Spat Stat 6:1–23

    Article  Google Scholar 

  • Louani D, Ould-Saïd E (1999) Asymptotic normality of kernel estimators of the conditional mode under strong mixing hypothesis. J Nonparametric Stat 11:413–442

    Article  MathSciNet  MATH  Google Scholar 

  • Masry E (2005) Nonparametric regression estimation for dependent functional data: asymptotic normality. Stoch Proc Appl 115:155–177

    Article  MathSciNet  MATH  Google Scholar 

  • Rachdi M, Laksaci A, Demongeot J, Abdali A, Madani F (2014) Theoretical and practical aspects on the quadratique error in the local linear estimation of the conditional density for functional. Comput Stat Data Anal 73(2):53–68

  • Ramsay J, Silverman B (2005) Functional data analysis. Springer Series in Statistics, Springer, New York

    MATH  Google Scholar 

  • Zhou Z, Lin ZY (2016) Asymptotic normality of locally modelled regression estimator for functional data. J Nonparametric Stat 28(1):116–131


Author information


Corresponding author

Correspondence to Mustapha Rachdi.

Additional information

Communicated by: Hidetoshi Matsui.

Appendix


Proof of Lemma 6.2

By applying the Bienaymé–Chebyshev inequality, we obtain, for all \(\varepsilon >0\), as \(n\rightarrow \infty\),

$$\begin{aligned} {I\!\!P}(\mid \widehat{F}^{x}_D-{\text {IE}}(\widehat{F}^{x}_D)\mid >\varepsilon )\leqslant & {} \frac{\mathrm{var}(\widehat{F}^{x}_D)}{\varepsilon ^{2}} \leqslant \varepsilon ^{-2}O\left( \frac{1}{n\phi _{x}(h_{K})}\right) \rightarrow 0. \end{aligned}$$

\(\square\)
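The Chebyshev step above is easy to check numerically. The following Python sketch (our own illustration with an arbitrary exponential example; it does not reproduce the paper's simulation study) compares a Monte Carlo tail probability with the bound \(\mathrm{var}(X)/\varepsilon ^{2}\):

```python
# Quick numerical sanity check of the Chebyshev bound
# P(|X - IE X| > eps) <= var(X) / eps^2, for X ~ Exponential(1).
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=200_000)   # IE X = 1, var X = 1
eps = 2.0
tail = float(np.mean(np.abs(x - 1.0) > eps))   # = P(X > 3) = e^{-3} here
bound = float(np.var(x)) / eps**2              # ~ 1/4
print(tail, bound)
```

The bound is loose, as usual for Chebyshev, but it is exactly what the proof of Lemma 6.2 needs: summability is not required, only convergence of the variance term to zero.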

Proof of Lemma 6.5

We start by writing

$$\begin{aligned} {\text {IE}}\left( \widehat{f}^{x(j)}_{N}(y)\right)= & {} \frac{1}{h^{j+1}_{H}{\text {IE}}(\Delta _{1}K_{1})} {\text {IE}}\left( \Delta _{1}K_{1}H^{(j+1)}_{1}\right) \nonumber \\= & {} \frac{1}{h^{j+1}_{H}{\text {IE}}\left( (n-1)W_{12}\right) } \; {\text {IE}}\left( (n-1)W_{12} \; {\text {IE}}\left( H^{(j+1)}_{1}\big |X_{1}\right) \right) , \end{aligned}$$
(20)

where

$$\begin{aligned} {\text {IE}}\left( H^{(j+1)}_{1}\big | X_{1}\right) \; = \; h^{j+1}_{H}\int _{{\mathbb {R}}} H^{(1)}(t)\; f^{X_{1}(j)} (y-h_{H}t) \; {\text {d}}t. \end{aligned}$$

Then, using a Taylor expansion, of order two, of \(f^{X_{1}(j)}\) and under assumption (H2’), we obtain

$$\begin{aligned} {\text {IE}}\left( H^{(j+1)}_{1}\big |X_{1}\right) =h^{j+1}_{H}\left( f^{X_{1}(j)}(y)+ \frac{h_{\scriptstyle {H}}^2}{2}\int t^2 H^{(1)}(t){\text {d}}t\; \frac{\partial ^{j+2} f^{X_{1}}(y)}{\partial y^{j+2}} + o(h_H^2)\right) . \end{aligned}$$

Then

$$\begin{aligned} {\text {IE}}\left( \widehat{f}^{x(j)}_{N}(y)\right)= & \frac{1}{{\text {IE}}\left( W_{12}\right) } \left( {\text {IE}}\left( W_{12}\,\, f^{X_{1}(j)} (y)\right) \right. \\&\left. + \frac{h_{\scriptstyle {H}}^2}{2}\int t^2 H^{(1)}(t){\text {d}}t\ \ {\text {IE}}\left( W_{12}\frac{\partial ^{j+2} f^{X_{1}}(y)}{\partial y^{j+2}}\right) + o(h_H^2)\right) , \end{aligned}$$

and by assumption (H1’), we get

$$\begin{aligned} {\text {IE}}\left( \widehat{f}^{x(j)}_{N}(y)\right)= & {} \frac{1}{{\text {IE}}\left( W_{12}\right) } \big ( {\text {IE}}\left( W_{12}\varphi _{j}(X_{1},y)\right) \\&+ \frac{h_{\scriptstyle {H}}^2}{2}\int t^2 H^{(1)}(t){\text {d}}t \, {\text {IE}}\left( W_{12} \varphi _{j+2}(X_{1},y)\right) + o(h_H^2)\big ). \end{aligned}$$

Since \({\text {IE}}\left( \beta ^{2}_{1}W_{12}\right) =0\), from assumption (H1’) with \(l\in \{j,j+2\}\), and the fact that \(\psi _{l}(0) =0\), we obtain

$$\begin{aligned}&{{\text {IE}}\left( W_{12}\varphi _{l}(X_{1},y)\right) }\\&\quad = \varphi _{l}(x,y){\text {IE}}\left( W_{12}\right) \;+\; {\text {IE}}\left( W_{12} {\text {IE}}\left( \varphi _{l}(X_{1},y) - \varphi _{l}(x,y)\big | \beta ( X_{1}, x)\right) \right) \\&\quad = \varphi _{l}(x, y) {\text {IE}}\left( W_{12}\right) + \frac{1}{2}\psi _{l}^{(2)} (0){\text {IE}}\left( W_{12} \beta ^{2}( X_{1}, x)\right) +o\left( {\text {IE}}\left( W_{12} \beta ^{2}( X_{1}, x)\right) \right) . \end{aligned}$$

Thus

$$\begin{aligned}&{{\text {IE}}\left( \widehat{f}^{x(j)}_{N}(y)\right) }\\&\quad = f^{x(j)}(y) \; +\; \frac{h_{H}^2}{2} \; \frac{\partial ^{j+2} f^{x}(y)}{\partial y^{j+2}} \; \int t^2 H^{(1)}(t) {\text {d}}t \; + \; o \left( h_{H}^{2} \; \frac{{\text {IE}}\left( \beta ( X_{1}, x) W_{12}\right) }{{\text {IE}}\left( W_{12}\right) } \right) \\&\qquad + \psi ^{(2)}_{j} (0) \; \frac{{\text {IE}}\left( \beta (X_{1}, x) W_{12}\right) }{{\text {IE}}\left( W_{12}\right) } \; +\; o \left( \frac{{\text {IE}}\left( \beta (X_{1}, x) W_{12}\right) }{{\text {IE}}\left( W_{12}\right) } \right) . \end{aligned}$$

Moreover, it is clear that \({\text {IE}}\left( \beta ^{2}(X_{1}, x) W_{12}\right) ={\text {IE}}\left( K_{1}\beta _{1}^{2}\right) ^{2}-{\text {IE}}\left( K_{1}\beta _{1}\right) {\text {IE}}\left( K_{1}\beta _{1}^{3}\right)\). Then, using Lemma 6.1’s result, we find

$$\begin{aligned} {\text {IE}}\left( \beta ^{2}(X_{1}, x) W_{12}\right)= & {} N(1,2)^{2}h_{K}^{4}\phi _{x}(h_{K})^{2}-o(h_{K}\phi _{x}(h_{K}))\,O(h_{K}^{3}\phi _{x}(h_{K})),\end{aligned}$$
(21)
$$\begin{aligned} \mathrm{and }\quad {\text {IE}}\left( W_{12}\right)= & {} N(1,2)M_{1}h_{K}^{2}\phi ^{2}_{x}(h_{K}). \end{aligned}$$
(22)

Finally, by combining (21) and (22) we obtain

$$\begin{aligned} \frac{{\text {IE}}\left( \beta ^{2} (X_{1}, x) W_{12}\right) }{{\text {IE}}\left( W_{12}\right) } \; = \; h_{K}^{2} \; \frac{N(1, 2) }{M_{1}} + o(h^{2}_{K}). \end{aligned}$$

\(\square\)
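The \(h_{H}^{2}\) bias term obtained above has a familiar finite-dimensional analogue that can be checked exactly. In the sketch below (our own illustration, not the paper's functional setting), the data are \(N(0,1)\) and the kernel is Gaussian, so the expected kernel density estimate at \(y\) is the \(N(0,1+h^{2})\) density at \(y\) in closed form; the ratio \(\mathrm{bias}(h)/\mathrm{bias}(h/2)\) then approaches \(4\), confirming the \(O(h^{2})\) rate.

```python
# Deterministic check that kernel smoothing bias is of order h^2:
# for N(0,1) data and a Gaussian kernel, IE f_hat(y) equals the
# N(0, 1 + h^2) density at y, so the bias is available exactly.
from math import exp, pi, sqrt

def normal_pdf(y, s=1.0):
    """Density of N(0, s^2) at y."""
    return exp(-0.5 * (y / s) ** 2) / (s * sqrt(2.0 * pi))

def kde_bias(y, h):
    """Exact bias IE f_hat(y) - f(y) of the Gaussian-kernel KDE."""
    return normal_pdf(y, sqrt(1.0 + h * h)) - normal_pdf(y)

y = 0.5
ratios = [kde_bias(y, h) / kde_bias(y, h / 2) for h in (0.4, 0.2, 0.1)]
print(ratios)  # each ratio close to 4, i.e. bias = O(h^2)
```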

Proofs of Lemmas 6.3 and 6.7

We have

$$\begin{aligned}&{\frac{1}{h^{l}_{H}}{\text {IE}}\left( K^{2}_{1}\mathrm{var}\left( H^{(j+1)}\left( \frac{y-Y_{1}}{h}\right) \big |X_{1}\right) \right) }\nonumber \\&\quad =\frac{1}{h^{l}_{H}}{\text {IE}}\left( K_{1}^{2}{\text {IE}}\left( \left( H^{(j+1)}\left( \frac{y-Y_{1}}{h_{H}}\right) \right) ^{2}\big | X_{1}\right) \right) \end{aligned}$$
(23)
$$\begin{aligned}&\qquad -h^{l}_{H}{\text {IE}}\left( K_{1}^{2}{\text {IE}}^{2}\left( \frac{1}{h^{l}_{H}}H^{(j+1)}\left( \frac{y-Y_{1}}{h_{H}}\right) \big | X_{1}\right) \right) . \end{aligned}$$
(24)

Concerning the term (23), under assumptions (H5) (for \(j=-1\) and \(l=0\)) and (H7) (for \(j\ge 0\) and \(l=1\)), and by integration by parts followed by a change of variable, we get

$$\begin{aligned} \frac{1}{h^{l}_{H}}{\text {IE}}\left( \left( H^{(j+1)}\left( \frac{y-Y_{1}}{h_{H}}\right) \right) ^{2}\big | X_{1}\right)= & {} \int _{{\mathbb {R}}}\left( H^{(j+1)}\left( \frac{y-z}{h_{H}}\right) \right) ^{2} f^{X_{1}}(z){\text {d}}z \\= & {} h^{l-1}_{H}\int _{{\mathbb {R}}}(H^{(j+1)}(t))^{2}f^{X_{1}}(y-th_{H}){\text {d}}t\\= & {} h^{l-1}_{H}\int _{{\mathbb {R}}}(H^{(j+1)}(t))^{2}\,{\text {d}}F^{X_{1}}(y-th_{H}). \end{aligned}$$

\(\square\)

Proof of Lemma 6.3

For \(j=-1\) and \(l=0\), by noting \(H^{(0)}\left( \frac{y-Y_{1}}{h_{H}}\right) =H\left( \frac{y-Y_{1}}{h_{H}}\right)\), we have

$$\begin{aligned} {\text {IE}}\left( H^{2}\left( \frac{y-Y_{1}}{h_{H}}\right) \big | X_{1}\right)= & {} h^{-1}_{H}\int _{{\mathbb {R}}}H^{2}(t)\,{\text {d}}F^{X_{1}}(y-th_{H})\\= & {} \int _{{\mathbb {R}}}2H^{(1)}(t)H(t)\left( F^{X_{1}}(y-th_{H})-F^{x}(y)\right) {\text {d}}t\\&+\int _{{\mathbb {R}}}2H^{(1)}(t){H(t)}F^{x}(y){\text {d}}t. \end{aligned}$$

Since \(\int _{{\mathbb {R}}}2H^{(1)}(t)\, {H(t)}F^{x}(y){\text {d}}t=F^{x}(y)\), we deduce that, as \(n\rightarrow \infty\),

$$\begin{aligned} {\text {IE}}\left( K_{1}^{2}H^{2}\left( \frac{y-Y_{1}}{h_{H}}\right) \big | X_{1}\right) \rightarrow {\text {IE}}(K_{1}^{2})F^{x}(y), \end{aligned}$$

and

$$\begin{aligned} {\text {IE}}\left( H\left( \frac{y-Y_{1}}{h_{H}}\right) \big | X_{1}\right) -F^{x}(y)\rightarrow 0. \end{aligned}$$

So, the inner conditional expectation in the term (24) tends to \((F^{x}(y))^{2}\) as n tends to infinity. Then

$$\begin{aligned}{\text {IE}}\left( K_{1}^{2}{\text {IE}}^{2}\left( H\left( \frac{y-Y_{1}}{h_{H}}\right) \big | X_{1}\right) \right) \rightarrow {\text {IE}}\left( K_{1}^{2}(F^{x}(y))^{2}\right) ={\text {IE}}\left( K_{1}^{2}\right) (F^{x}(y))^{2}.\end{aligned}$$

Finally, as \(n\rightarrow \infty\), we have

$$\begin{aligned} {\text {IE}}\left( K_{1}^{2}\mathrm{var}\left( H\left( \frac{y-Y_{1}}{h}\right) \big |X_{1}\right) \right) \rightarrow {\text {IE}}(K_{1}^{2})F^{x}(y)\left( 1-F^{x}(y)\right) . \end{aligned}$$

\(\square\)
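The Bernoulli-type limit \(F^{x}(y)(1-F^{x}(y))\) can be illustrated in a one-dimensional setting (our own numerical sketch, not the paper's simulation): with a cdf-type kernel \(H\), the variance of \(H((y-Y)/h)\) tends, as the bandwidth shrinks, to the variance of the indicator \(1\{Y\le y\}\) that it approximates.

```python
# Variance of a smoothed indicator H((y - Y)/h) for Y ~ N(0,1), with a
# logistic cdf playing the role of H; as h -> 0 the variance approaches
# F(y)(1 - F(y)), the Bernoulli variance of 1{Y <= y}.
import numpy as np
from math import erf, sqrt

def norm_cdf(t):
    """Standard normal cdf Phi(t)."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

rng = np.random.default_rng(2)
Y = rng.standard_normal(200_000)
y_point = 0.5
limit = norm_cdf(y_point) * (1.0 - norm_cdf(y_point))  # F(y)(1 - F(y))

variances = []
for h in (1.0, 0.1, 0.01):
    t = np.clip((y_point - Y) / h, -50.0, 50.0)   # clip to avoid exp overflow
    smoothed = 1.0 / (1.0 + np.exp(-t))           # logistic cdf kernel H
    variances.append(float(np.var(smoothed)))

print(variances, limit)  # variances increase toward the Bernoulli limit
```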

Proof of Lemma 6.7

For \(j\ge 0\) and \(l=1\), we have

$$\begin{aligned} \frac{1}{h^{l}_{H}}{\text {IE}}\left( \left( H^{(j+1)}\left( \frac{y-Y_{1}}{h_{H}}\right) \right) ^{2}\big | X_{1}\right)= & {} \int _{{\mathbb {R}}}(H^{(j+1)}(t))^{2}\left( f^{X_{1}}(y-t h_{H})-f^{x}(y)\right) {\text {d}}t\\&+ \int _{{\mathbb {R}}}(H^{(j+1)}(t))^{2}f^{x}(y){\text {d}}t. \end{aligned}$$

Remark that \(f^{X_{1}}(y-t h_{H})-f^{x}(y)=o(1)\) as \(n\rightarrow \infty\). Then we deduce that

$$\begin{aligned} \frac{1}{h_{H}}{\text {IE}}\left( \left( H^{(j+1)}\left( \frac{y-Y_{1}}{h_{H}}\right) \right) ^{2}\big | X_{1}\right) \rightarrow f^{x}(y)\int (H^{(j+1)}(t))^{2}{\text {d}}t, \end{aligned}$$

and

$$\begin{aligned} {\text {IE}}\left( H_{1}^{(j+1)}(y)\big | X_{1}\right)= & {} \int _{{\mathbb {R}}}H^{(j+1)}(h_{H}^{-1}(y-z))f^{X_{1}}(z){\text {d}}z\nonumber \\= & {} -\sum _{l=1}^{j}h^{l}_{H}\left[ H^{(j+1-l)}(h_{H}^{-1}(y-z))f^{X_{1}(l-1)}(z)\right] _{-\infty }^{+\infty }\nonumber \\&\ + h^{j}_{H}\int _{{\mathbb {R}}}H^{(1)}(h_{H}^{-1}(y-z))f^{X_{1}(j)}(z){\text {d}}z. \end{aligned}$$
(25)

So, assumption (H7) makes the first term on the right-hand side of (25) vanish. Then, using a Taylor expansion followed by a change of variable, we obtain

$$\begin{aligned} \frac{1}{h_{H}}{\text {IE}}\left( H_{1}^{(j+1)}(y)\mid X_{1}\right) \; -\;h^{j}_{H}f^{X_{1}(j)}(y) \rightarrow 0 \quad \mathrm{as}\quad n\rightarrow \infty . \end{aligned}$$

\(\square\)
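The integration by parts behind (25) can be sanity-checked numerically in a one-dimensional setting (our own illustration: \(j=1\), a logistic cdf in the role of \(H\), and a standard normal density for \(f\)); since \(H^{(1)}\) vanishes at \(\pm \infty\), the boundary terms drop out and only the last integral survives.

```python
# Numerical check of the integration-by-parts identity (case j = 1):
# ∫ H''((y-z)/h) f(z) dz = h ∫ H'((y-z)/h) f'(z) dz.
import numpy as np

h, y = 0.3, 0.5
z = np.linspace(-12.0, 12.0, 200_001)          # quadrature grid
dz = z[1] - z[0]
t = (y - z) / h
sig = 1.0 / (1.0 + np.exp(-np.clip(t, -50.0, 50.0)))  # H(t), logistic cdf
H1 = sig * (1.0 - sig)                          # H'(t), logistic density
H2 = H1 * (1.0 - 2.0 * sig)                     # H''(t)
f = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)  # f(z), N(0,1) density
f1 = -z * f                                     # f'(z)

lhs = np.sum(H2 * f) * dz        # ∫ H''((y-z)/h) f(z) dz
rhs = h * np.sum(H1 * f1) * dz   # h ∫ H'((y-z)/h) f'(z) dz
print(lhs, rhs)                  # the two quadratures agree
```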

Proof of Lemma 6.6

We have that

$$\begin{aligned} \mathrm{var}\left( \widehat{f}^{x(j)}_{N} (y)\right)&= \frac{1}{\left( n(n-1)h^{j+1}_{H}{\text {IE}}\left( W_{12}\right) \right) ^{2}} \; \mathrm{var}\left( \displaystyle \sum _{i \ne k=1}^{n} W_{ik} H^{(j+1)}_{k}\right) \nonumber \\&= \frac{1}{\left( n(n-1)h^{j+1}_{H} {\text {IE}}\left( W_{12}\right) \right) ^{2}} \; \Big ( n(n-1) {\text {IE}}\left( W_{12}^{2} H_{2}^{(j+1)^{2}}\right) \nonumber \\&\quad + n(n-1) {\text {IE}}\left( W_{12} W_{21} H^{(j+1)}_{1} H^{(j+1)}_{2}\right) \nonumber \\&\quad + n(n-1)(n-2) {\text {IE}}\left( W_{12} W_{13} H^{(j+1)}_{2} H^{(j+1)}_{3}\right) \nonumber \\&\quad + n(n-1)(n-2) {\text {IE}}\left( W_{12} W_{23} H^{(j+1)}_{2}H^{(j+1)}_{3}\right) \nonumber \\&\quad + n(n-1)(n-2) {\text {IE}}\left( W_{12} W_{31} H^{(j+1)}_{2} H^{(j+1)}_{1}\right) \nonumber \\&\quad + n(n-1)(n-2) {\text {IE}}\left( W_{12} W_{32}H_{2}^{(j+1)^{2}}\right) \nonumber \\&\quad - n(n-1)(4n-6)\left( {\text {IE}}\left( W_{12} H^{(j+1)}_{2}\right) \right) ^{2}\Big ). \end{aligned}$$
(26)

Observe that

$$\begin{aligned} {\text {IE}}\left( W_{12}H^{(j+1)}_{1}\right)&= {\text {IE}}\left( (\beta _{1}^{2}K_{1}K_{2}-\beta _{1}\beta _{2}K_{1}K_{2})H_{1}^{(j+1)}\right) \nonumber \\&\le {\text {IE}}(\beta _{1}^{2}K_{1})\cdot {\text {IE}}(H_{1}^{(j+1)}K_{1}) = O(h^{j+1}_{H}h^{2}_{K}\phi ^{2}_{x}(h_{K})). \end{aligned}$$
(27)

Then

$$\begin{aligned} \frac{1}{h^{2j+2}_{H}}\frac{{\text {IE}}\left( W_{12}H^{(j+1)}_{1}\right) ^{2}}{{\text {IE}}\left( W_{12}\right) ^{2}} =\frac{1}{h^{2j+2}_{H}}\frac{ O(h^{2j+2}_{H}h^{4}_{K}\phi ^{4}_{x}(h_{K})) }{O( h^{4}_{K}\phi ^{4}_{x}(h_{K}))}=O(1). \end{aligned}$$

By some simple manipulations and using Lemma 6.1’s result, we get

$$\begin{aligned} \left\{ \begin{array}{lll} {\text {IE}}\left( W_{12}^{2} H_{2}^{(j+1)^{2}}\right) &{} = &{} O (h_{K}^{4}h_{H} \phi _{x}^{2}(h_{K})), \\ {\text {IE}}\left( W_{12} W_{21} H^{(j+1)}_{1}H^{(j+1)}_{2}\right) &{} = &{} O (h_{K}^{4}h^{2j+2}_{H}\phi _{x}^{2}(h_{K})),\\ {\text {IE}}\left( W_{12} W_{13} H^{(j+1)}_{2}H^{(j+1)}_{3}\right) &{} = &{} O(h_{K}^{4}h^{2j+2}_{H}\phi _{x}^{3}(h_{K})), \\ {\text {IE}}\left( W_{12}W_{23} H^{(j+1)}_{2} H^{(j+1)}_{3}\right) &{} = &{}O(h_{K}^{4} h^{2j+2}_{H}\phi _{x}^{3}(h_{K}))\\ {\text {IE}}\left( W_{12} W_{31} H^{(j+1)}_{2} H^{(j+1)}_{1}\right) &{} = &{}O(h_{K}^{4} h^{2j+2}_{H}\phi _{x}^{3}(h_{K}))\\ {\text {IE}}\left( W_{12} W_{32}H_{2}^{(j+1)^{2}}\right) &{} = &{}({\text {IE}}\left( \beta _{1}^{2}K_{1}\right) )^{2}{\text {IE}}\left( K_{1}^{2}H^{(j+1)^{2}}_{1}\right) \\ &{}&{}+ O (h_{K}^{4}h_{H} \phi _{x}^{3}(h_{K})),\\ {\text {IE}}\left( W_{12} H^{(j+1)}_{2}\right) ^{2} &{} = &{} O(h_{K}^{4}h^{2j+2}_{H} \phi _{x}^{2}(h_{K})). \end{array} \right. \end{aligned}$$
(28)

Therefore, the leading term in the expression of \(\mathrm{var}\left( \widehat{f}^{x(j)}_{N} (y)\right)\) is

$$\begin{aligned} \frac{n(n-1)(n-2)}{(n(n-1)h^{j+1}_{H}{\text {IE}}(W_{12}))^{2}}{\text {IE}}\left( \beta _{1}^{2}K_{1}\right) ^{2}{\text {IE}}\left( K_{1}^{2}\left( H^{(j+1)}_{1}\right) ^{2}\right) . \end{aligned}$$

Combining Eqs. (26) and (27) with (28), we obtain

$$\begin{aligned} \mathrm{var}\left( \widehat{f}^{x(j)}_{N}(y)\right) =\frac{{\text {IE}}\left( K_{1}^{2}\left( H^{(j+1)}_{1}\right) ^{2}\right) }{nh^{2j+2}_{H}\left( {\text {IE}}\left( K_{1}\right) \right) ^{2}}+o\left( \frac{1}{nh^{2j+1}_{H}\phi _{x}(h_{K})}\right) . \end{aligned}$$
(29)

Remark that

$$\begin{aligned} {\text {IE}}\left( K_{1}^{2}\left( H^{(j+1)}_{1}\right) ^{2}\right) ={\text {IE}}\left( K_{1}^{2}{\text {IE}}\left( \left( H^{(j+1)}_{1}\right) ^{2}\big |X\right) \right) . \end{aligned}$$

Hence, from Lemma 6.4’s proof, as \(n\rightarrow \infty\), we obtain

$$\begin{aligned} {\text {IE}}\left( K_{1}^{2}\left( H^{(j+1)}_{1}\right) ^{2}\right) \rightarrow h_{H}\,{\text {IE}}(K_{1}^{2})\,f^{x}(y)\int \left( H^{(j+1)}(t)\right) ^{2}{\text {d}}t. \end{aligned}$$
(30)

Combining Eqs. (29) and (30) leads to

$$\begin{aligned} \mathrm{var}\left( \widehat{f}^{x(j)}_{N}(y)\right)= & {} \frac{f^{x}(y)\int \left( H^{(j+1)}(t)\right) ^{2}{\text {d}}t}{nh^{2j+1}_{H}\phi _{x}(h_{K})} \frac{M_{2}}{M_{1}^{2}} + o\left( \frac{1}{nh^{2j+1}_{H}\phi _{x}(h_{K})}\right) . \end{aligned}$$

\(\square\)
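As a sanity check on the \(1/(nh_{H}^{2j+1}\phi _{x}(h_{K}))\) rate, the finite-dimensional analogue for \(j=0\) is \(\mathrm{var}(\widehat{f}(y))\approx f(y)\int K^{2}(t)\,{\text {d}}t/(nh)\). For \(N(0,1)\) data and a Gaussian kernel this variance is available in closed form, so the scaled quantity \(nh\,\mathrm{var}(\widehat{f}(y))\) can be compared with its limit (our own illustration, not the paper's functional setting).

```python
# Deterministic finite-dimensional analogue of the variance rate: for a
# Gaussian kernel and N(0,1) data, var(f_hat(y)) has a closed form, and
# n * h * var(f_hat(y)) -> f(y) * ∫ K(t)^2 dt = f(y) / (2 sqrt(pi)).
from math import exp, pi, sqrt

def normal_pdf(y, s=1.0):
    """Density of N(0, s^2) at y."""
    return exp(-0.5 * (y / s) ** 2) / (s * sqrt(2.0 * pi))

def kde_var(y, n, h):
    """Exact variance of the Gaussian-kernel KDE at y for N(0,1) data."""
    # IE[K_h(y-Y)^2] = N(0, 1 + h^2/2) density at y, divided by 2 sqrt(pi) h
    second_moment = normal_pdf(y, sqrt(1.0 + h * h / 2.0)) / (2.0 * sqrt(pi) * h)
    first_moment = normal_pdf(y, sqrt(1.0 + h * h))   # IE[K_h(y-Y)]
    return (second_moment - first_moment ** 2) / n

y, n = 0.5, 1000
scaled = [n * h * kde_var(y, n, h) for h in (0.2, 0.1, 0.05)]
target = normal_pdf(y) / (2.0 * sqrt(pi))   # f(y) ∫ K(t)^2 dt
print(scaled, target)                       # scaled values approach target
```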

About this article


Cite this article

Bouanani, O., Laksaci, A., Rachdi, M. et al. Asymptotic normality of some conditional nonparametric functional parameters in high-dimensional statistics. Behaviormetrika 46, 199–233 (2019). https://doi.org/10.1007/s41237-018-0057-9

