Abstract
In this paper, we consider the asymptotic properties of nearest neighbors estimation for long memory functional data. Under some regularity assumptions, we establish the asymptotic normality and the uniform consistency of the nearest neighbors estimators for nonparametric regression models in which both the explanatory variable and the errors are of long memory and the explanatory variable takes values in an abstract functional space. The finite sample performance of the proposed estimator is assessed through simulation studies.
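To fix ideas, the functional nearest neighbors regression estimator described above can be sketched in a few lines of code. This is only an illustrative sketch, not the paper's implementation: the triangular kernel, the \(L^2\) distance between discretized curves, and the truncated linear process used to generate long memory errors are all assumptions made here for the example.

```python
import numpy as np

def knn_functional_regression(X, y, x0, k):
    """kNN kernel regression for functional data.

    X  : (n, p) array, each row a curve observed on a common grid.
    y  : (n,) responses.
    x0 : (p,) query curve.
    k  : number of neighbors; the data-driven bandwidth H_n is the
         distance from x0 to its k-th nearest curve, so only the k
         nearest curves receive positive weight under a kernel
         supported on [0, 1].
    """
    d = np.sqrt(np.mean((X - x0) ** 2, axis=1))   # L2 distance between curves
    H = np.sort(d)[k - 1]                         # bandwidth = k-th NN distance
    w = np.maximum(1.0 - d / H, 0.0)              # triangular kernel on [0, 1]
    return np.sum(w * y) / np.sum(w)

def long_memory_noise(n, d=0.3, M=2000, rng=None):
    """Truncated linear process e_t = sum_j a_j z_{t-j} with hyperbolically
    decaying weights a_j ~ (j+1)^(d-1), d in (0, 1/2) -- a standard device
    for simulating long memory, normalized to unit variance."""
    rng = np.random.default_rng(rng)
    a = (np.arange(M + 1) + 1.0) ** (d - 1.0)
    a /= np.sqrt(np.sum(a ** 2))
    z = rng.standard_normal(n + M)
    return np.convolve(z, a, mode="valid")[:n]

# Toy example: curves X_i(t) = U_i sin(pi t), regression operator r(X_i) = U_i^2,
# long memory errors; the query curve corresponds to U = 0.5, so r(x0) = 0.25.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
U = rng.uniform(-1.0, 1.0, size=200)
X = U[:, None] * np.sin(np.pi * t)[None, :]
y = U ** 2 + 0.1 * long_memory_noise(200, rng=rng)
x0 = 0.5 * np.sin(np.pi * t)
est = knn_functional_regression(X, y, x0, k=15)   # should be close to 0.25
```

Note that, in contrast to a fixed-bandwidth kernel estimator, the bandwidth here adapts to the local concentration of the curves around the query point, which is the main practical appeal of the nearest neighbors approach in infinite-dimensional settings.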
This work was supported by National Natural Science Foundation of China (NSFC) Grants 11671194 and 11501287.
Appendix
Proof of Lemma 1
It suffices to show that, as \(n\rightarrow \infty \),
and
Let \(\xi _{n1}=\min (H_n, h_n)\) and \(\xi _{n2}=\max (H_n, h_n)\). Then, by (8), \(\xi _{n1}=h_n(1+ o(n^{-\rho }))\), \(\xi _{n2}=h_n(1+ o(n^{-\rho }))\) and \(\xi _{n2}-\xi _{n1}=o(h_nn^{-\rho })\) a.s.
Since \(E\varepsilon _0=0\) and \(k_n=nh_nf_x(0)\), by Assumption (A4),
where \(\xi _{n1}<\zeta _n<\xi _{n2}\). Hence (12), together with Assumption (A2) and the fact that \(\zeta _n=h_n(1+o(1))\) a.s., implies (10).
Let \(Z_{i1}=(K(d(x,X_i)/H_n)-K(d(x,X_i)/h_n))(r(X_i)-r(x))\) and \(Z_{i2}=(K(d(x,X_i)/H_n)-K(d(x,X_i)/h_n))\varepsilon _i\). To prove (11), it is enough to show
Note that
For the first term of the above variance, similarly to (12), we obtain
This means that
In addition, by the mean value theorem, \(F_x(\xi _{n2})-F_x(\xi _{n1})=f_x(\zeta _n) (\xi _{n2}-\xi _{n1})\) for some \(\xi _{n1}< \zeta _n<\xi _{n2}\). Moreover, by Assumption (A5),
for any \(u_1\), \(u_2\) close to 0. Thus, by (8) and Assumption (A4),
Now using (7) and Assumption (A2), we arrive at, for some large enough N,
These bounds imply the weak convergence of \(n^{1/2-\beta -\alpha }\sum _{i=1}^n Z_{i1}\). To derive the same result for \(n^{1/2-\beta -\alpha }\sum _{i=1}^n Z_{i2}\), note that, by (2),
Since \(\rho >0\) and \(\tau _x\ge 1\), Assumption (A2) implies that \(1-2\beta -\alpha -\rho <0\) and \(2-2\alpha -\tau _x D<0\). That is,
This completes the proof of Lemma 1. \(\square \)
Proof of Theorem 1
where \(I_{n1}=n^{1/2-\beta }k_n^{-1}\sum _{i=1}^n K(d(x,X_i)/h_n)(r(X_i)-r(x))\) and \(I_{n2}=n^{1/2-\beta }k_n^{-1}\sum _{i=1}^n K(d(x,X_i)/h_n)\varepsilon _i\).
By Lemma 1, it suffices to show that
Along similar lines to the proof of Lemma 1, we obtain
where \(0<\zeta _n<h_n\). Moreover, again by Assumption (A2),
Then we arrive at \(I_{n1}{\mathop {\longrightarrow }\limits ^\mathcal{P}} 0\). For \(I_{n2}\), note that
Since
where \(0<\zeta _n<h_n\), by (9), we have
Therefore, to complete the proof, it suffices to show that the first term of (15) tends to 0 in probability. Similarly to (14), the variance of the first term of (15) is bounded by
This concludes the proof of Theorem 1. \(\square \)
Proof of Theorem 2
From (6), we obtain
It suffices to show that
By (8) and Assumption (A4), we have
This is enough to prove the first claim of (16). It remains to check the second claim of (16). Note that we can write
while,
By (8) and Assumption (B2), we have
for any \(x\in S\). This result implies that, for any \(x\in S\) and any \(1\le i\le n\),
Moreover, for any \(\delta >0\), there exists an \(x^*\in S\) with
Letting \(\delta \rightarrow 0\), we obtain
This leads directly to
Looking at the second term on the RHS of (17), we have, for any \(\varepsilon >0\),
Arguing along the same lines as in the proof of (13), and using Assumptions (A2), (B1) and (B3), we get directly, for any \(\varepsilon >0\),
This leads to
This, together with (18), is enough to get
This completes the proof of Theorem 2. \(\square \)
Wang, L. Nearest neighbors estimation for long memory functional data. Stat Methods Appl 29, 709–725 (2020). https://doi.org/10.1007/s10260-019-00499-1