
Asymptotics of a wavelet estimator in the nonparametric regression model with repeated measurements under a NA error process

  • Original Paper
  • Published:
Revista de la Real Academia de Ciencias Exactas, Fisicas y Naturales. Serie A. Matematicas

Abstract

Consider the nonparametric regression model with repeated measurements: \(Y^{(j)}(x_{ni})=g(x_{ni})+e^{(j)}(x_{ni})\), where \(Y^{(j)}(x_{ni})\) is the \(j\)th response at the point \(x_{ni}\), the \(x_{ni}\)'s are known and nonrandom, and \(g(\cdot )\) is an unknown function defined on the closed interval \([0,1]\). To capture the correlation among the units while avoiding assumptions on the observations within the same unit, we consider the model with negatively associated (NA) error structures; that is, \(\{e^{(j)}(x), j\ge 1\}\) is a zero-mean NA error process. Wavelet procedures are developed to estimate the regression function, and some asymptotic properties of the wavelet estimator are established under suitable conditions.
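For orientation, wavelet estimators in this fixed-design setting are typically built from the scaling-function kernel of Antoniadis et al. [1]; the following display is a standard form consistent with the notation \(A_i\), \(E_k\) used in the Appendix (the precise definition adopted in the paper may differ in details):

$$\begin{aligned} \hat{g}_n(x)=\frac{1}{m}\sum _{j=1}^m\sum _{i=1}^n Y^{(j)}(x_{ni})\int \limits _{A_i}E_k(x,s)ds, \qquad E_k(x,s)=2^k\sum _{l\in \mathbb {Z}}\phi (2^kx-l)\phi (2^ks-l), \end{aligned}$$

where \(\phi \) is a scaling function, the \(A_i\) are intervals partitioning \([0,1]\) with \(x_{ni}\in A_i\), and the resolution level \(k=k(n)\rightarrow \infty \) is a tuning parameter.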


References

  1. Antoniadis, A., Gregoire, G., McKeague, I.W.: Wavelet methods for curve estimation. J. Am. Stat. Assoc. 89, 1340–1352 (1994)

  2. Clark, R.M.: Nonparametric estimation of a smooth regression function. J. R. Stat. Soc. Ser. B 39, 107–113 (1977)

  3. Fraiman, R., Iribarren, G.P.: Nonparametric regression estimation in models with weak error's structure. J. Multivar. Anal. 37, 180–196 (1991)

  4. Hart, J.D., Wehrly, T.E.: Kernel regression estimation using repeated measurements data. J. Am. Stat. Assoc. 81, 1080–1088 (1986)

  5. Joag-Dev, K., Proschan, F.: Negative association of random variables with applications. Ann. Stat. 11, 286–295 (1983)

  6. Li, Y.M., Guo, J.H.: Asymptotic normality of wavelet estimator for strong mixing errors. J. Korean Stat. Soc. 38, 383–390 (2009)

  7. Li, Y.M., Yang, S.C., Zhou, Y.: Consistency and uniformly asymptotic normality of wavelet estimator in regression model with associated samples. Stat. Probab. Lett. 78, 2947–2956 (2008)

  8. Liang, H.Y.: Complete convergence for weighted sums of negatively associated random variables. Stat. Probab. Lett. 48(4), 317–325 (2000)

  9. Liang, H.Y., Baek, J.I.: Weighted sums of negatively associated random variables. Aust. N. Z. J. Stat. 48, 21–31 (2006)

  10. Liang, H.Y., Jing, B.Y.: Asymptotic properties for estimates of nonparametric regression models based on negatively associated sequences. J. Multivar. Anal. 95, 227–245 (2005)

  11. Liang, H.Y., Qi, Y.Y.: Asymptotic normality of wavelet estimator of regression function under NA assumptions. Bull. Korean Math. Soc. 44, 247–257 (2007)

  12. Liang, H.Y., Zhang, D.X., Lu, B.X.: Wavelet estimation in nonparametric model under martingale difference errors. Appl. Math. J. Chin. Univ. Ser. B 19, 302–310 (2004)

  13. Matula, P.: A note on the almost sure convergence of sums of negatively dependent random variables. Stat. Probab. Lett. 15, 209–213 (1992)

  14. Newman, C.M.: Asymptotic independence and limit theorems for positively and negatively dependent random variables. In: Tong, Y.L. (ed.) Inequalities in Statistics and Probability, IMS Lecture Notes–Monograph Series, vol. 5, pp. 127–140 (1984)

  15. Prakasa Rao, B.L.S.: Nonparametric Functional Estimation. Academic Press, Orlando (1983)

  16. Roussas, G.G.: Consistent regression with fixed design points under dependence condition. Stat. Probab. Lett. 8, 41–50 (1989)

  17. Roussas, G.G.: Asymptotic normality of random fields of positively or negatively associated processes. J. Multivar. Anal. 50, 152–173 (1994)

  18. Roussas, G.G., Tran, L.T., Ioannides, D.A.: Fixed design regression for time series: asymptotic normality. J. Multivar. Anal. 40, 262–291 (1992)

  19. Shao, Q.M.: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theoret. Probab. 13, 343–356 (2000)

  20. Shao, Q.M., Su, C.: The law of the iterated logarithm for negatively associated random variables. Stoch. Process. Appl. 83, 139–148 (1999)

  21. Su, C., Wang, Y.B.: Strong convergence for IDNA sequence. Chin. J. Appl. Probab. Stat. 14, 131–140 (1998)

  22. Sun, Y., Chai, G.X.: Nonparametric wavelet estimation of a fixed designed regression function. Acta Mathematica Scientia 24A, 597–606 (2004)

  23. Vidakovic, B.: Statistical Modeling by Wavelets. Wiley, New York (1999)

  24. Walter, G.G.: Wavelets and Orthogonal Systems with Applications. CRC Press, Florida (1994)

  25. Xue, L.G.: Strong uniform convergence rates of the wavelet estimator of regression function under completed and censored data. Acta Math. Appl. Sinica 25, 430–438 (2002)

  26. Xue, L.G.: Uniform convergence rates of the wavelet estimator of regression function under mixing error. Acta Mathematica Scientia 22A, 528–535 (2002)

  27. Yang, S.C.: Maximal moment inequality for partial sums of strong mixing sequences and application. Acta Mathematica Sinica (Engl. Ser.) 23, 1013–1024 (2007)

  28. Yang, S.C., Li, Y.M.: Uniform asymptotic normality of the regression weighted estimator for strong mixing samples. Acta Mathematica Sinica 49A, 1163–1170 (2006)

  29. Yuan, M., Su, C., Hu, T.Z.: A central limit theorem for random fields of negatively associated processes. J. Theor. Probab. 16, 309–323 (2003)


Acknowledgments

The authors are grateful to the Editor and two anonymous referees for their constructive comments which have greatly improved this paper. This work is partially supported by Anhui Provincial Natural Science Foundation of China (No. 1408085MA03), Key Natural Science Foundation of Higher Education Institutions of Anhui Province of China (No. KJ2012A270), NSFC (No. 11171065), FDPHEC (No. 20120092110021), Youth Foundation for Humanities and Social Sciences Project from Ministry of Education of China (No. 11YJC790311), China Postdoctoral Science Foundation (No. 2013M540402), Key grant project for academic leaders of Tongling University (No. 2014tlxyxs13), and Scientific Research Starting Foundation for Talents of Tongling University (No. 2012tlxyrc05).

Author information


Corresponding author

Correspondence to Xing-cai Zhou.

Appendix


In this section, we give some preliminary lemmas, which are used in Sect. 3.

Lemma 4.1

([1, 24]) Suppose that (A3)(iii) holds. We have

  1. (a) \(\sup _{0\le x,s \le 1}|E_k(x,s)|=O(2^k)\);

  2. (b) \(\sup _{0\le x\le 1}\int _0^1|E_k(x,s)|ds\le C\);

  3. (c) \(\int _0^1 E_k(x,s)ds\rightarrow 1\) uniformly in \(x\in [0,1]\), as \(k\rightarrow \infty \).

Lemma 4.2

([1]) Suppose that (A3)(iii) and (A3)(iv) hold, and \(h(x)\) satisfies (A3)(i) and (A3)(ii). Then

$$\begin{aligned} \sup _{x\in \mathcal {I}}\left| h(x)-\sum _{i=1}^n h(x_i)\int \limits _{A_i}E_k(x,s)ds\right| =O(n^{-\gamma })+O(\tau _k), \end{aligned}$$

where

$$\begin{aligned} \tau _k=\left\{ \begin{array}{l@{\quad }c@{\quad }l} (1/2^k)^{\upsilon -1/2} &{} if &{} 1/2<\upsilon <3/2, \\ \sqrt{k}/2^k &{} if &{} \upsilon =3/2, \\ 1/2^k &{} if &{} \upsilon >3/2.\\ \end{array}\right. \end{aligned}$$

Lemma 4.3

([20]) Let \(\{X_i,i\ge 1\}\) be a sequence of NA random variables with \(E|X_i|^p<\infty \) for some \(p\ge 2\). Then, there exists a constant \(C_p>0\) such that

$$\begin{aligned} E\max _{1\le k\le n}\left| \sum _{i=1}^k X_i\right| ^p\le C_p\left\{ \left( \sum _{i=1}^nEX_i^2\right) ^{p/2}+\sum _{i=1}^nE|X_i|^p\right\} . \end{aligned}$$
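For orientation, in the special case \(p=2\) the two terms on the right-hand side coincide, and Lemma 4.3 collapses to the variance-type bound

$$\begin{aligned} E\max _{1\le k\le n}\left( \sum _{i=1}^k X_i\right) ^2\le 2C_2\sum _{i=1}^nEX_i^2, \end{aligned}$$

which shows that NA partial sums behave like independent ones at the level of second moments; this is the form needed to control \(E|V^{(j)}(x)|^2\) in the proof of Lemma 4.7 below.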

Lemma 4.4

([13, 21]) Let \(\{X_i,i\ge 1\}\) be a sequence of identically distributed NA random variables. If \(E|X_1|<\infty \), then

$$\begin{aligned} \frac{1}{n}\sum _{i=1}^n(X_i-EX_1)\rightarrow 0 \quad a.s. \end{aligned}$$

Lemma 4.5

([14]) Let \(\{X_i,i\ge 1\}\) be a sequence of strictly stationary NA random variables with \(EX_1=0\), \(0<EX_1^2<\infty \) and

$$\begin{aligned} \sigma ^2=EX_1^2+2\sum _{j=2}^\infty EX_1X_j>0, \end{aligned}$$

then

$$\begin{aligned} \frac{1}{\sigma \sqrt{n}}\sum _{i=1}^nX_i\mathop {\longrightarrow }\limits ^{d}N(0,1). \end{aligned}$$
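As a quick numerical illustration of the NA property underlying Lemmas 4.3–4.5 (a sketch, not taken from the paper): the components of a multinomial vector are a classical example of an NA family [5], so their pairwise sample correlations are negative.

```python
import numpy as np

# Components of a multinomial vector form a classical NA family
# (Joag-Dev and Proschan [5]): one cell gaining counts forces others to lose.
rng = np.random.default_rng(0)
counts = rng.multinomial(100, [0.25] * 4, size=20000)  # 20000 draws, 4 cells

# Empirical correlation matrix of the four cell counts.
corr = np.corrcoef(counts.T)
off_diag = corr[~np.eye(4, dtype=bool)]

# Every off-diagonal correlation is negative; with equal cell
# probabilities p = 1/4 the theoretical common value is -p/(1-p) = -1/3.
print(off_diag.max() < 0)
```

With 20,000 draws the empirical off-diagonal correlations cluster tightly around the theoretical value \(-1/3\).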

Lemma 4.6

Let \(V^{(j)}(x)=\sum _{i=1}^ne^{(j)}(x_{ni})\int _{A_i}E_k(x,s)ds\). If \(\max _{1\le j\le m,1\le i\le n}E|e^{(j)}(x_{ni})|^{1+\delta }<\infty \) for some \(\delta >0\), then

$$\begin{aligned} \sup _{x\in \mathcal {I}}E|V^{(j)}(x)|^{1+\delta }<C. \end{aligned}$$

Proof

Note that \(f(t)=t^{1+\delta }\) is convex for \(t>0\); the second inequality below is Jensen's inequality applied to the weights \(\int _{A_i}|E_k(x,s)|ds\big /\int _0^1|E_k(x,s)|ds\), \(i=1,\ldots ,n\), which are nonnegative and sum to one. We have

$$\begin{aligned} \sup _{x\in \mathcal {I}}E\left| V^{(j)}(x)\right| ^{1+\delta }&= \sup _{x\in \mathcal {I}}E\left[ \left( \sum _{i=1}^ne^{(j)}(x_{ni})\int \limits _{A_i}E_k(x,s)ds\right) ^{1+\delta }\right] \\&\le \sup _{x\in \mathcal {I}}E\left[ \left( \sum _{i=1}^n|e^{(j)}(x_{ni})|\int \limits _{A_i}|E_k(x,s)|ds\right) ^{1+\delta }\right] \\&= \sup _{x\in \mathcal {I}}E\left[ \left( \sum _{i=1}^n\frac{\int \limits _{A_i}|E_k(x,s)|ds}{\int \limits _{0}^1|E_k(x,s)|ds}\cdot |e^{(j)}(x_{ni})|\int \limits _{0}^1|E_k(x,s)|ds\right) ^{1+\delta }\right] \\&\le \sup _{x\in \mathcal {I}}\sum _{i=1}^n\frac{\int _{A_i}|E_k(x,s)|ds}{\int \limits _{0}^1|E_k(x,s)|ds}\cdot E\left[ \left( |e^{(j)}(x_{ni})|\int \limits _{0}^1|E_k(x,s)|ds\right) ^{1+\delta }\right] \\&\le \sup _{x\in \mathcal {I}}\sum _{i=1}^n\frac{\int \limits _{A_i}|E_k(x,s)|ds}{\int \limits _{0}^1|E_k(x,s)|ds}\cdot E\left( |e^{(j)}(x_{ni})|^{1+\delta }\right) \left( \int \limits _{0}^1|E_k(x,s)|ds\right) ^{1+\delta }\\&\le \max _{1\le i\le n}E|e^{(j)}(x_{ni})|^{1+\delta }\sup _{x\in \mathcal {I}}\left( \int \limits _{0}^1|E_k(x,s)|ds\right) ^{1+\delta }\le C. \end{aligned}$$

\(\square \)

Lemma 4.7

Assume that \(\max _{1\le i\le n}E|e^{(j)}(x_{ni})|^p<\infty \) for each \(j (1\le j\le m)\) and some \(p\ge 2\). Then there exists a constant \(C_p\) such that

$$\begin{aligned} E\max _{1\le l\le m}\left| \sum _{j=1}^lV^{(j)}(x)\right| ^p\le C_p\left\{ \left( \sum _{j=1}^m\max _{1\le i\le n}E|e^{(j)}(x_{ni})|^2\right) ^{p/2}+\sum _{j=1}^m\max _{1\le i\le n}E|e^{(j)}(x_{ni})|^p\right\} . \end{aligned}$$

Proof

Let \(a_{ni}(x)=\int _{A_i}E_k(x,s)ds\); then \(V^{(j)}(x)=\sum _{i=1}^n e^{(j)}(x_{ni})a_{ni}(x)\). Obviously, \(V^{(j)}(x)=\sum _{i=1}^n e^{(j)}(x_{ni})a_{ni}^+(x)-\sum _{i=1}^n e^{(j)}(x_{ni})a_{ni}^-(x)\), so without loss of generality we may assume that \(a_{ni}(x)\ge 0\), \(i=1,\ldots ,n\), for each \(x\in \mathcal {I}\). Then \(\{V^{(j)}(x), 1\le j\le m\}\) are still zero-mean NA random variables. Hence, by Lemma 4.3 and arguments similar to the proof of Lemma 4.6, we have

$$\begin{aligned} E\max _{1\le l\le m}\left| \sum _{j=1}^lV^{(j)}(x)\right| ^p&\le C_p\left\{ \left( \sum _{j=1}^mE|V^{(j)}(x)|^2\right) ^{p/2}+\sum _{j=1}^mE|V^{(j)}(x)|^p\right\} \\&\le C_p\left\{ \left( \sum _{j=1}^m\max _{1\le i\le n}E|e^{(j)}(x_{ni})|^2\right) ^{p/2}+\sum _{j=1}^m\max _{1\le i\le n}E|e^{(j)}(x_{ni})|^p\right\} . \end{aligned}$$

This completes the proof. \(\square \)

Lemma 4.8

Let \(\{X_i,i\ge 1\}\) be a sequence of identically distributed NA random variables. If \(E|X_1|^{1+\delta }<\infty \) for some \(\delta \ge 0\), then

$$\begin{aligned} \frac{1}{n}\sum _{i=1}^n|X_i|^{1+\delta }\rightarrow E|X_1|^{1+\delta } \quad a.s. \end{aligned}$$

Proof

Write \(|X_i|^{1+\delta }=(X_i^+)^{1+\delta }+(X_i^-)^{1+\delta }\). Note that \(\{(X_i^+)^{1+\delta }, i\ge 1\}\) and \(\{(X_i^-)^{1+\delta }, i\ge 1\}\) are still sequences of identically distributed NA random variables, with \(E|X_1^+|^{1+\delta }<\infty \) and \(E|X_1^-|^{1+\delta }<\infty \), respectively. By Lemma 4.4, we have

$$\begin{aligned} \frac{1}{n}\sum _{i=1}^n\left( (X_i^+)^{1+\delta }-E(X_1^+)^{1+\delta }\right) \rightarrow 0 \quad a.s., \qquad \frac{1}{n}\sum _{i=1}^n\left( (X_i^-)^{1+\delta }-E(X_1^-)^{1+\delta }\right) \rightarrow 0 \quad a.s. \end{aligned}$$

Adding the two limits yields the result of Lemma 4.8. \(\square \)


About this article


Cite this article

Zhou, X.-c., Lin, J.-g.: Asymptotics of a wavelet estimator in the nonparametric regression model with repeated measurements under a NA error process. RACSAM 109, 153–168 (2015). https://doi.org/10.1007/s13398-014-0172-8

