On consistency of wavelet estimator in nonparametric regression models


Abstract

In this paper, we investigate the nonparametric regression model with repeated measurements based on extended negatively dependent (END) errors. Using the Rosenthal-type inequality and the Marcinkiewicz–Zygmund-type strong law of large numbers, we establish the mean consistency, weak consistency, strong consistency, complete consistency and strong convergence rate of the wavelet estimator under mild conditions; these results generalize the corresponding ones for negatively associated errors. Some numerical simulations are presented to verify the validity of the theoretical results.
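
As a rough illustration of the kind of simulation described here (not the authors' code), the following minimal Python sketch implements a wavelet estimator of the form \(\hat{g}_n(x)=\sum _{i=1}^nY_i\int _{A_i}E_k(x,s)ds\), the form that appears in the Appendix. Everything beyond that form is an assumption made for the sketch: a single measurement per design point, the fixed-design model \(Y_i=g(x_i)+e_i\), the Haar scaling kernel \(E_k(x,s)=2^k\mathbf{1}\{\lfloor 2^kx\rfloor =\lfloor 2^ks\rfloor \}\), and an AR(1) error process used only as a convenient dependent stand-in (not necessarily END); all names and parameters are illustrative.

import numpy as np

def haar_wavelet_estimator(x_grid, design, y, k):
    """Wavelet estimator g_hat(x) = sum_i y_i * int_{A_i} E_k(x, s) ds with the
    Haar scaling kernel E_k(x, s) = 2^k * 1{floor(2^k x) == floor(2^k s)};
    A_1, ..., A_n partition [0, 1] and the design point x_i lies in A_i."""
    n = len(design)
    # cell boundaries 0 = s_0 < s_1 < ... < s_n = 1 (midpoints between design points)
    s = np.concatenate(([0.0], (design[:-1] + design[1:]) / 2.0, [1.0]))
    est = np.empty_like(x_grid)
    for m, x in enumerate(x_grid):
        # dyadic interval of x at resolution k: [lo, hi) = [b/2^k, (b+1)/2^k)
        b = np.floor(2.0**k * x)
        lo, hi = b / 2.0**k, (b + 1.0) / 2.0**k
        # int_{A_i} E_k(x, s) ds = 2^k * length(A_i intersected with [lo, hi))
        overlap = np.clip(np.minimum(s[1:], hi) - np.maximum(s[:-1], lo), 0.0, None)
        est[m] = 2.0**k * np.sum(y * overlap)
    return est

# toy data: fixed design, smooth g, AR(1) errors as a stand-in for a dependent error process
rng = np.random.default_rng(0)
n, k = 200, 4
xi = (np.arange(1, n + 1) - 0.5) / n
g = lambda t: np.sin(2.0 * np.pi * t)
e = np.zeros(n)
for i in range(1, n):
    e[i] = -0.4 * e[i - 1] + 0.2 * rng.standard_normal()
y = g(xi) + e
x_grid = np.linspace(0.01, 0.99, 99)
print("max abs error:", np.max(np.abs(haar_wavelet_estimator(x_grid, xi, y, k) - g(x_grid))))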



Author information

Corresponding author

Correspondence to Xuejun Wang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supported by the National Natural Science Foundation of China (11671012, 11871072, 11701004, 11701005), the Natural Science Foundation of Anhui Province (1808085QA03, 1908085QA01, 1908085QA07) and the Project on Reserve Candidates for Academic and Technical Leaders of Anhui Province (2017H123).

Appendix

Proof of Corollary 3.1

It is easily checked that

$$\begin{aligned} E\left| V^{(j)}(x)\right| ^q&=E\left| \sum _{i=1}^ne^{(j)}(x_i)\int _{A_i}E_k(x,s)ds\right| ^q\\ &\le E\left( \sum _{i=1}^n\frac{\int _{A_i}|E_k(x,s)|ds}{\int _{0}^1|E_k(x,s)|ds}\cdot |e^{(j)}(x_i)|\int _{0}^1|E_k(x,s)|ds\right) ^q\\ &\le \sum _{i=1}^n\frac{\int _{A_i}|E_k(x,s)|ds}{\int _{0}^1|E_k(x,s)|ds}E\left( |e^{(j)}(x_i)|\int _{0}^1|E_k(x,s)|ds\right) ^q\\ &\le \sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^q\cdot \left( \sup _{0\le x\le 1}\int _{0}^1|E_k(x,s)|ds\right) ^q\\ &\le C\sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^q, \end{aligned}$$
(4.1)

where the last inequality above follows by Lemma 3.1 (ii).
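
A short justification of the second and third inequalities in (4.1): since the sets \(A_i\) partition \([0,1]\), the weights

$$\begin{aligned} w_i=\frac{\int _{A_i}|E_k(x,s)|ds}{\int _{0}^1|E_k(x,s)|ds},\qquad i=1,\ldots ,n, \end{aligned}$$

are nonnegative and sum to one, so Jensen's inequality for \(t\mapsto t^q\) (convex for \(q\ge 1\)) gives

$$\begin{aligned} E\left( \sum _{i=1}^nw_ia_i\right) ^q\le \sum _{i=1}^nw_i\,Ea_i^q,\qquad a_i=|e^{(j)}(x_i)|\int _{0}^1|E_k(x,s)|ds, \end{aligned}$$

and bounding each \(Ea_i^q\) by its supremum over \(i\) and \(x\) yields the fourth line.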

Noting that

$$\begin{aligned} V^{(j)}(x)&=\sum _{i=1}^ne^{(j)}(x_i)\int _{A_i}E_k(x,s)ds\\ &=\sum _{i=1}^ne^{(j)}(x_i)\left( \int _{A_i}E_k(x,s)ds\right) ^+-\sum _{i=1}^ne^{(j)}(x_i)\left( \int _{A_i}E_k(x,s)ds\right) ^-\\ &\doteq V^{(j)}_+(x)-V^{(j)}_-(x), \end{aligned}$$

we have by \(C_r\)-inequality that

$$\begin{aligned} E\left| \sum _{j=1}^mV^{(j)}(x)\right| ^q\le CE\left| \sum _{j=1}^mV^{(j)}_+(x)\right| ^q+CE\left| \sum _{j=1}^mV^{(j)}_-(x)\right| ^q. \end{aligned}$$
(4.2)
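
Here the \(C_r\)-inequality is used in the standard form \(|a-b|^q\le 2^{q-1}(|a|^q+|b|^q)\) with

$$\begin{aligned} a=\sum _{j=1}^mV^{(j)}_+(x),\qquad b=\sum _{j=1}^mV^{(j)}_-(x), \end{aligned}$$

so that (4.2) holds with \(C=2^{q-1}\).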

By Lemma 3.3, \(\{V^{(1)}_+(x),V^{(2)}_+(x),\ldots ,V^{(m)}_+(x)\}\) is still a sequence of zero-mean END random variables. Hence, it follows from Lemma 3.4 and (4.1) that

$$\begin{aligned} E\left| \sum _{j=1}^mV^{(j)}_+(x)\right| ^q&\le C_q \left[ \sum _{j=1}^mE\left| V^{(j)}_+(x)\right| ^q+\left( \sum _{j=1}^mE\left| V^{(j)}_+(x)\right| ^2\right) ^{q/2}\right] \\ &\le C_q \left[ \sum _{j=1}^mE\left| V^{(j)}(x)\right| ^q+\left( \sum _{j=1}^mE\left| V^{(j)}(x)\right| ^2\right) ^{q/2}\right] \\ &\le C_q\left[ m\sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^q+m^{q/2}\left( \sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^2\right) ^{q/2}\right] . \end{aligned}$$
(4.3)

Similarly, we have

$$\begin{aligned} E\left| \sum _{j=1}^mV^{(j)}_-(x)\right| ^q\le C_q\left[ m\sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^q+m^{q/2}\left( \sup _{j\ge 1,0\le x\le 1}E\left| e^{(j)}(x)\right| ^2\right) ^{q/2}\right] . \end{aligned}$$
(4.4)

Therefore, the desired result (3.23) follows immediately from (4.2)–(4.4). This completes the proof of the corollary. \(\square \)

Proof of Lemma 3.5

Since \(\alpha p>1\), we can choose \(q\) such that \(\frac{1}{ \alpha p }< q<1\). For fixed \(n\ge 1\) and \(1\le i\le n\), define

$$\begin{aligned} X_{ni} ^{(1)}&=- n^{\alpha q} I(X_{i}<-n^{\alpha q})+X_{i} I(|X_{i} |\le n^{\alpha q}) + n^{\alpha q} I(X_{i}>n^{\alpha q}),\\ X_{ni} ^{(2)}&=( X_{i} -n^{\alpha q} )I(X_{i} >n^{\alpha q}), \\ X_{ni} ^{(3)}&=(X_{i} +n^{\alpha q} ) I(X_{i} <-n^{\alpha q}). \end{aligned}$$
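
Each \(X_i\) is recovered exactly from the three truncated pieces; checking the three cases gives

$$\begin{aligned} X_{ni}^{(1)}+X_{ni}^{(2)}+X_{ni}^{(3)}=\left\{ \begin{array}{ll} n^{\alpha q}+(X_i-n^{\alpha q})+0=X_i,&{}\quad \text {if}~~X_i>n^{\alpha q},\\ X_i+0+0=X_i,&{}\quad \text {if}~~|X_i|\le n^{\alpha q},\\ -n^{\alpha q}+0+(X_i+n^{\alpha q})=X_i,&{}\quad \text {if}~~X_i<-n^{\alpha q}. \end{array}\right. \end{aligned}$$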

Noting that

$$\begin{aligned} \sum _{i=1}^jX_i=\sum _{i=1}^jX_{ni} ^{(1)}+\sum _{i=1}^jX_{ni} ^{(2)}+\sum _{i=1}^jX_{ni} ^{(3)} \end{aligned}$$

for \(1\le j\le n\), we have that for all \(\varepsilon >0\),

$$\begin{aligned} \sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} X_{i} \right|>\varepsilon n^{\alpha } \right) &\le \sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} X_{ni} ^{(1)} \right|> \frac{\varepsilon n^{\alpha }}{3} \right) \\ &\quad +\sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} X_{ni} ^{(2)} \right|> \frac{\varepsilon n^{\alpha }}{3} \right) \\ &\quad +\sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} X_{ni} ^{(3)} \right| > \frac{\varepsilon n^{\alpha }}{3} \right) \\ &\doteq I_{1}+ I_{2}+I_3. \end{aligned}$$
(4.5)

Hence, in order to prove (3.2), it suffices to show that \(I_1<\infty \), \(I_2<\infty \) and \(I_3<\infty \).

For \(I_1\), we first show that

$$\begin{aligned} n^{-\alpha } \max _{1\le j \le n} \left| \sum _{i=1}^{j} E X_{ni} ^{(1)} \right| \rightarrow 0,~~\text {as}~~n\rightarrow \infty . \end{aligned}$$
(4.6)

It follows by \(EX_{n} = 0\), Markov’s inequality and Property 1.1 that

$$\begin{aligned} n^{-\alpha } \max _{1\le j \le n} \left| \sum _{i=1}^{j} E X_{ni} ^{(1)} \right| &\le n^{-\alpha } \sum _{i=1}^{n} [E |X_{i}| I ( |X_{i}|> n^{\alpha q}) + n^{\alpha q} P( |X_{i}|> n^{\alpha q})] \\ &\le C n^{-\alpha } \sum _{i=1}^{n} [E |X| I ( |X|> n^{\alpha q}) + n^{\alpha q} P( |X|> n^{\alpha q})] \\ &\le C n^{-\alpha +1+\alpha q-\alpha pq} E |X|^p I ( |X|> n^{\alpha q}) + Cn^{-\alpha +1+\alpha q-\alpha pq} E |X|^{p} \\ &\le Cn^{-\alpha +1+\alpha q-\alpha pq} E |X|^{p}, \end{aligned}$$

which together with \(E |X|^{p} <\infty \) and \(\frac{1}{ \alpha p }< q<1\) yields (4.6). Hence, we have by (4.6) that

$$\begin{aligned} I_{1} \le C\sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} (X_{ni} ^{(1)} - EX_{ni} ^{(1)}) \right| >\frac{\varepsilon n^{\alpha }}{6} \right) . \end{aligned}$$
(4.7)
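
For reference, the elementary truncation bounds used in the proof of (4.6), and repeatedly below, follow from Markov's inequality: for any \(t>0\) and \(p\ge 1\),

$$\begin{aligned} E|X|I(|X|>t)\le t^{1-p}E|X|^pI(|X|>t)\le t^{1-p}E|X|^p,\qquad P(|X|>t)\le t^{-p}E|X|^p. \end{aligned}$$

Taking \(t=n^{\alpha q}\) produces the factor \(n^{\alpha q-\alpha pq}\) appearing above.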

For fixed \(n\ge 1\), we can see that \(\{X_{ni} ^{(1)}-EX_{ni} ^{(1)}, 1\le i\le n\}\) are still END random variables by Lemma 3.3. It follows by (4.7), Markov’s inequality and Lemma 3.4 that for any \(\delta \ge 2\),

$$\begin{aligned} I_{1}&\le C \sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } E \left( \max _{1\le j \le n} \left| \sum _{i=1}^{j} \left( X_{ni} ^{(1)} - EX_{ni} ^{(1)}\right) \right| \right) ^{\delta } \\ &\le C \sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } \log ^{\delta } n\left[ \sum _{i=1}^{n} E\left| X_{ni} ^{(1)}\right| ^{\delta } + \left( \sum _{i=1}^{n} E\left| X_{ni} ^{(1)}\right| ^{2} \right) ^{\delta /2} \right] \\ &\doteq CI_{11} +CI_{12}. \end{aligned}$$
(4.8)

Taking \(\delta >\max \left\{ \frac{\alpha p-1}{\alpha -1/2}, 2, p\right\} \), we have that

$$\begin{aligned} \alpha p-1-\alpha \delta + \alpha q\delta -\alpha pq=\alpha (p-\delta )(1-q)-1<-1 \end{aligned}$$

and

$$\begin{aligned} \alpha p-2-\alpha \delta +\frac{\delta }{2}<-1. \end{aligned}$$
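
Both exponent bounds are direct consequences of this choice of \(\delta \) (the rearrangement in the second line uses \(\alpha >1/2\), which is implicit in the first term of the maximum):

$$\begin{aligned} \alpha (p-\delta )(1-q)-1&=\alpha p-\alpha pq-\alpha \delta +\alpha q\delta -1<-1,\quad \text {since}~~\delta>p~~\text {and}~~0<q<1,\\ \alpha p-2-\alpha \delta +\frac{\delta }{2}<-1\ &\Longleftrightarrow \ \delta \left( \alpha -\frac{1}{2}\right)>\alpha p-1,\quad \text {which holds since}~~\delta >\frac{\alpha p-1}{\alpha -1/2}. \end{aligned}$$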

It follows from the \(C_r\)-inequality, Markov’s inequality and Property 1.1 that

$$\begin{aligned} I_{11}&\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta }\log ^{\delta } n \sum _{i=1}^{n} [E |X_{i}|^{\delta } I ( |X_{i}| \le n^{\alpha q}) + n^{\alpha q \delta } P( |X_{i}|> n^{\alpha q})] \\ &\le C\sum _{n=1}^{\infty } n^{\alpha p-1-\alpha \delta } \log ^{\delta } n [E |X|^{\delta } I ( |X| \le n^{\alpha q}) + n^{\alpha q \delta } P( |X| > n^{\alpha q})] \\ &\le C\sum _{n=1}^{\infty } n^{\alpha p-1-\alpha \delta + \alpha q\delta -\alpha pq} E|X|^p \log ^{\delta } n \\ &< \infty \end{aligned}$$
(4.9)

and

$$\begin{aligned} I_{12}&\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta }\log ^{\delta } n \left\{ \sum _{i=1}^{n} \left[ E X_{i}^{2} I ( |X_{i}| \le n^{\alpha q}) + n^{2\alpha q} P( |X_{i}|> n^{\alpha q})\right] \right\} ^{\delta /2} \\ &\le C \sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta +\frac{\delta }{2} } \log ^{\delta } n [E X^{2} I ( |X| \le n^{\alpha q}) + n^{2\alpha q} P( |X| > n^{\alpha q})] ^{\delta /2} \\ &\le \left\{ \begin{array}{ll} C\sum \limits _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta +\frac{\delta }{2} } (E X^{2}) ^{\delta /2} \log ^{\delta } n,&{}\quad \text {if}~~p\ge 2,\\ C\sum \limits _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta +\frac{\delta }{2}+\alpha (2-p) \frac{\delta }{2} } (E |X|^{p}) ^{\delta /2} \log ^{\delta } n,&{}\quad \text {if}~~1\le p<2,\\ \end{array}\right. \\ &= \left\{ \begin{array}{ll} C\sum \limits _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta +\frac{\delta }{2} } (E X^{2}) ^{\delta /2} \log ^{\delta } n,&{}\quad \text {if}~~p\ge 2,\\ C\sum \limits _{n=1}^{\infty } n^{(\alpha p-1)(1-\delta /2)-1 } (E |X|^{p}) ^{\delta /2} \log ^{\delta } n,&{}\quad \text {if}~~1\le p<2,\\ \end{array}\right. \\ &< \infty . \end{aligned}$$
(4.10)
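
The equality between the two case displays in (4.10) is the exponent identity

$$\begin{aligned} \alpha p-2-\alpha \delta +\frac{\delta }{2}+\alpha (2-p)\frac{\delta }{2}=(\alpha p-1)\left( 1-\frac{\delta }{2}\right) -1, \end{aligned}$$

and the final series converge because the exponent \(\alpha p-2-\alpha \delta +\delta /2\) is less than \(-1\) (shown above) when \(p\ge 2\), while \((\alpha p-1)(1-\delta /2)-1<-1\) when \(1\le p<2\), since \(\alpha p>1\) and \(\delta >2\); the \(\log ^{\delta }n\) factors in (4.9) and (4.10) are harmless because these exponents are strictly less than \(-1\).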

Hence, \(I_1<\infty \) follows immediately from (4.8)–(4.10).

Next we show that \(I_{2}<\infty \). For fixed \(n\ge 1\) and \(1\le i\le n\), define

$$\begin{aligned} X_{ni} ^{(4)} = (X_{i}- n^{\alpha q} ) I (n^{\alpha q} < X_{i} \le n^{\alpha }+ n^{\alpha q} ) + n^{\alpha } I ( X_{i} > n^{\alpha }+ n^{\alpha q} ). \end{aligned}$$

Since \(X_{ni} ^{(4)}=X_{ni} ^{(2)}\) for every \(1\le i\le n\) on the event \(\left\{ \max _{1\le i\le n}X_{i}\le n^{\alpha }\right\} \), it is easily checked that

$$\begin{aligned} \left( \max _{1\le j\le n}\left| \sum _{i=1}^{j} X_{ni} ^{(2)}\right|> \frac{\varepsilon n^{\alpha }}{3} \right) \subset \left( \max _{1\le i\le n}X_{i}> n^{\alpha }\right) \bigcup \left( \max _{1\le j\le n}\left| \sum _{i=1}^{j} X_{ni} ^{(4)}\right| > \frac{\varepsilon n^{\alpha }}{3} \right) , \end{aligned}$$

which implies that

$$\begin{aligned} I_{2}&\le \sum _{n=1}^{\infty } n^{\alpha p-2} \sum _{i=1}^{n} P(|X_{i}|> n^{\alpha }) + \sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \max _{1\le j\le n}\left| \sum _{i=1}^{j} X_{ni} ^{(4)}\right| > \frac{\varepsilon n^{\alpha }}{3} \right) \\ &\doteq I_{21}+ I_{22}. \end{aligned}$$
(4.11)

It follows by \(E|X|^{p} < \infty \) that

$$\begin{aligned} I_{21} \le C\sum _{n=1}^{\infty } n^{\alpha p-1} P(|X|> n^{\alpha }) \le C E|X|^{p} < \infty . \end{aligned}$$
(4.12)

Noting that \(\frac{1}{\alpha p}<q<1\), we have by the definition of \(X_{ni} ^{(4)}\) and Property 1.1 that

$$\begin{aligned} n^{-\alpha } \max _{1\le j \le n} \left| \sum _{i=1}^{j} E X_{ni} ^{(4)} \right| &\le C n^{1-\alpha } E|X|I(|X|> n^{\alpha q}) \\ &\le Cn ^{1-\alpha +\alpha q- \alpha pq } E|X|^{p} \rightarrow 0,~~\text {as}~~n\rightarrow \infty . \end{aligned}$$
(4.13)

Since \(X_{ni} ^{(4)}\ge 0\), the partial sums \(\sum _{i=1}^{j}X_{ni} ^{(4)}\) are nondecreasing in \(j\), and we have by (4.11)–(4.13) that

$$\begin{aligned} I_{2}\le C\sum _{n=1}^{\infty } n^{\alpha p-2} P \left( \left| \sum _{i=1}^{n} \left( X_{ni} ^{(4)} - EX_{ni} ^{(4)} \right) \right| >\frac{\varepsilon n^{\alpha }}{6} \right) . \end{aligned}$$
(4.14)

For fixed \( n\ge 1\), we can see that \(\{X_{ni} ^{(4)}-EX_{ni} ^{(4)}, 1\le i\le n\}\) are still END random variables by Lemma 3.3. It follows by Markov’s inequality, \(C_r\) inequality and Lemma 3.4 that

$$\begin{aligned} I_{2}&\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } E \left| \sum _{i=1}^{n} \left( X_{ni} ^{(4)} - EX_{ni} ^{(4)} \right) \right| ^{\delta } \\ &\le C \sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } \left[ \sum _{i=1}^{n} E|X_{ni} ^{(4)}|^{\delta } + \left( \sum _{i=1}^{n} E(X_{ni} ^{(4)})^{2} \right) ^{\delta /2} \right] \\ &\doteq J_{1}+ J_{2}. \end{aligned}$$
(4.15)

By the \(C_r\)-inequality and Property 1.1 again, we obtain

$$\begin{aligned} J_{1}&\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } \sum _{i=1}^{n} \left[ E |X_{i}- n^{\alpha q}|^{\delta } I (n^{\alpha q}< X_{i} \le n^{\alpha }+ n^{\alpha q} ) + n^{\alpha \delta } P ( X_{i}> n^{\alpha }+ n^{\alpha q} ) \right] \\ &\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } \sum _{i=1}^{n} \left[ E |X_{i}|^{\delta } I(|X_{i}| \le 2 n^{\alpha }) + n^{\alpha \delta } P ( X_{i}> n^{\alpha } )\right] \\ &\le C\sum _{n=1}^{\infty } n^{\alpha p-2-\alpha \delta } \sum _{i=1}^{n}\left[ E |X|^{\delta } I(|X| \le 2 n^{\alpha }) + n^{\alpha \delta } P ( |X |> n^{\alpha } ) \right] \\ &\le C\sum _{n=1}^{\infty } n^{\alpha p-1-\alpha \delta } E |X|^{\delta } I(|X| \le 2 n^{\alpha }) +CE|X|^p \\ &= C\sum _{n=1}^{\infty } n^{\alpha p-1-\alpha \delta } \sum _{i=1}^nE |X|^{\delta } I(2(i-1)^\alpha<|X| \le 2 i^{\alpha }) +CE|X|^p \\ &\le C \sum _{i=1}^\infty i^{\alpha \delta } P(2(i-1)^\alpha<|X| \le 2 i^{\alpha })\sum _{n=i}^{\infty } n^{\alpha p-1-\alpha \delta } +CE|X|^p \\ &\le C \sum _{i=1}^\infty i^{\alpha p} P(2(i-1)^\alpha<|X| \le 2 i^{\alpha }) +CE|X|^p \\ &\le CE|X|^p < \infty . \end{aligned}$$
(4.16)
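
The last three estimates in (4.16) rest on two elementary facts. First, one bounds \(E|X|^{\delta }I(2(i-1)^{\alpha }<|X|\le 2i^{\alpha })\le (2i^{\alpha })^{\delta }P(2(i-1)^{\alpha }<|X|\le 2i^{\alpha })\) and interchanges the order of summation (over \(i\le n\)); then, since \(\alpha p-1-\alpha \delta <-1\) (recall \(\delta >p\)),

$$\begin{aligned} i^{\alpha \delta }\sum _{n=i}^{\infty }n^{\alpha p-1-\alpha \delta }\le Ci^{\alpha \delta }\cdot i^{\alpha p-\alpha \delta }=Ci^{\alpha p}. \end{aligned}$$

Second, for \(i\ge 2\) one has \(i^{\alpha p}\le 2^{\alpha p}(i-1)^{\alpha p}\le C|X|^{p}\) on the event \(\{2(i-1)^{\alpha }<|X|\le 2i^{\alpha }\}\), which yields the final bound \(CE|X|^p\).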

Similarly to the proofs of (4.10) and (4.16), we can show that \(J_2<\infty \), which together with (4.15) and (4.16) yields \(I_2<\infty \).

Similarly to the proof of \(I_2<\infty \), one can show that \(I_3<\infty \). Hence, (3.2) follows immediately from (4.5) together with \(I_1<\infty \), \(I_2<\infty \) and \(I_3<\infty \). By a standard argument, (3.3) follows from (3.2). This completes the proof of the lemma. \(\square \)

Cite this article

Wang, X., Wu, Y., Wang, R. et al. On consistency of wavelet estimator in nonparametric regression models. Stat Papers 62, 935–962 (2021). https://doi.org/10.1007/s00362-019-01117-8
