## 1 Introduction

Let $\left(\mathrm{\Omega },\mathcal{F},\mathcal{P}\right)$ be a probability space and $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of random variables defined on it.

A finite family of random variables $\{X_i, 1\le i\le n\}$ is said to be negatively associated (NA) if, for every pair of disjoint subsets $A,B\subset \{1,2,\dots ,n\}$ and all real coordinatewise nondecreasing functions $f$ on $R^A$ and $g$ on $R^B$, we have

$Cov\left(f\left({X}_{i},i\in A\right),g\left({X}_{j},j\in B\right)\right)\le 0.$
(1.1)

An infinite family of random variables is negatively associated if every finite subfamily is negatively associated. This concept was introduced by Joag-Dev and Proschan [1].

A new dependence structure, asymptotic negative association, was proposed by Zhang [2, 3]; it is a useful weakening of negative association (see also Yuan and Wu [4]).

Definition (Yuan and Wu [4])

A sequence $\{X_n, n\ge 1\}$ of random variables is said to be asymptotically negatively associated (ANA) if

$\rho^-(s)=\sup \left\{\rho^-(S,T): S,T\subset \{1,2,\dots \},\ \operatorname{dist}(S,T)\ge s\right\}\to 0\quad \text{as } s\to \infty ,$
(1.2)

where

$\rho^-(S,T)=0\vee \sup \left\{\frac{\operatorname{Cov}\left(f\left(X_i,i\in S\right),g\left(X_j,j\in T\right)\right)}{\left(\operatorname{Var}f\left(X_i,i\in S\right)\right)^{\frac{1}{2}}\left(\operatorname{Var}g\left(X_j,j\in T\right)\right)^{\frac{1}{2}}}: f,g\in \mathcal{C}\right\},$
(1.3)

and $\mathcal{C}$ is the set of coordinatewise nondecreasing functions.

It is obvious that an ANA sequence is negatively associated if and only if $\rho^-(1)=0$. Compared with negative association, asymptotic negative association defines a strictly larger class of random variables (for detailed examples, see Zhang [2]).

Consequently, limit theorems for asymptotically negatively associated random variables are of considerable interest.

For example, Zhang [3] proved the central limit theorem, Wang and Lu [5] obtained some inequalities of the maximum of partial sums and weak convergence, Wang and Zhang [6] established the law of the iterated logarithm, and Yuan and Wu [4] showed the limiting behavior of the maximum of partial sums.

Hájek and Rényi [7] proved that if $\{X_n, n\ge 1\}$ is a sequence of independent random variables with $EX_n=0$ and $EX_n^2<\infty$, $n\ge 1$, and $\{b_n, n\ge 1\}$ is a sequence of positive nondecreasing real numbers, then for any $\epsilon>0$ and any positive integers $m<n$,

$P\left(\underset{m\le j\le n}{max}|\frac{{\sum }_{i=1}^{j}{X}_{i}}{{b}_{j}}|\ge ϵ\right)\le {ϵ}^{-2}\left(\sum _{j=m+1}^{n}\frac{E{X}_{j}^{2}}{{b}_{j}^{2}}+\sum _{j=1}^{m}\frac{E{X}_{j}^{2}}{{b}_{m}^{2}}\right).$
(1.4)

Since then, this inequality has attracted the attention of many authors (e.g., Chow [8] and Gan [9] for martingales, Liu et al. [10] for negatively associated random variables, and Kim et al. [11] for asymptotically almost negatively associated random variables).

Inspired by Kim et al. [11], we obtain a Hájek-Rényi type inequality for asymptotically negatively associated random variables and use it to prove a strong law of large numbers.

## 2 Hájek-Rényi inequality for ANA random variables

Lemma 2.1 (Yuan and Wu [4])

Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of asymptotically negatively associated (ANA) random variables and $\left\{{a}_{n},n\ge 1\right\}$ a sequence of positive numbers. Then $\left\{{a}_{n}{X}_{n},n\ge 1\right\}$ is still a sequence of ANA random variables.

From Wang and Lu’s [5] Rosenthal type inequality for asymptotically negatively associated random variables we obtain the following.

Lemma 2.2 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{X_n, n\ge 1\}$ be a sequence of asymptotically negatively associated random variables with $\rho^-(N)\le r$, $EX_n=0$, and $EX_n^2<\infty$. Then there is a positive constant $D=D(2,N,r)$ such that, for all $n\ge 1$,

$E\underset{1\le i\le n}{max}{|\sum _{j=1}^{i}{X}_{j}|}^{2}\le D\left(\sum _{j=1}^{n}E{X}_{j}^{2}\right).$
(2.1)

Theorem 2.3 Let $\{b_n, n\ge 1\}$ be a sequence of positive nondecreasing real numbers and let $\{X_1,\dots ,X_n\}$ be mean zero, square integrable ANA random variables with $\rho^-(N)\le r$ for some positive integer N and $0\le r<\frac{1}{12}$. Let $\sigma_k^2=EX_k^2$. Then, for any $\epsilon>0$,

$P\left\{\max_{1\le k\le n}\left|\frac{\sum_{i=1}^{k}X_i}{b_k}\right|\ge \epsilon \right\}\le 4D\epsilon^{-2}\sum_{k=1}^{n}\frac{\sigma_k^2}{b_k^2},$
(2.2)

where D is a positive constant defined in Lemma  2.2.

Proof First note that $\{\frac{X_1}{b_1},\dots ,\frac{X_n}{b_n}\}$ is a sequence of mean zero, square integrable ANA random variables by Lemma 2.1, and hence it satisfies (1.2). Set $b_0=0$. Since

${b}_{k}^{-1}\sum _{j=1}^{k}\left({b}_{j}-{b}_{j-1}\right)=1,$

we get

$\sum _{j=1}^{k}{X}_{j}=\sum _{j=1}^{k}\left(\frac{{X}_{j}}{{b}_{j}}\sum _{i=1}^{j}\left({b}_{i}-{b}_{i-1}\right)\right)=\sum _{i=1}^{k}\left({b}_{i}-{b}_{i-1}\right)\sum _{i\le j\le k}\frac{{X}_{j}}{{b}_{j}}$

and

$\left\{|\frac{{\sum }_{j=1}^{k}{X}_{j}}{{b}_{k}}|\ge ϵ\right\}\subset \left\{\underset{1\le i\le k}{max}|\sum _{i\le j\le k}\frac{{X}_{j}}{{b}_{j}}|\ge ϵ\right\}.$
(2.3)

From (2.3) we have

$\left\{\max_{1\le k\le n}\left|\frac{\sum_{j=1}^{k}X_j}{b_k}\right|\ge \epsilon \right\}\subset \left\{\max_{1\le k\le n}\max_{1\le i\le k}\left|\sum_{i\le j\le k}\frac{X_j}{b_j}\right|\ge \epsilon \right\}=\left\{\max_{1\le i\le k\le n}\left|\sum_{j\le k}\frac{X_j}{b_j}-\sum_{j<i}\frac{X_j}{b_j}\right|\ge \epsilon \right\}\subset \left\{\max_{1\le k\le n}\left|\sum_{j=1}^{k}\frac{X_j}{b_j}\right|\ge \frac{\epsilon}{2}\right\}.$

Hence, by Markov's inequality and Lemma 2.2, the desired result (2.2) follows. □

From Theorem 2.3 we can derive the following more general Hájek-Rényi type inequality.

Theorem 2.4 Let $\{b_n, n\ge 1\}$ be a sequence of positive nondecreasing real numbers. Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{X_n, n\ge 1\}$ be a sequence of mean zero, square integrable ANA random variables with $\rho^-(N)\le r$. Let $\sigma_k^2=EX_k^2$, $k\ge 1$. Then, for any $\epsilon>0$ and any positive integers $m<n$ we have

$P\left(\underset{m\le k\le n}{max}|\frac{{\sum }_{j=1}^{k}{X}_{j}}{{b}_{k}}|\ge ϵ\right)\le 16D{ϵ}^{-2}\left(\sum _{j=m+1}^{n}\frac{{\sigma }_{j}^{2}}{{b}_{j}^{2}}+\sum _{j=1}^{m}\frac{{\sigma }_{j}^{2}}{{b}_{m}^{2}}\right),$
(2.4)

where D is a positive constant defined in Lemma  2.2.

Proof By Theorem 2.3 we have

$\begin{array}{r}P\left\{\underset{m\le k\le n}{max}|\frac{{\sum }_{j=1}^{k}{X}_{j}}{{b}_{k}}|\ge ϵ\right\}\\ \phantom{\rule{1em}{0ex}}\le P\left\{|\frac{{\sum }_{j=1}^{m}{X}_{j}}{{b}_{m}}|\ge \frac{ϵ}{2}\right\}+P\left\{\underset{m+1\le k\le n}{max}|\frac{{\sum }_{j=m+1}^{k}{X}_{j}}{{b}_{k}}|\ge \frac{ϵ}{2}\right\}\\ \phantom{\rule{1em}{0ex}}\le P\left\{\frac{1}{{b}_{m}}\underset{1\le k\le m}{max}|\sum _{j=1}^{k}{X}_{j}|\ge \frac{ϵ}{2}\right\}+P\left\{\underset{m+1\le k\le n}{max}|\frac{{\sum }_{j=m+1}^{k}{X}_{j}}{{b}_{k}}|\ge \frac{ϵ}{2}\right\}\\ \phantom{\rule{1em}{0ex}}\le 16D{ϵ}^{-2}\left(\sum _{j=m+1}^{n}\frac{{\sigma }_{j}^{2}}{{b}_{j}^{2}}+\sum _{j=1}^{m}\frac{{\sigma }_{j}^{2}}{{b}_{m}^{2}}\right).\end{array}$

Hence the proof is complete. □

## 3 Strong law of large numbers for ANA random variables

Using the Hájek-Rényi inequality for ANA random variables, we now prove a strong law of large numbers for ANA random variables.

Theorem 3.1 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{b_n, n\ge 1\}$ be a sequence of positive nondecreasing real numbers and $\{X_n, n\ge 1\}$ a sequence of mean zero, square integrable ANA random variables with $\rho^-(N)\le r$. Let $\sigma_k^2=EX_k^2$, $k\ge 1$, and $S_n=\sum_{i=1}^{n}X_i$. Assume

$\sum _{k=1}^{\mathrm{\infty }}\frac{{\sigma }_{k}^{2}}{{b}_{k}^{2}}<\mathrm{\infty }.$
(3.1)

Then, for any $0<p<2$:

1. (A)

$E{sup}_{n}{\left(|{S}_{n}|/{b}_{n}\right)}^{p}<\mathrm{\infty }$,

2. (B)

$0<{b}_{n}↑\mathrm{\infty }$ implies ${S}_{n}/{b}_{n}\to 0$ a.s. as $n\to \mathrm{\infty }$.

Proof (A) Note that

$E{\left(\underset{n}{sup}\frac{|{S}_{n}|}{{b}_{n}}\right)}^{p}<\mathrm{\infty }\phantom{\rule{1em}{0ex}}⇔\phantom{\rule{1em}{0ex}}{\int }_{1}^{\mathrm{\infty }}P\left(\underset{n}{sup}\frac{|{S}_{n}|}{{b}_{n}}>{t}^{\frac{1}{p}}\right)\phantom{\rule{0.2em}{0ex}}dt<\mathrm{\infty }.$

By Theorem 2.3, it follows from (3.1) that

$\begin{array}{rcl}{\int }_{1}^{\mathrm{\infty }}P\left(\underset{n}{sup}\frac{|{S}_{n}|}{{b}_{n}}>{t}^{\frac{1}{p}}\right)\phantom{\rule{0.2em}{0ex}}dt& \le & 4D\underset{n\to \mathrm{\infty }}{lim}{\int }_{1}^{\mathrm{\infty }}{t}^{-2/p}\phantom{\rule{0.2em}{0ex}}dt\sum _{k=1}^{n}\frac{{\sigma }_{k}^{2}}{{b}_{k}^{2}}\\ =& 4D\underset{n\to \mathrm{\infty }}{lim}\sum _{k=1}^{n}\frac{{\sigma }_{k}^{2}}{{b}_{k}^{2}}{\int }_{1}^{\mathrm{\infty }}{t}^{-\frac{2}{p}}\phantom{\rule{0.2em}{0ex}}dt<\mathrm{\infty },\end{array}$

where D is a positive constant defined in Lemma 2.2.

Hence the proof of (A) is complete.

(B) By Theorem 2.4 we get

$P\left(\max_{m\le k\le n}\frac{|S_k|}{b_k}\ge \epsilon \right)\le 16D\epsilon^{-2}\left(\sum_{j=m+1}^{n}\frac{\sigma_j^2}{b_j^2}+\sum_{j=1}^{m}\frac{\sigma_j^2}{b_m^2}\right).$

But by assumption (3.1) we have

$P\left\{\underset{k\ge m}{sup}\frac{|{S}_{k}|}{{b}_{k}}\ge ϵ\right\}=\underset{n\to \mathrm{\infty }}{lim}P\left\{\underset{m\le k\le n}{max}\frac{|{S}_{k}|}{{b}_{k}}\ge ϵ\right\}\le 16D{ϵ}^{-2}\left(\sum _{j=m+1}^{\mathrm{\infty }}\frac{{\sigma }_{j}^{2}}{{b}_{j}^{2}}+\sum _{j=1}^{m}\frac{{\sigma }_{j}^{2}}{{b}_{m}^{2}}\right).$
(3.2)

By the Kronecker lemma and (3.1) we get

$\lim_{m\to \infty }\frac{1}{b_m^2}\sum_{j=1}^{m}\sigma_j^2=0.$
(3.3)

Hence, by combining (3.1), (3.2), and (3.3) we have

$\underset{n\to \mathrm{\infty }}{lim}P\left\{\underset{k\ge n}{sup}\frac{|{S}_{k}|}{{b}_{k}}\ge ϵ\right\}=0,$

i.e., ${S}_{n}/{b}_{n}\to 0$ a.s. as $n\to \mathrm{\infty }$. □

Corollary 3.2 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{X_n, n\ge 1\}$ be a sequence of mean zero, square integrable ANA random variables with $\rho^-(N)\le r$. Then, for $0<t<2$,

$P\left(\underset{k\ge m}{sup}\frac{|{S}_{k}|}{{k}^{1/t}}\ge ϵ\right)\le 4D{ϵ}^{-2}\frac{2}{2-t}\underset{k}{sup}{\sigma }_{k}^{2}{m}^{\left(t-2\right)/t},$

for all $\epsilon>0$ and $m\ge 1$, where D is a positive constant defined in Lemma 2.2,

${S}_{n}=\sum _{j=1}^{n}{X}_{j}\phantom{\rule{1em}{0ex}}\mathit{\text{and}}\phantom{\rule{1em}{0ex}}{\sigma }_{n}^{2}=E{X}_{n}^{2},\phantom{\rule{1em}{0ex}}n\ge 1.$

Corollary 3.3 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of mean zero and square integrable ANA random variables with ${\rho }^{-}\left(N\right)\le r$ and $E{X}_{n}^{2}<\mathrm{\infty }$. Assume that

$\underset{n}{sup}{\sigma }_{n}^{2}<\mathrm{\infty },$

where $\sigma_n^2=EX_n^2$, $n\ge 1$. Then, for $0<t<2$:

1. (A)

$\left({S}_{n}/{n}^{1/t}\right)\to 0$ a.s. as $n\to \mathrm{\infty }$,

2. (B)

$E\sup_n\left(|S_n|/n^{1/t}\right)^p<\infty$ for any $0<p<2$, where $S_n=\sum_{j=1}^{n}X_j$.

Finally, as an application of Theorem 3.1, we consider almost sure convergence for weighted sums of ANA random variables.

Theorem 3.4 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\left\{{a}_{ni},1\le i\le n,n\ge 1\right\}$ be an array of real numbers with ${a}_{ni}=0$, $i>n$, ${sup}_{n\ge 1}{\sum }_{i=1}^{n}|{a}_{ni}|<\mathrm{\infty }$, and $\left\{{b}_{n},n\ge 1\right\}$ be a sequence of positive nondecreasing real numbers such that $0<{b}_{n}↑\mathrm{\infty }$ and let $\left\{{X}_{n},n\ge 1\right\}$ be a sequence of mean zero, square integrable ANA random variables satisfying ${\rho }^{-}\left(N\right)\le r$, $E{X}_{n}^{2}<\mathrm{\infty }$, and (3.1). Then

$\frac{1}{b_n}\sum_{i=1}^{n}a_{ni}X_i\to 0\quad \text{a.s. as } n\to \infty .$
(3.4)

Proof Set $a_{n,n+1}=0$, $T_0=0$, and define

$T_i=\frac{1}{b_i}\sum_{j=1}^{i}X_j,\qquad c_{ni}=\frac{(a_{ni}-a_{n,i+1})b_i}{b_n},\quad 1\le i\le n.$

Then we obtain

$T_n\to 0\quad \text{a.s. as } n\to \infty \ \text{(by Theorem 3.1(B))},$
(3.5)
$\sum_{i=1}^{n}\frac{a_{ni}X_i}{b_n}=\sum_{i=1}^{n}c_{ni}T_i,$
(3.6)
$\sum_{i=1}^{n}|c_{ni}|\le 2\sup_{n\ge 1}\sum_{i=1}^{n}|a_{ni}|<\infty ,$
(3.7)

and

$\lim_{n\to \infty }|c_{ni}|=0\quad \text{for every fixed } i.$
(3.8)

Note that if an array of real numbers $\{c_{ni}, 1\le i\le n, n\ge 1\}$ satisfies $\sup_{n\ge 1}\sum_{i=1}^{n}|c_{ni}|<\infty$ and $\lim_{n\to \infty }|c_{ni}|=0$ for every fixed i, then for every sequence of real numbers $d_n$ with $d_n\to 0$ as $n\to \infty$,

$\lim_{n\to \infty }\sum_{i=1}^{n}c_{ni}d_i=0.$
(3.9)

Hence, from the above fact and (3.5)-(3.9), the desired result (3.4) follows. (See Kim et al. [11] for more details.) □

Theorem 3.5 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{a_{ni}, 1\le i\le n, n\ge 1\}$ be an array of real numbers with $a_{ni}=0$, $i>n$, and $\sup_{n\ge 1}\sum_{i=1}^{n}|a_{ni}|<\infty$, and let $\{X_n, n\ge 1\}$ be a sequence of mean zero, square integrable ANA random variables with $\rho^-(N)\le r$ and $\sup_n\sigma_n^2<\infty$, where $\sigma_n^2=EX_n^2$, $n\ge 1$. Then, for some $0<t<2$,

$\frac{1}{n^{1/t}}\sum_{i=1}^{n}a_{ni}X_i\to 0\quad \text{a.s. as } n\to \infty .$

Proof Putting $b_n=n^{1/t}$ in Corollary 3.3 and Theorem 3.4 yields the result; the details are omitted. □

Now we prove the Marcinkiewicz strong law of large numbers for ANA random variables by using Theorem 3.1. The method of proof is the same as that used in the classical Marcinkiewicz strong law of large numbers for i.i.d. random variables (see Stout [[12], Theorem 3.2.3]).

Theorem 3.6 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed ANA random variables with $EX_1=0$, $E|X_1|^t<\infty$ for some $0<t<1$, and $\rho^-(N)\le r$. Then

$\frac{1}{n^{1/t}}\sum_{j=1}^{n}X_j\to 0\quad \text{a.s. as } n\to \infty .$
(3.10)

Sketch of proof To prove (3.10) it suffices to show that

$\frac{1}{n^{1/t}}\sum_{j=1}^{n}X_j^+\to 0\quad \text{a.s. as } n\to \infty$
(3.11)

and

$\frac{1}{n^{1/t}}\sum_{j=1}^{n}X_j^-\to 0\quad \text{a.s. as } n\to \infty ,$
(3.12)

where ${X}_{j}^{+}=max\left({X}_{j},0\right)$ and ${X}_{j}^{-}=max\left(-{X}_{j},0\right)$.

Note that $\left\{{X}_{j}^{+},j\ge 1\right\}$ and $\left\{{X}_{j}^{-},j\ge 1\right\}$ are sequences of identically distributed ANA random variables. We only show (3.11). Equation (3.12) can be proved similarly.

Set $Y_j=X_j^+\wedge j^{\frac{1}{t}}$, $j\ge 1$. Then $\{Y_j, j\ge 1\}$ is again a sequence of ANA random variables, since each $Y_j$ is a nondecreasing function of $X_j$ alone.

Note that $E{|{X}_{1}|}^{t}<\mathrm{\infty }⇔{\sum }_{n=1}^{\mathrm{\infty }}P\left(|{X}_{1}|>{n}^{\frac{1}{t}}\right)<\mathrm{\infty }$.

$P\left(Y_j\ne X_j^+\right)=P\left(X_1^+\wedge j^{\frac{1}{t}}\ne X_1^+\right)\le P\left(X_1^+>j^{\frac{1}{t}}\right)\le P\left(|X_1|>j^{\frac{1}{t}}\right).$

So

$\sum_{j=1}^{\infty }P\left(Y_j\ne X_j^+\right)\le \sum_{j=1}^{\infty }P\left(|X_1|>j^{\frac{1}{t}}\right)<\infty .$
(3.13)

We will prove

$\frac{1}{n^{1/t}}\sum_{j=1}^{n}EY_j\to 0\quad \text{as } n\to \infty .$
(3.14)

Notice that

$\begin{array}{rcl}\sum _{n=1}^{\mathrm{\infty }}\frac{E{Y}_{n}}{{n}^{\frac{1}{t}}}& =& \sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{1}{t}}\left(E{X}_{1}^{+}I\left({X}_{1}^{+}\le {n}^{\frac{1}{t}}\right)+{n}^{\frac{1}{t}}P\left({X}_{1}^{+}>{n}^{\frac{1}{t}}\right)\right)\\ =& \sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{1}{t}}E{X}_{1}^{+}I\left({X}_{1}^{+}\le {n}^{\frac{1}{t}}\right)+\sum _{n=1}^{\mathrm{\infty }}P\left({X}_{1}^{+}>{n}^{\frac{1}{t}}\right)\\ \le & \sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{1}{t}}\sum _{k=1}^{n}E{X}_{1}^{+}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)+\sum _{n=1}^{\mathrm{\infty }}P\left(|{X}_{1}|>{n}^{\frac{1}{t}}\right)\\ \le & \sum _{k=1}^{\mathrm{\infty }}E{X}_{1}^{+}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)\sum _{n=k}^{\mathrm{\infty }}{n}^{-\frac{1}{t}}+E{|{X}_{1}|}^{t}\\ \le & C\sum _{k=1}^{\mathrm{\infty }}{k}^{-\frac{1}{t}+1}E{X}_{1}^{+}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)+E{|{X}_{1}|}^{t}\\ \le & CE{|{X}_{1}|}^{t}<\mathrm{\infty }.\end{array}$
(3.15)

By Kronecker’s lemma and (3.15) we see that (3.14) is true.

We also have

$\begin{array}{r}\sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}E{\left({Y}_{n}-E{Y}_{n}\right)}^{2}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}E{Y}_{n}^{2}\\ \phantom{\rule{1em}{0ex}}=C\sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}E{\left({X}_{1}^{+}\wedge {n}^{\frac{1}{t}}\right)}^{2}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}E{\left({X}_{1}^{+}\right)}^{2}I\left({X}_{1}^{+}\le {n}^{\frac{1}{t}}\right)+\sum _{n=1}^{\mathrm{\infty }}P\left({X}_{1}^{+}>{n}^{\frac{1}{t}}\right)\\ \phantom{\rule{1em}{0ex}}\le C\sum _{n=1}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}\sum _{k=1}^{n}E{\left({X}_{1}^{+}\right)}^{2}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)+E{|{X}_{1}|}^{t}\\ \phantom{\rule{1em}{0ex}}=C\sum _{k=1}^{\mathrm{\infty }}E{\left({X}_{1}^{+}\right)}^{2}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)\sum _{n=k}^{\mathrm{\infty }}{n}^{-\frac{2}{t}}+E{|{X}_{1}|}^{t}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{k=1}^{\mathrm{\infty }}{k}^{-\frac{2}{t}+1}E{\left({X}_{1}^{+}\right)}^{2}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)+E{|{X}_{1}|}^{t}\\ \phantom{\rule{1em}{0ex}}\le C\sum _{k=1}^{\mathrm{\infty }}{k}^{-\frac{2}{t}+1}{k}^{\frac{2}{t}-1}E{\left({X}_{1}^{+}\right)}^{t}I\left({\left(k-1\right)}^{\frac{1}{t}}<{X}_{1}^{+}\le {k}^{\frac{1}{t}}\right)+E{|{X}_{1}|}^{t}\\ \phantom{\rule{1em}{0ex}}\le CE{|{X}_{1}|}^{t}<\mathrm{\infty }.\end{array}$
(3.16)

By Theorem 3.1 and (3.13)-(3.16) the proof of (3.11) is complete. □

Theorem 3.7 Let $0\le r<\frac{1}{12}$ and N be a positive integer. Let $\{a_{ni}, 1\le i\le n, n\ge 1\}$ be an array of real numbers with $\sup_{n\ge 1}\sum_{i=1}^{n}|a_{ni}|<\infty$ and let $\{X_n, n\ge 1\}$ be a sequence of identically distributed ANA random variables with $\rho^-(N)\le r$, $EX_1=0$, and $E|X_1|^t<\infty$ for some $0<t<1$. Then

$\frac{1}{n^{1/t}}\sum_{i=1}^{n}a_{ni}X_i\to 0\quad \text{a.s. as } n\to \infty .$
(3.17)

Proof Combining the ideas in the proofs of Theorem 3.4 and Theorem 3.6 yields (3.17); the details are omitted. □