1 Introduction

A sequence of random variables \(\{X_{n}, n\geq1\}\) is said to converge completely to a constant c if \(\sum_{n=1}^{\infty}P(|X_{n}-c|>\epsilon )<\infty\) for all \(\epsilon>0\). This concept of complete convergence was introduced by Hsu and Robbins [2], who proved that the sequence of arithmetic means of independent identically distributed random variables converges completely to the expected value of the summands, provided that the variance is finite. The converse was proved by Erdös [3]. The Hsu-Robbins-Erdös theorem has been generalized in various directions. Katz [4], Baum and Katz [5], and Chow [6] obtained generalizations of complete convergence for sequences of independent identically distributed random variables under normalization of Marcinkiewicz-Zygmund type (see Gut [7]). Chow [8] was the first to establish complete moment convergence for a sequence of i.i.d. random variables, generalizing the result of Baum and Katz [5].

The concept of complete moment convergence is as follows. Let \(\{Y_{n}, n\geq1\}\) be a sequence of random variables, and \(a_{n}>0\), \(b_{n}>0\), \(q>0\).

If \(\sum_{n=1}^{\infty}a_{n} E\{b_{n}^{-1}|Y_{n}|-\epsilon\}_{+}^{q}<\infty\) for all \(\epsilon>0\), then \(\{Y_{n}, n\geq1\}\) is said to satisfy complete moment convergence. It is well known that complete moment convergence implies complete convergence.
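For completeness, here is the standard verification of the last claim: for every \(\epsilon>0\),

$$a_{n} E\bigl\{ b_{n}^{-1}|Y_{n}|-\epsilon\bigr\} _{+}^{q}\geq a_{n} E\bigl[\bigl\{ b_{n}^{-1}|Y_{n}|-\epsilon\bigr\} _{+}^{q} I\bigl(|Y_{n}|>2\epsilon b_{n}\bigr)\bigr]\geq\epsilon^{q} a_{n} P\bigl(|Y_{n}|>2\epsilon b_{n}\bigr), $$

so \(\sum_{n=1}^{\infty}a_{n}P(|Y_{n}|>2\epsilon b_{n})\leq\epsilon^{-q}\sum_{n=1}^{\infty}a_{n}E\{b_{n}^{-1}|Y_{n}|-\epsilon\}_{+}^{q}<\infty\) for all \(\epsilon>0\), which is the corresponding complete convergence statement.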

Since then, many related results have been obtained. For example, Sung [9] obtained a moment inequality for the maximum partial sum of random variables and proved the complete moment convergence of i.i.d. random variables, Liang and Li [10] provided necessary and sufficient moment conditions for the complete moment convergence of identically distributed negatively associated random variables, and Wu et al. [11] established the complete moment convergence of identically distributed \(\rho^{*}\)-mixing random variables.

The concept of negative association was introduced by Joag-Dev and Proschan [12] in the following way. A finite family of random variables \(\{X_{i}, 1\leq i \leq n\}\) is said to be negatively associated if for every pair of disjoint nonempty subsets A and B of \(\{1,2,\ldots, n\}\) and any real coordinatewise nondecreasing functions f and g,

$$ \operatorname{Cov}\bigl(f(X_{i}, i\in A), g(X_{j}, j\in B)\bigr)\leq0 $$
(1.1)

whenever f and g are such that the covariance exists. An infinite family of random variables is negatively associated if every finite subfamily is negatively associated.
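As an illustration of this definition (not needed for the results below), the coordinates of a multinomial random vector are negatively associated (Joag-Dev and Proschan [12]), so the covariance in (1.1) can be checked numerically. The following minimal Monte Carlo sketch assumes NumPy; the choice of coordinatewise nondecreasing f and g acting on the disjoint index sets \(A=\{1,2\}\) and \(B=\{3,4\}\) is arbitrary, and the printed estimate should be nonpositive up to simulation error.

    # Monte Carlo check of (1.1) for multinomial counts, which are negatively associated.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.multinomial(n=20, pvals=[0.2, 0.3, 0.1, 0.4], size=100_000)

    f = X[:, 0] + X[:, 1]             # nondecreasing in the coordinates indexed by A
    g = np.maximum(X[:, 2], X[:, 3])  # nondecreasing in the coordinates indexed by B

    print(np.cov(f, g)[0, 1])         # estimated Cov(f(X_i, i in A), g(X_j, j in B)); expected <= 0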

A sequence of random variables \(\{X_{n}, n\geq1\}\) is said to satisfy a weak mean dominating condition with mean dominating random variable X if for some \(c>0\),

$$ \frac{1}{n}\sum_{i=1}^{n} P\bigl(|X_{i}|>x\bigr)\leq cP\bigl(|X|>x\bigr) \quad \mbox{for all } x>0 \mbox{ and all } n\geq1. $$
(1.2)
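For instance, if the \(X_{n}\) are identically distributed with the same distribution as X, then (1.2) holds with \(c=1\), since in that case

$$\frac{1}{n}\sum_{i=1}^{n} P\bigl(|X_{i}|>x\bigr)=P\bigl(|X|>x\bigr) \quad \mbox{for all } x>0 \mbox{ and all } n\geq1. $$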

Recently, Kuczmaszewska [1] proved the complete convergence for a sequence of negatively associated random variables satisfying (1.2) (see Lemma 2.3).

In this paper, we prove complete moment convergence for sequences of negatively associated random variables satisfying a weak mean dominating condition. This improves the complete convergence result for negatively associated random variables of Kuczmaszewska [1] and provides conditions for complete moment convergence for a nonstationary sequence of negatively associated random variables.

2 Preliminaries

In our further considerations, we need the following lemmas.

Lemma 2.1

(Shao [13])

Let \(\{X_{i}, 1\leq i \leq n\}\) be a sequence of negatively associated random variables with \(EX_{i}=0\) and \(E|X_{i}|^{q}<\infty\) for \(q>1\) and every \(1\leq i\leq n\). Then

$$ E\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|^{q}\leq2^{3-q}\sum _{i=1}^{n} E|X_{i}|^{q} \quad \textit{for } 1< q\leq2 $$
(2.1)

and

$$ E\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|^{q}\leq2\biggl(\frac{15 q}{\ln q}\biggr)^{q} \Biggl\{ \sum_{i=1}^{n} E|X_{i}|^{q}+ \Biggl(\sum_{i=1}^{n}EX_{i}^{2} \Biggr)^{\frac{q}{2}}\Biggr\} \quad\textit{for } q> 2. $$
(2.2)

Remark

In Lemma 2.1, for \(q=1\), the triangle inequality gives

$$E\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|\leq\sum_{i=1}^{n} E|X_{i}|. $$
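In the proofs below, Lemma 2.1 is used together with the Markov inequality: for \(t>0\) and \(q\geq1\),

$$P\Biggl(\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|>t\Biggr)\leq t^{-q}E\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|^{q}, $$

and the right-hand side is then estimated by (2.1), (2.2), or the remark above.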

Lemma 2.2

(Gut [7])

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables satisfying a weak mean dominating condition with mean dominating random variable X, that is, for some \(c>0\), (1.2) holds for all \(x>0\) and \(n\geq1\).

Let \(r>0\) and, for \(A>0\), set

$$\begin{aligned}& X_{i}^{\prime}=X_{i} I\bigl(|X_{i}|\leq A\bigr), \qquad X_{i}^{\prime\prime}=X_{i} I\bigl(|X_{i}|>A\bigr), \\& X_{i}^{*}=X_{i} I\bigl(|X_{i}|\leq A\bigr)-AI(X_{i}< -A)+AI(X_{i}>A) \end{aligned}$$

and

$$\begin{aligned}& X^{\prime}=X I\bigl(|X|\leq A\bigr), \qquad X^{\prime\prime}=X I\bigl(|X|>A\bigr), \\& X^{*}=X I\bigl(|X|\leq A\bigr)-AI(X< -A)+AI(X>A) . \end{aligned}$$

Then, for some constant \(C>0\),

(i) if \(E|X|^{r}<\infty\), then \(n^{-1}\sum_{i=1}^{n} E|X_{i}|^{r}\leq CE|X|^{r}\);

(ii) \(n^{-1}\sum_{i=1}^{n} E|X_{i}^{\prime}|^{r}\leq C(E|X^{\prime}|^{r}+A^{r} P(|X|>A))\) for all \(A>0\);

(iii) \(n^{-1}\sum_{i=1}^{n} E|X_{i}^{\prime\prime}|^{r}\leq CE|X^{\prime\prime}|^{r}\) for all \(A>0\);

(iv) \(n^{-1}\sum_{i=1}^{n} E|X_{i}^{*}|^{r}\leq CE|X^{*}|^{r}\) for all \(A>0\).
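For orientation, part (i) follows directly from (1.2) and the tail-integral formula for moments: interchanging sum and integral by Tonelli's theorem,

$$\frac{1}{n}\sum_{i=1}^{n} E|X_{i}|^{r}= \int_{0}^{\infty}r x^{r-1} \Biggl(\frac{1}{n}\sum_{i=1}^{n} P\bigl(|X_{i}|>x\bigr) \Biggr)\,dx\leq c \int_{0}^{\infty}r x^{r-1}P\bigl(|X|>x\bigr)\,dx= cE|X|^{r}. $$

Parts (ii)-(iv) follow in the same way, applied to the tails of the truncated variables.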

The following result follows from Theorem 2.1 of Kuczmaszewska [1].

Lemma 2.3

(Kuczmaszewska [1])

Let \(\alpha p>1\), \(p>0\), and \(\alpha>\frac{1}{2}\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq 1\), and X be a random variable possibly defined on a different space satisfying condition (1.2) for all \(x>0\) and \(n\geq1\). Then

$$ E|X|^{p}< \infty $$
(2.3)

implies

$$ \sum_{n=1}^{\infty}n^{\alpha p-2}P\Biggl( \max_{1\leq j\leq n}\Biggl|\sum_{i=1}^{j} X_{i}\Biggr|>\epsilon n^{\alpha}\Biggr)< \infty \quad\textit{for all } \epsilon>0. $$
(2.4)

Lemma 2.4

Let \(\alpha p>1\), \(\alpha>\frac{1}{2}\), and \(p>0\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq1\), and X be a random variable possibly defined on a different space satisfying condition (1.2) for all \(x>0\) and \(n\geq1\). Then (2.3) implies

$$ \sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|>u\Biggr)\,du< \infty. $$
(2.5)
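Note that, for any random variable Z and any \(a>0\), we have \(\int_{a}^{\infty}P(Z>u)\,du=E(Z-a)^{+}\). Hence the integral in (2.5) equals

$$E\Biggl(\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|-n^{\alpha}\Biggr)^{+}, $$

which is the form in which Lemma 2.4 is used in the proof of Theorem 3.1.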

Proof

In the case \(0< p<1\), let, for all \(u>0\),

$$X_{i}=X_{i}I\bigl[|X_{i}|\leq u\bigr]+X_{i}I\bigl[|X_{i}|> u\bigr]=X_{ui}+X_{ui}^{\prime} $$

and

$$S_{n}=\sum_{i=1}^{n} X_{i}I\bigl[|X_{i}|\leq u\bigr]+\sum_{i=1}^{n} X_{i}I\bigl[|X_{i}|>u\bigr]=\sum_{i=1}^{n} X_{ui}+\sum_{i=1}^{n} X_{ui}^{\prime}. $$

Then we obtain

$$\begin{aligned} &\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Bigl(\max_{1\leq j\leq n}|S_{j}|>u \Bigr)\,du \\ &\quad\leq\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq j\leq n}\Biggl| \sum_{i=1}^{j} X_{ui}\Biggr|> \frac{u}{2}\Biggr)\,du \\ &\qquad{} +\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq j\leq n}\Biggl| \sum_{i=1}^{j} X_{ui}^{\prime}\Biggr|> \frac{u}{2}\Biggr)\,du \\ &\quad:=I+J. \end{aligned}$$
(2.6)

For J, take \(q=p\) (\(0< p<1\)). By the Markov inequality, Lemma 2.2(iii), and \(E|X|^{p}<\infty\) we have that

$$\begin{aligned} J \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{q}{2}}E\Biggl(\max _{1\leq j\leq n}\Biggl|\sum_{i=1}^{j} X_{ui}^{\prime}\Biggr|\Biggr)^{\frac {q}{2}}\,du \quad \mbox{by Markov inequality} \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{q}{2}}\sum _{i=1}^{n} E\bigl|X_{ui}^{\prime}\bigr|^{\frac{q}{2}}\,du\quad \biggl(\frac {q}{2}< \frac{1}{2}\biggr) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{q}{2}}\sum _{i=1}^{n} E|X_{i}|^{\frac{q}{2}}I\bigl(|X_{i}|>u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{q}{2}}E|X|^{\frac{q}{2}}I\bigl(|X|>u\bigr)\,du \quad \mbox{by Lemma}~2.2(\mathrm{iii}) \\ =&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}} u^{-\frac{p}{2}}E|X|^{\frac {p}{2}}I\bigl(|X|>u\bigr)\,du\quad (q=p) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}} m^{-\frac{\alpha p}{2}}E|X|^{\frac {p}{2}}I\bigl(|X|>u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}} m^{-\frac{\alpha p}{2}}E|X|^{\frac {p}{2}}I \bigl(|X|>m^{\alpha}\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}m^{-\frac{\alpha p}{2}+\alpha-1}E|X|^{\frac{p}{2}}I \bigl(|X|>m^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{-\frac{\alpha p}{2}+\alpha-1}E|X|^{\frac {p}{2}}I \bigl(|X|>m^{\alpha}\bigr)\sum_{n=1}^{m} n^{\alpha p-1-\alpha} \\ \leq&C\sum_{m=1}^{\infty}m^{\frac{\alpha p}{2}-1} E|X|^{\frac {p}{2}}I\bigl(|X|>m^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{\frac{\alpha p}{2}-1}\sum _{n=m}^{\infty}E|X|^{\frac{p}{2}}I \bigl(n^{\alpha}< |X|\leq(n+1)^{\alpha}\bigr) \\ \leq&C\sum_{n=1}^{\infty}E|X|^{\frac{p}{2}}I \bigl(n^{\alpha}< |X|\leq (n+1)^{\alpha}\bigr)\sum _{m=1}^{n} m^{\frac{\alpha p}{2}-1} \\ \leq&C\sum_{n=1}^{\infty}n^{\frac{\alpha p}{2}} E|X|^{\frac {p}{2}}I\bigl(n^{\alpha}< |X|\leq(n+1)^{\alpha}\bigr) \\ \leq&C E|X|^{p}< \infty. \end{aligned}$$
(2.7)

Similarly, for I, take \(q=p\) (\(0< p<1\)). By the Markov inequality and Lemma 2.2(ii) we have that

$$\begin{aligned} I \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{q}{2}}E\Biggl(\max _{1\leq j\leq n}\Biggl|\sum_{i=1}^{j} X_{ui}\Biggr|\Biggr)^{\frac {q}{2}}\,du \quad \mbox{by Markov inequality} \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{p}{2}}\sum _{i=1}^{n} E|X_{ui}|^{\frac{p}{2}}\,du \quad \biggl(q=p \mbox{ and } \frac{p}{2}< \frac{1}{2}\biggr) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{p}{2}}\sum _{i=1}^{n} \bigl\{ E|X_{i}|^{\frac{p}{2}}I\bigl(|X_{i}| \leq u\bigr)\bigr\} \,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{p}{2}}E|X|^{\frac{p}{2}}I\bigl(|X| \leq u\bigr)\,du \\ &{} +C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}P\bigl(|X|>u\bigr)\,du \quad\mbox{by Lemma}~2.2(\mathrm{ii}) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{p}{2}}E|X|^{\frac{p}{2}}I\bigl(|X| \leq u\bigr)\,du \\ &{}+C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-\frac{p}{2}}E|X|^{\frac{p}{2}}I\bigl(|X|>u\bigr)\,du \\ =&I_{1}+I_{2}. \end{aligned}$$

As in the proof of (2.7), we obtain \(I_{2}<\infty\). It remains to prove that \(I_{1}<\infty\):

$$\begin{aligned} I_{1} =&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum_{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}}u^{-\frac{p}{2}}E|X|^{\frac{p}{2}}I\bigl(|X| \leq u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}m^{-\frac{\alpha p}{2}+\alpha-1} E|X|^{\frac{p}{2}}I\bigl(|X|\leq (m+1)^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{-\frac{\alpha p}{2}+\alpha-1} E|X|^{\frac {p}{2}}I\bigl(|X|\leq(m+1)^{\alpha}\bigr)\sum _{n=1}^{m} n^{\alpha p-1-\alpha} \\ \leq&C\sum_{m=1}^{\infty}m^{\frac{\alpha p}{2}-1}E|X|^{\frac {p}{2}}I \bigl(|X|\leq(m+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{\frac{\alpha p}{2}-1} \sum_{n\leq m} E|X|^{\frac{p}{2}}I\bigl(n^{\alpha}< |X| \leq(n+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{\frac{\alpha p}{2}-1} \sum_{n\leq m}n^{\frac {\alpha p}{2}}P\bigl(n^{\alpha}< |X| \leq(n+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{\alpha p}P \bigl(m^{\alpha}< |X|\leq(m+1)^{\alpha}\bigr) \leq CE|X|^{p}< \infty. \end{aligned}$$
(2.8)

Hence, from (2.6)-(2.8) the result (2.5) follows in the case \(0< p<1\).

In the case \(1\leq p < 2\), let \(Y_{ui}=X_{i}I(|X_{i}|\leq u)-uI(X_{i} < -u)+uI(X_{i}>u)\) for all \(u>0\), and \(Y_{ui}^{\prime}=X_{i}-Y_{ui}\), \(i\geq1\). Then we have

$$\begin{aligned} &\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|>u\Biggr)\,du \\ &\quad\leq\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} (Y_{ui}-EY_{ui})\Biggr|> \frac{u}{2}\Biggr)\,du \\ &\qquad{} +\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k}\bigl(Y_{ui}^{\prime}-EY_{ui}^{\prime} \bigr)\Biggr|>\frac {u}{2}\Biggr)\,du \quad\mbox{since }EX_{n}=0 \\ &\quad:=I^{\prime}+J^{\prime}. \end{aligned}$$
(2.9)

Note that \(\{Y_{ui}-EY_{ui}\}\) and \(\{Y_{ui}^{\prime}-EY_{ui}^{\prime}\}\) are sequences of negatively associated random variables, since \(Y_{ui}\) and \(Y_{ui}^{\prime}\) are nondecreasing functions of \(X_{i}\).

For \(J^{\prime}\), take q such that \(1\leq p< q\leq2\). By the fact that \(|Y_{ui}^{\prime}|\leq|X_{i}|I(|X_{i}|>u)\), the Markov inequality, Lemma 2.1, (2.1), Lemma 2.2(iii), (2.3), and the \(C_{r}\)-inequality we have

$$\begin{aligned} J^{\prime} \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q} \sum _{i=1}^{n} E\bigl|Y_{ui}^{\prime}-EY_{ui}^{\prime}\bigr|^{q}\,du \quad\mbox{by}~(2.1) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\sum _{i=1}^{n} E\bigl|Y_{ui}^{\prime}\bigr|^{q}\,du \quad\mbox{(by the $C_{r}$-inequality)} \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\sum _{i=1}^{n} E|X_{i}|^{q}I\bigl(|X_{i}|>u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E|X|^{q}I\bigl(|X|>u\bigr)\,du \quad\mbox{Lemma}~2.2(\mathrm{iii}) \\ =&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}} u^{-q}E|X|^{q}I\bigl(|X|>u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}m^{-\alpha q+\alpha-1}E|X|^{q}I \bigl(|X|>m^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{-\alpha q+\alpha-1}E|X|^{q}I \bigl(|X|>m^{\alpha}\bigr)\sum_{n=1}^{m} n^{\alpha p-1-\alpha} \\ =&C\sum_{m=1}^{\infty}m^{-\alpha(q-p)-1} E|X|^{q}I\bigl(|X|>m^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{-\alpha(q-p)-1}\sum _{n=m}^{\infty}E|X|^{q}I \bigl(n^{\alpha}< |X|\leq(n+1)^{\alpha}\bigr) \\ \leq&C\sum_{n=1}^{\infty}E|X|^{q}I \bigl(n^{\alpha}< |X|\leq(n+1)^{\alpha}\bigr)\sum _{m=1}^{n} m^{-\alpha(q-p)-1} \\ \leq&C\sum_{n=1}^{\infty}E|X|^{p}I \bigl(n^{\alpha}< |X|\leq(n+1)^{\alpha}\bigr) \\ \leq&C E|X|^{p}< \infty. \end{aligned}$$
(2.10)

For \(I^{\prime}\), take q such that \(1\leq p< q\leq2\). By the Markov inequality, (1.2), Lemma 2.1, (2.1), the \(C_{r}\)-inequality, and Lemma 2.2(iv) we have

$$\begin{aligned} I^{\prime} \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E\Biggl(\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k}(Y_{ui}-EY_{ui})\Biggr|\Biggr)^{q}\,du \quad\mbox{by the Markov inequality} \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\sum _{i=1}^{n} E|Y_{ui}-EY_{ui}|^{q} \,du \quad\mbox{by}~(2.1) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\sum _{i=1}^{n} E|Y_{ui}|^{q}\,du \quad \mbox{by the $C_{r}$-inequality} \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E|X|^{q}I\bigl(|X| \leq u\bigr)\,du \\ &{} +C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E|X|^{q}I\bigl(|X|>u\bigr)\,du \quad \mbox{by Lemma}~2.2(\mathrm{iv}) \\ =&I_{1}^{\prime}+I_{2}^{\prime}. \end{aligned}$$

As in the calculation of \(J^{\prime}\), we obtain \(I_{2}^{\prime}<\infty\) (see (2.10)). It remains to prove that \(I_{1}^{\prime}<\infty\). Taking q such that \(1\leq p< q\leq2\), we have

$$\begin{aligned} I_{1}^{\prime} =&C\sum _{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int _{m^{\alpha}}^{(m+1)^{\alpha}}u^{-q}E|X|^{q}I\bigl(|X| \leq u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}m^{-\alpha q+\alpha-1} E|X|^{q}I\bigl(|X|\leq(m+1)^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{-\alpha q+\alpha-1} E|X|^{q}I\bigl(|X|\leq (m+1)^{\alpha}\bigr)\sum _{n=1}^{m} n^{\alpha p-1-\alpha} \\ \leq&C\sum_{m=1}^{\infty}m^{-\alpha q+\alpha p-1}E|X|^{q}I \bigl(|X|\leq (m+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{-\alpha q+\alpha p-1}\sum _{n\leq m} E|X|^{q}I\bigl(n^{\alpha}< |X| \leq(n+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{\alpha p}P \bigl(m^{\alpha}< |X|\leq(m+1)^{\alpha}\bigr) \\ =&CE|X|^{p}< \infty. \end{aligned}$$
(2.11)

Hence, from (2.9)-(2.11) the result (2.5) follows in the case \(1\leq p< 2\).

In the case \(p\geq2\), we again use the decomposition of (2.9) into \(I^{\prime}\) and \(J^{\prime}\).

For \(I^{\prime}\), by the Markov inequality, the \(C_{r}\)-inequality, and (2.2) of Lemma 2.1 we have that, for \(q>2\),

$$\begin{aligned} I^{\prime} \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E\Biggl\{ \max _{1\leq j\leq n}\Biggl|\sum_{i=1}^{j}(Y_{ui}-EY_{ui})\Biggr| \Biggr\} ^{q} \,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\Biggl\{ \sum _{i=1}^{n}E|Y_{ui}-EY_{ui}|^{q} +\Biggl(\sum_{i=1}^{n}E|Y_{ui}-EY_{ui}|^{2} \Biggr)^{\frac{q}{2}}\Biggr\} \,du \quad\mbox{by}~(2.2) \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\sum _{i=1}^{n}E|Y_{ui}|^{q} \,du \\ &{} +C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\Biggl(\sum _{i=1}^{n} EY_{ui}^{2} \Biggr)^{\frac{q}{2}}\,du \quad\mbox{by the $C_{r}$-inequality} \\ =&I_{3}^{\prime}+I_{4}^{\prime}. \end{aligned}$$
(2.12)

We will consider \(I_{3}^{\prime}\) and \(I_{4}^{\prime}\) as follows.

Note that \(\alpha>\frac{1}{2}\), \(\alpha p>1\), and \(p\geq2\). Take \(q>\max(p, \frac{\alpha p-1}{\alpha-\frac{1}{2}})\), which implies that \(\alpha p-2-\alpha q+\frac{q}{2}<-1\).
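Indeed, since \(\alpha>\frac{1}{2}\),

$$q>\frac{\alpha p-1}{\alpha-\frac{1}{2}} \quad\Longleftrightarrow\quad q\biggl(\alpha-\frac{1}{2}\biggr)>\alpha p-1 \quad\Longleftrightarrow\quad \alpha p-2-\alpha q+\frac{q}{2}< -1, $$

and this last inequality guarantees the convergence of the series bounding \(I_{4}^{\prime}\) below.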

By Lemma 2.2(iv) we have

$$\begin{aligned} I_{3}^{\prime} \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E|X|^{q}I\bigl(|X| \leq u\bigr)\,du \\ &{}+C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}E|X|^{q}I\bigl(|X|> u\bigr)\,du \\ \leq&I_{31}^{\prime}+I_{32}^{\prime}. \end{aligned}$$

As in the calculation of \(J^{\prime}\), we obtain \(I_{32}^{\prime}<\infty\) (see (2.10)). It remains to prove that \(I_{31}^{\prime}<\infty\):

$$\begin{aligned} I_{31}^{\prime} \leq&C\sum _{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}\int_{m^{\alpha}}^{(m+1)^{\alpha}} u^{-q}E|X|^{q}I\bigl(|X| \leq u\bigr)\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-1-\alpha}\sum _{m=n}^{\infty}m^{\alpha-1-\alpha q}E|X|^{q}I \bigl(|X|\leq(m+1)^{\alpha}\bigr) \\ =&C\sum_{m=1}^{\infty}m^{\alpha-1-\alpha q}E|X|^{q}I \bigl(|X|\leq(m+1)^{\alpha}\bigr)\sum_{n=1}^{m} n^{\alpha p-1-\alpha} \\ \leq&C\sum_{m=1}^{\infty}m^{\alpha p-1-\alpha q}E|X|^{q}I \bigl(|X|\leq (m+1)^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{\alpha p-1-\alpha q}E|X|^{q}I \bigl(m^{\alpha}< |X|\leq(m+1)^{\alpha}\bigr) \\ &{} +C\sum_{m=1}^{\infty}m^{\alpha p-1-\alpha q}E|X|^{q}I \bigl(|X|\leq m^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}m^{-1} E|X|^{p}I\bigl(m^{\alpha}< |X|\leq(m+1)^{\alpha}\bigr) \\ &{} +C\sum_{m=1}^{\infty}m^{-\alpha(q-p)-1}\sum _{j=1}^{m} j^{\alpha q}P \bigl((j-1)^{\alpha}< |X|\leq j^{\alpha}\bigr) \\ \leq&C\sum_{m=1}^{\infty}E|X|^{p}I \bigl(m^{\alpha}< |X|\leq(m+1)^{\alpha}\bigr) \\ &{} +C\sum_{j=1}^{\infty}j^{\alpha q}P \bigl((j-1)^{\alpha}< |X|\leq j^{\alpha}\bigr)\sum _{m=j}^{\infty}m^{-\alpha(q-p)-1} \\ \leq&CE|X|^{p}+C\sum_{j=1}^{\infty}j^{\alpha p}P\bigl((j-1)^{\alpha}< |X|\leq j^{\alpha}\bigr) \\ \leq&CE|X|^{p}< \infty, \end{aligned}$$
(2.13)

which yields \(I_{3}^{\prime}<\infty\).

Since \(E|X|^{p}<\infty\) with \(p\geq2\) implies \(EX^{2}<\infty\), we have:

$$\begin{aligned} I_{4}^{\prime} \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\Biggl(\sum _{i=1}^{n}EX_{i}^{2} \Biggr)^{\frac{q}{2}}\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}u^{-q}\bigl(n EX^{2}\bigr)^{\frac{q}{2}}\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha+\frac{q}{2}} \int_{n^{\alpha}}^{\infty}u^{-q}\,du \\ \leq&C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha q+\frac{q}{2}}< \infty. \end{aligned}$$

Thus, the proof of Lemma 2.4 is complete. □

3 Main results

Theorem 3.1

Let \(\alpha p> 1\), \(p>0\), and \(\alpha >\frac{1}{2}\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq1\), and X be a random variable satisfying condition (1.2) and \(E|X|^{p}<\infty\). Then, for all \(\epsilon>0\),

$$ \sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Biggl( \max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}\Biggr)^{+}< \infty, $$
(3.1)

where \(a^{+}=\max\{a, 0\}\).

Proof

To prove (3.1), we use Lemmas 2.3 and 2.4 as follows:

$$\begin{aligned} &\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Biggl(\max_{1\leq k\leq n}\Biggl|\sum _{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}\Biggr)^{+} \\ &\quad=\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{\infty}P\Biggl(\Biggl(\max _{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}\Biggr)^{+}>u\Biggr)\,du \\ &\quad=\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}>u\Biggr)\,du \\ &\quad=\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{n^{\alpha}} P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}>u\Biggr)\,du \\ &\qquad{} +\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}>u\Biggr)\,du \\ &\quad\leq\sum_{n=1}^{\infty}n^{\alpha p-2} P\Biggl(\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k} X_{i}\Biggr|>\epsilon n^{\alpha}\Biggr) \\ &\qquad{} +\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{n^{\alpha}}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|>u\Biggr)\,du. \end{aligned}$$
(3.2)

By Lemma 2.3 the first term of (3.2) is finite, and by Lemma 2.4 the second term of (3.2) is finite. Hence, the proof of the theorem is complete. □
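The conclusion (3.1) can be illustrated numerically. The following minimal Monte Carlo sketch (an illustration only, not part of the argument) assumes NumPy and uses i.i.d. centered normal variables, which form a trivially negatively associated sequence satisfying (1.2) with \(c=1\) and \(E|X|^{p}<\infty\) for every p; the parameter values below are arbitrary choices with \(\alpha p>1\) and \(\alpha>\frac{1}{2}\).

    # Monte Carlo estimate of the summand n^{alpha p-2-alpha} E(max_k |S_k| - eps n^alpha)^+ in (3.1).
    import numpy as np

    rng = np.random.default_rng(1)
    alpha, p, eps, reps = 0.75, 2.0, 1.0, 5_000

    for n in (50, 200, 800):
        X = rng.standard_normal((reps, n))                  # i.i.d. N(0,1) rows: EX = 0, all moments finite
        max_abs = np.abs(np.cumsum(X, axis=1)).max(axis=1)  # max_{1<=k<=n} |S_k| for each simulated path
        summand = n ** (alpha * p - 2 - alpha) * np.mean(np.maximum(max_abs - eps * n ** alpha, 0.0))
        print(n, summand)                                   # the summands shrink rapidly as n grows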

Theorem 3.2

Let \(\alpha p> 1\), \(p>0\), and \(\alpha >\frac{1}{2}\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq1\), and X be a random variable possibly defined on a different space satisfying the condition (1.2). Then (3.1) implies (2.4).

Proof

It is easy to see that

$$\begin{aligned} &\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E\Biggl(\max_{1\leq k\leq n}\Biggl|\sum _{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}\Biggr)^{+} \\ &\quad\geq\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|-\epsilon n^{\alpha}>u\Biggr)\,du \\ &\quad\geq\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{\epsilon n^{\alpha}} P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k} X_{i}\Biggr|>\epsilon n^{\alpha}+u\Biggr)\,du \\ &\quad\geq\epsilon\sum_{n=1}^{\infty}n^{\alpha p-2} P\Biggl(\max_{1\leq k\leq n}\Biggl|\sum _{i=1}^{k} X_{i}\Biggr|>2\epsilon n^{\alpha}\Biggr). \end{aligned}$$
(3.3)

Hence, by (3.3), (3.1) implies (2.4). The proof of the theorem is complete. □

Remark

According to Ko [14], results on complete moment convergence for negatively associated random variables had been established only in the case \(p>1\). In this paper, the result is obtained for all \(p>0\).

Theorem 3.3

Let \(\alpha p>1\), \(p>0\), and \(\alpha>\frac {1}{2}\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq1\), and X be a random variable satisfying condition (1.2) and \(E|X|^{p}<\infty\). Then, for all \(\epsilon>0\),

$$ \sum_{n=1}^{\infty}n^{\alpha p-2}E\Biggl\{ \sup_{k\geq n}\Biggl|k^{-\alpha}\sum_{j=1}^{k} X_{j}\Biggr|-\epsilon\Biggr\} ^{+}< \infty. $$
(3.4)
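The proof reduces (3.4) to (3.1) by grouping the indices into dyadic blocks; in particular, it uses the inclusion, valid for every \(j\geq1\) and \(u>0\),

$$\Biggl\{ \sup_{k\geq2^{j-1}}\Biggl|k^{-\alpha}\sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u\Biggr\} \subseteq\bigcup_{m=j}^{\infty}\Biggl\{ \max_{2^{m-1}\leq k\leq2^{m}}\Biggl|k^{-\alpha}\sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u\Biggr\} , $$

so that the supremum can be replaced by maxima over dyadic blocks.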

Proof

By (3.1) we have

$$\begin{aligned} &\sum_{n=1}^{\infty}n^{\alpha p-2} E\Biggl\{ \sup_{k\geq n}\Biggl|k^{-\alpha}\sum_{i=1}^{k}X_{i}\Biggr|- \epsilon\Biggr\} ^{+} \\ &\quad=\sum_{n=1}^{\infty}n^{\alpha p-2} \int_{0}^{\infty}P\Biggl(\sup_{k\geq n}\Biggl|k^{-\alpha} \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \\ &\quad=\sum_{j=1}^{\infty}\sum _{n=2^{j-1}}^{2^{j}-1}n^{\alpha p-2} \int _{0}^{\infty}P\Biggl(\sup_{k\geq n}\Biggl|k^{-\alpha} \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \\ &\quad\leq C\sum_{j=1}^{\infty}\int_{0}^{\infty}P\Biggl(\sup_{k\geq2^{j-1}}\Biggl|k^{-\alpha } \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \sum_{n=2^{j-1}}^{2^{j}-1}n^{\alpha p-2} \\ &\quad\leq C\sum_{j=1}^{\infty}2^{j(\alpha p-1)} \int_{0}^{\infty}P\Biggl(\sup_{k\geq 2^{j-1}}\Biggl|k^{-\alpha} \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \\ &\quad\leq C\sum_{j=1}^{\infty}2^{j(\alpha p-1)} \sum_{m=j}^{\infty}\int _{0}^{\infty}P\Biggl(\max_{2^{m-1}\leq k\leq2^{m}}\Biggl|k^{-\alpha} \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \\ &\quad\leq C\sum_{m=1}^{\infty}\int_{0}^{\infty}P\Biggl(\max_{2^{m-1}\leq k\leq 2^{m}}\Biggl|k^{-\alpha} \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon+u \Biggr)\,du \sum_{j=1}^{m} 2^{j(\alpha p-1)} \\ &\quad\leq C\sum_{m=1}^{\infty}2^{m(\alpha p-1)} \int_{0}^{\infty}P\Biggl\{ \max_{2^{m-1}\leq k\leq2^{m}}\Biggl| \sum_{i=1}^{k}X_{i}\Biggr|>( \epsilon+u)2^{(m-1)\alpha}\Biggr\} \,du \\ &\qquad \bigl(\mbox{letting } y=2^{(m-1)\alpha}u\bigr) \\ &\quad\leq C\sum_{m=1}^{\infty}2^{m(\alpha p-1-\alpha)} \int_{0}^{\infty}P\Biggl(\max_{1\leq k\leq2^{m}}\Biggl| \sum_{i=1}^{k}X_{i}\Biggr|> \epsilon2^{(m-1)\alpha}+y\Biggr)\,dy \\ &\quad\leq C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha} \int_{0}^{\infty}P\Biggl(\max_{1\leq k\leq n}\Biggl| \sum_{i=1}^{k}X_{i}\Biggr|>\epsilon n^{\alpha}2^{-\alpha}+y\Biggr)\,dy \\ &\quad=C\sum_{n=1}^{\infty}n^{\alpha p-2-\alpha}E \Biggl(\max_{1\leq k\leq n}\Biggl|\sum_{i=1}^{k}X_{i}\Biggr|- \epsilon^{\prime} n^{\alpha}\Biggr)^{+}< \infty, \end{aligned}$$

where \(\epsilon^{\prime}=\epsilon2^{-\alpha}\). Hence, the proof of (3.4) is completed. □

Theorem 3.4

Let \(\alpha>\frac{1}{2}\), \(p>0\), and \(\alpha p>1\). Let \(\{X_{n}, n\geq1\}\) be a sequence of negatively associated random variables with \(EX_{n}=0\) for all \(n\geq1\), and X be a random variable satisfying condition (1.2) with \(E|X|^{p}<\infty\). Then

$$\sum_{n=1}^{\infty}n^{\alpha p-2}P\biggl( \sup_{j\geq n}\biggl|\frac{S_{j}}{j^{\alpha}}\biggr|\geq\epsilon\biggr)< \infty $$

for all \(\epsilon>0\), where \(S_{j}=\sum_{i=1}^{j}X_{i}\).

Proof

The proof proceeds along the lines of the proof of Theorem 12.1 in Gut [15] and is therefore omitted. □
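We remark that the statement also follows directly from Theorem 3.3 by the argument used in the proof of Theorem 3.2: for every \(\epsilon>0\),

$$E\Biggl\{ \sup_{k\geq n}\Biggl|k^{-\alpha}\sum_{j=1}^{k} X_{j}\Biggr|-\epsilon\Biggr\} ^{+}\geq \int_{0}^{\epsilon}P\Biggl(\sup_{k\geq n}\Biggl|k^{-\alpha}\sum_{j=1}^{k} X_{j}\Biggr|>\epsilon+u\Biggr)\,du\geq\epsilon P\Biggl(\sup_{k\geq n}\Biggl|k^{-\alpha}\sum_{j=1}^{k} X_{j}\Biggr|\geq2\epsilon\Biggr), $$

so multiplying by \(n^{\alpha p-2}\), summing over n, and applying (3.4) gives the conclusion, since \(\epsilon>0\) is arbitrary.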