1 Introduction

In this paper we are interested in the complete convergence of a sequence of random variables which satisfies the Rosenthal type inequality. First let us recall some definitions and well-known results.

1.1 Complete convergence

The following concept of complete convergence of a sequence of random variables, which plays an important role in the limit theory of probability, was first introduced by Hsu and Robbins [3]. A sequence of random variables \(\{X_{n}, n \geq1\}\) is said to converge completely to a constant C (written \(X_{n}\rightarrow C\) completely) if, for any \(\varepsilon> 0\),

$$\sum_{n=1}^{\infty}P\bigl(\vert X_{n}-C\vert >\varepsilon\bigr)< \infty. $$

From the Borel-Cantelli lemma, this implies that \(X_{n}\rightarrow C\) almost surely (a.s.). For the case of i.i.d. random variables, Hsu and Robbins [3] proved that the sequence of arithmetic means of the random variables converges completely to the expected value if the variance of the summands is finite. Somewhat later, Erdös [4] proved the converse. These results are summarized as follows.

Hsu-Robbins-Erdös strong law. Let \(\{X_{n}, n\ge1\}\) be a sequence of i.i.d. random variables with mean zero and set \(S_{n}=\sum_{i=1}^{n} X_{i}\), \(n\ge1\). Then \(E X_{1}^{2}<\infty\) is equivalent to the condition that

$$\sum_{n=1}^{\infty}P\bigl(\vert S_{n}\vert >\varepsilon n\bigr)< \infty\quad \text{for all } \varepsilon>0. $$
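The phenomenon can be seen concretely for Rademacher summands: since \(E X_{1}^{2}=1<\infty\), the tail probabilities \(P(|S_{n}|>\varepsilon n)\) must decay fast enough to be summable. The following sketch (our own illustration, not part of the original results; the function name `tail_prob` is ours) computes these probabilities exactly from the binomial distribution:

```python
from math import comb

def tail_prob(n: int, eps: float) -> float:
    """Exact P(|S_n| > eps*n) for S_n a sum of n i.i.d. Rademacher (+/-1) signs.

    S_n = 2*B - n with B ~ Binomial(n, 1/2), so |S_n| > eps*n
    holds iff |B - n/2| > eps*n/2.
    """
    lo, hi = n * (1 - eps) / 2, n * (1 + eps) / 2
    return sum(comb(n, k) for k in range(n + 1) if k < lo or k > hi) / 2 ** n

# E X_1^2 = 1 < infinity, so sum_n P(|S_n| > eps*n) < infinity; indeed the
# exact tail probabilities decay rapidly (geometrically, by Hoeffding's bound).
probs = [tail_prob(n, 0.2) for n in (10, 50, 100, 200)]
assert all(a > b for a, b in zip(probs, probs[1:]))
```

Hoeffding's inequality gives \(P(|S_{n}|>\varepsilon n)\le2e^{-n\varepsilon^{2}/2}\) here, which makes the summability explicit.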

The Hsu-Robbins-Erdös strong law is a fundamental theorem in probability theory and has been intensively investigated in several directions by many authors over the past decades. One of the most important extensions is the Baum-Katz [5] strong law of large numbers.

Baum and Katz strong law. Let \(\alpha p\ge1\), \(p> 2\), and let \(\{ X_{n}\}\) be a sequence of i.i.d. random variables and \(E|X_{1}|^{p}<\infty\). If \(\frac{1}{2}<\alpha\le1\), assume that \(E X_{1}=0\). Then

$$\sum_{n=1}^{\infty}n^{\alpha p-2}P \Biggl( \max_{1\le j\le n}\Biggl\vert \sum_{i=1}^{j} X_{i}\Biggr\vert >\varepsilon n^{\alpha} \Biggr)< \infty \quad \text{for all } \varepsilon>0. $$

The Baum and Katz strong law bridges the integrability of summands and the rate of convergence in the Marcinkiewicz-Zygmund strong law of large numbers.

It is well known that the analysis of weighted sums plays an important role in statistics, for example in jackknife estimation and nonparametric regression estimation. Many authors have considered the complete convergence of weighted sums of random variables. Thrum [6] studied the almost sure convergence of weighted sums of i.i.d. random variables; Li et al. [7] obtained complete convergence of weighted sums without the identical distribution assumption. Liang and Su [8] extended the results of Thrum [6] and Li et al. [7] and proved the complete convergence of weighted sums of negatively associated sequences. For further literature on the complete convergence of weighted sums, the reader may refer to Xue et al. [9] for NSD sequences, Gan and Chen [10] for NOD sequences, and so on.

1.2 Rosenthal type inequality

The Rosenthal type inequality is expressed as follows: given \(r \geq2\), a sequence of random variables \(\{Z_{n}, n \geq1\}\) is said to satisfy the Rosenthal type inequality for r if there exists a positive constant C such that, for every \(n \geq1\),

$$ E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}(Z_{i}-EZ_{i})\Biggr\vert ^{r} \Biggr)\leq C \Biggl[\sum_{i=1}^{n}E|Z_{i}-EZ_{i}|^{r}+ \Biggl(\sum_{i=1}^{n}E(Z_{i}-EZ_{i})^{2} \Biggr)^{\frac{r}{2}} \Biggr]. $$
(1.1)
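For independent, mean-zero summands and \(r=2\), inequality (1.1) follows from Doob's \(L^{2}\) maximal inequality with \(C=4\). As a sanity check (our own illustration; the constant \(C=4\) is an assumption justified by Doob's inequality, not a constant from this paper), the case of \(n=8\) Rademacher signs can be verified by exact enumeration of all \(2^{8}\) outcomes:

```python
from itertools import product

n = 8
C = 4.0  # assumed constant, justified by Doob's L^2 maximal inequality

# Z_i = +/-1 with probability 1/2, so E Z_i = 0 and E Z_i^2 = E|Z_i|^2 = 1.
lhs = 0.0
for signs in product((-1, 1), repeat=n):
    s, running_max = 0, 0
    for z in signs:
        s += z
        running_max = max(running_max, abs(s))
    lhs += running_max ** 2
lhs /= 2 ** n  # exact E max_{1<=j<=n} |S_j|^2

# right-hand side of (1.1) with r = 2: C [ sum_i E|Z_i|^2 + (sum_i E Z_i^2)^{r/2} ]
rhs = C * (n + n)
assert 0 < lhs <= rhs
```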

Many dependent sequences satisfy the Rosenthal type inequality. We refer to Shao [11] for a negatively associated sequence; Utev and Peligrad [12] for a ρ̃-mixing sequence; Shen [13] and Stout [14] for an extended negatively dependent sequence (END); Hu [15] and Wang et al. [16] for a negatively superadditive dependent sequence (NSD); Asadian et al. [17] and Wu [18] for a negatively orthant dependent sequence (NOD); Yuan and An [19] for an asymptotically almost negatively associated sequence (AANA).

The concept of stochastic domination is presented as follows.

Definition 1.1

A sequence \(\{X_{n}, n \geq1\}\) of random variables is said to be stochastically dominated by a random variable X if there exists a positive constant C such that

$$P\bigl(\vert X_{n}\vert >x\bigr)\leq C P\bigl(\vert X\vert >x \bigr) $$

for all \(x\geq0\) and \(n\geq1\).
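As a toy example (our own, purely illustrative), \(X_{n}=\xi/n\) with ξ standard uniform is stochastically dominated by \(X=\xi\) with \(C=1\), since \(P(|X_{n}|>x)=P(\xi>nx)\leq P(\xi>x)\) for all \(x\geq0\). This can be checked with exact tail formulas (the function names are ours):

```python
def tail_uniform(x: float) -> float:
    """P(xi > x) for xi ~ Uniform(0, 1)."""
    return min(1.0, max(0.0, 1.0 - x))

def tail_Xn(n: int, x: float) -> float:
    """P(|X_n| > x) for X_n = xi / n, i.e. P(xi > n*x)."""
    return tail_uniform(n * x)

C = 1.0  # the domination constant for this particular example
xs = [i / 20 for i in range(21)]  # grid of x values in [0, 1]
for n in (1, 2, 5, 10):
    for x in xs:
        assert tail_Xn(n, x) <= C * tail_uniform(x)
```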

In the present paper, we shall study the complete convergence of weighted sums of random sequence under the assumption that the random variables satisfy the Rosenthal type inequality. Our main results are stated in Section 2 and the proofs are given in Section 3. Throughout this paper, let C denote a positive constant, which may take different values whenever it appears in different expressions. \(a_{n}=O(b_{n})\) means \(|a_{n}/b_{n}|\leq C\) and \(I(\cdot)\) stands for the indicator function.

2 Main results

Theorem 2.1

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means, which is stochastically dominated by a random variable X with \(E|X|^{p}<\infty\) for some \(p\geq1\). Let \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{b_{n},n \geq1\}\) and \(\{c_{n},n \geq1\}\) be two sequences of positive constants such that, for some \(r\geq\max\{2,p\}\),

$$ \begin{aligned} &\frac{n}{c_{n}^{p}}\rightarrow0 \quad \textit{and} \quad \sum_{n=1}^{k}nb_{n}=O \bigl(c_{k}^{p}\bigr), \\ &\sum_{n=1}^{\infty}\frac{n^{\frac{r}{2}}b_{n}}{c_{n}^{r}}< \infty \quad \textit{and} \quad \sum_{n=k}^{\infty} \frac {nb_{n}}{c_{n}^{r}}=O\bigl(c_{k}^{p-r}\bigr). \end{aligned} $$
(2.1)

Suppose that the Rosenthal type inequality (1.1) holds for \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) with the above r. Then we have

$$ \sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{i}\Biggr\vert >\varepsilon c_{n} \Biggr)< \infty \quad \textit{for all } \varepsilon>0. $$
(2.2)

Remark 2.1

Under the conditions of Theorem 2.1, if we take \(b_{n}=n^{p\alpha-2}\), \(c_{n}=n^{\alpha}\) for \(1/2<\alpha\leq1\) and \(p\alpha>1\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq n^{\alpha})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r>\max \{p, \frac{p\alpha-1}{\alpha-2^{-1}} \},& \text{if } p\geq2, \\ r=2,& \text{if } 1< p< 2; \end{cases} $$

then it is easy to see that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}n^{p\alpha-2}P \Biggl(\max _{1\leq j \leq n} \Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon n^{\alpha} \Biggr)< \infty \quad \textit{for all } \varepsilon>0. $$
(2.3)
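The verification of (2.1) for these choices reduces to elementary power-series estimates; for completeness, here is a sketch for the case \(p\geq2\) (the case \(1< p<2\) with \(r=2\) is analogous):

$$\begin{aligned} &\frac{n}{c_{n}^{p}}=n^{1-p\alpha}\rightarrow0 \quad \text{since } p\alpha>1, \qquad \sum_{n=1}^{k}nb_{n}=\sum_{n=1}^{k}n^{p\alpha-1}=O\bigl(k^{p\alpha}\bigr)=O\bigl(c_{k}^{p}\bigr), \\ &\sum_{n=1}^{\infty}\frac{n^{\frac{r}{2}}b_{n}}{c_{n}^{r}}=\sum_{n=1}^{\infty}n^{p\alpha-2-r(\alpha-\frac{1}{2})}< \infty \quad \text{since } r>\frac{p\alpha-1}{\alpha-2^{-1}}, \\ &\sum_{n=k}^{\infty}\frac{nb_{n}}{c_{n}^{r}}=\sum_{n=k}^{\infty}n^{p\alpha-1-r\alpha}=O\bigl(k^{(p-r)\alpha}\bigr)=O\bigl(c_{k}^{p-r}\bigr) \quad \text{since } r>p. \end{aligned}$$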

Remark 2.2

Under the conditions of Theorem 2.1, let \(b_{n}=n^{s-2}\), \(c_{n}=n^{s/p}\) for \(s>p\), \(p>1\), and let the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq n^{s/p})\) (\(1\leq i\leq n\)) hold for

$$\textstyle\begin{cases} r>\max \{p, \frac{1-s}{\frac{1}{2}-\frac{s}{p}} \},& \text{if } p\geq2, \\ r=2,& \text{if } 1< p< 2; \end{cases} $$

then it is clear that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}n^{s-2}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon n^{s/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.4)

Remark 2.3

Under the conditions of Theorem 2.1, if we take \(b_{n}=\frac {\log n}{n}\), \(c_{n}=(n\log n)^{1/p}\) for some \(1\le p\le2\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq(n\log n)^{1/p})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r=2,& \text{if } 1\le p < 2, \\ r>4,& \text{if } p=2. \end{cases} $$

then it is easy to check that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}\frac{\log n}{n}P \Biggl( \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon(n\log n)^{1/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.5)
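As in Remark 2.1, the conditions in (2.1) can be verified directly for these choices (a sketch; the sums start from \(n=2\) so that \(\log n>0\)):

$$\begin{aligned} &\frac{n}{c_{n}^{p}}=\frac{1}{\log n}\rightarrow0, \qquad \sum_{n=2}^{k}nb_{n}=\sum_{n=2}^{k}\log n=O(k\log k)=O\bigl(c_{k}^{p}\bigr), \\ &\sum_{n=2}^{\infty}\frac{n^{\frac{r}{2}}b_{n}}{c_{n}^{r}}=\sum_{n=2}^{\infty}n^{\frac{r}{2}-1-\frac{r}{p}}(\log n)^{1-\frac{r}{p}}< \infty \quad \text{for } r=2,\ 1\le p< 2 \text{ and for } r>4,\ p=2, \\ &\sum_{n=k}^{\infty}\frac{nb_{n}}{c_{n}^{r}}=\sum_{n=k}^{\infty}n^{-\frac{r}{p}}(\log n)^{1-\frac{r}{p}}=O\bigl((k\log k)^{1-\frac{r}{p}}\bigr)=O\bigl(c_{k}^{p-r}\bigr). \end{aligned}$$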

Remark 2.4

Under the conditions of Theorem 2.1, if we take \(b_{n}=\frac {1}{n\log n}\), \(c_{n}=(n\log\log n)^{1/p}\) for some \(1\le p\le2\), and suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq(n\log\log n)^{1/p})\) (\(1\leq i\leq n\)) holds for

$$\textstyle\begin{cases} r=2,& \text{if } 1\le p < 2, \\ r>2,& \text{if } p=2. \end{cases} $$

then it is easy to check that the conditions in (2.1) hold. Hence we have

$$ \sum_{n=1}^{\infty}\frac{1}{n\log n}P \Biggl( \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon(n\log\log n)^{1/p} \Biggr)< \infty\quad \mbox{for all } \varepsilon>0. $$
(2.6)

Theorem 2.2

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means and \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{c_{n},n \geq1\}\) be a sequence of positive constants with \(c_{n}\uparrow\infty\), and let \(\{\Psi_{n}(t), n \geq1\}\) be a sequence of nonnegative and even functions such that, for each \(n\geq1\), \(\Psi_{n}(t)>0\) for \(t>0\). Suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=2\). In addition, assume that

$$ \frac{\Psi_{n}(|t|)}{|t|} \uparrow ,\qquad \frac{\Psi _{n}(|t|)}{t^{2}} \downarrow \quad \textit{as } |t| \uparrow $$
(2.7)

and

$$ \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. $$
(2.8)

Then we have

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr)< \infty\quad \textit{for all } \varepsilon>0. $$
(2.9)

Theorem 2.3

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables with zero means and \(\{a_{ni}, 1\leq i\leq n, n\geq1\}\) be an array of real numbers satisfying \(|a_{ni}|\leq C\) for \(1\leq i \leq n\) and \(n\geq1\), where C is a positive constant. Let \(\{c_{n},n \geq1\}\) be a sequence of positive constants with \(c_{n}\uparrow\infty\), and let \(\{\Psi_{n}(t), n \geq1\}\) be a sequence of nonnegative and even functions such that, for each \(n\geq1\), \(\Psi_{n}(t)>0\) for \(t>0\). Suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=2\). In addition, assume that for some \(1\leq p< q\leq2\) and each \(n\geq1\),

$$ \frac{\Psi_{n}(|t|)}{|t|^{p}} \uparrow \quad \textit{and} \quad \frac {\Psi_{n}(|t|)}{t^{q}} \downarrow \quad \textit{as } |t| \uparrow $$
(2.10)

and

$$ \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. $$
(2.11)

Then we have

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr)< \infty\quad \textit{for all } \varepsilon>0. $$
(2.12)

Corollary 2.1

Let \(\{\Psi_{n}(t), n\geq1\}\) be a sequence of positive even functions satisfying (2.10) for some \(1\leq p< q\) with \(q>2\). Under the conditions of Theorem 2.3, suppose that the Rosenthal type inequality of \(Z_{ni}:=a_{ni}X_{i}I(|X_{i}|\leq c_{n})\) (\(1\leq i\leq n\)) holds for \(r=q\) and that

$$ \sum_{n=1}^{\infty}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}EX_{i}^{2} \Biggr)^{r/2}< \infty \quad \textit{for } r= q, $$
(2.13)

Then (2.12) holds.

3 Proofs of main results

In order to prove the main theorems, we need the following lemma, which gives the basic properties of stochastic domination. For the proof, one can refer to Shen [20], Wang et al. [21], Wu [22], or Shen and Wu [23].

Lemma 3.1

Let \(\{X_{n}, n\geq1\}\) be a sequence of random variables which is stochastically dominated by a random variable X. Then for any \(\alpha >0 \) and \(b>0\),

$$ E\vert X_{n}\vert ^{\alpha}I\bigl(\vert X_{n}\vert \leq b\bigr)\leq C \bigl[E\vert X\vert ^{\alpha}I \bigl(\vert X\vert \leq b\bigr)+b^{\alpha}P\bigl(\vert X\vert >b \bigr) \bigr] $$
(3.1)

and

$$ E\vert X_{n}\vert ^{\alpha}I\bigl(\vert X_{n}\vert \geq b\bigr)\leq CE\vert X\vert ^{\alpha}I\bigl( \vert X\vert \geq b\bigr). $$
(3.2)

Consequently, \(E|X_{n}|^{\alpha}\leq C E|X|^{\alpha}\).
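To see the mechanism behind (3.1) and (3.2), take the toy pair \(X_{n}\equiv\pm1\) and \(X\equiv\pm2\), each value with probability 1/2 (our own example, for which domination holds with \(C=1\)); both sides of the two bounds can then be computed in closed form and compared on a grid (function names are ours):

```python
def moment_trunc(values, probs, alpha, b, above=False):
    """E|Y|^alpha I(|Y| <= b)  (or I(|Y| >= b) when above=True)  for discrete Y."""
    return sum(p * abs(v) ** alpha for v, p in zip(values, probs)
               if (abs(v) >= b if above else abs(v) <= b))

def tail(values, probs, x):
    """P(|Y| > x) for discrete Y."""
    return sum(p for v, p in zip(values, probs) if abs(v) > x)

Xn = ([-1, 1], [0.5, 0.5])  # the common law of the dominated sequence
X = ([-2, 2], [0.5, 0.5])   # the dominating variable; here C = 1 works
C = 1.0

for alpha in (0.5, 1.0, 2.0, 3.0):
    for b in (0.5, 1.0, 1.5, 2.0, 3.0):
        # (3.1): truncated moment of X_n vs. truncated moment + scaled tail of X
        lhs = moment_trunc(*Xn, alpha, b)
        rhs = C * (moment_trunc(*X, alpha, b) + b ** alpha * tail(*X, b))
        assert lhs <= rhs + 1e-12
        # (3.2): the complementary truncation
        assert moment_trunc(*Xn, alpha, b, above=True) \
            <= C * moment_trunc(*X, alpha, b, above=True) + 1e-12
```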

Proof of Theorem 2.1

For \(1\leq i\leq n\) and \(n\geq1\), denote \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). Since \(EX_{i}=0\), \(nc_{n}^{-p}\to0\), and \(|a_{ni}|\leq C\), we have

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq c_{n}^{-1}\sum_{i=1}^{n} \vert a_{ni}\vert E\vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C c_{n}^{-1}\sum_{i=1}^{n} \vert a_{ni}\vert E\vert X\vert I\bigl(\vert X\vert >c_{n}\bigr) \\& \quad \leq C n c_{n}^{-1}E\vert X\vert I\bigl(\vert X \vert >c_{n}\bigr) \\& \quad \leq C n c_{n}^{-p}E\vert X\vert ^{p} \rightarrow0\quad \mbox{as } n\rightarrow \infty. \end{aligned}$$

Hence for any \(\varepsilon>0\), it follows that for all n large enough

$$ c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert < \frac{\varepsilon}{2}. $$
(3.3)

From (3.3), it is easy to see that

$$\begin{aligned}& \sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{i}\Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}b_{n} \sum_{i=1}^{n}P\bigl(\vert X_{i} \vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{ni}' \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}nb_{n}P \bigl(\vert X\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}b_{n}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad =: CI+CJ. \end{aligned}$$
(3.4)
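The first inequality in (3.4) follows from the elementary inclusion below (if \(|X_{i}|\leq c_{n}\) for all \(1\leq i\leq n\), then \(X_{ni}'=X_{i}\) for every i and the two maxima coincide), combined with the stochastic domination \(P(|X_{i}|>c_{n})\leq CP(|X|>c_{n})\):

$$\Biggl\{ \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i}\Biggr\vert >\varepsilon c_{n} \Biggr\} \subseteq \bigcup_{i=1}^{n}\bigl\{ \vert X_{i}\vert >c_{n}\bigr\} \cup \Biggl\{ \max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{ni}'\Biggr\vert >\varepsilon c_{n} \Biggr\}. $$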

In order to prove (2.2), it suffices to prove that \(I<\infty\) and \(J<\infty\). First, for I, by the condition (2.1), it is easy to check that

$$\begin{aligned} I =& \sum_{n=1}^{\infty}nb_{n}P \bigl(\vert X\vert >c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}nb_{n}\sum _{k=n}^{\infty}P \bigl(c_{k}< \vert X \vert \leq c_{k+1} \bigr) \\ \leq& C\sum_{k=1}^{\infty}P \bigl(c_{k}< \vert X\vert \leq c_{k+1} \bigr)\sum _{n=1}^{k}nb_{n} \\ \leq& C\sum_{k=1}^{\infty}c_{k}^{p}P \bigl(c_{k}< \vert X\vert \leq c_{k+1} \bigr) \\ \leq& CE|X|^{p}< \infty. \end{aligned}$$
(3.5)

Second, we will show \(J<\infty\). It follows by the Markov inequality and the Rosenthal type inequality that, for \(r\geq2\),

$$\begin{aligned} J \leq& \biggl(\frac{2}{\varepsilon}\biggr)^{r}\sum_{n=1}^{\infty}b_{n}c_{n}^{-r}E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}\bigl(X_{ni}'-EX_{ni}'\bigr)\Biggr\vert ^{r}\Biggr) \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl\{ \Biggl(\sum_{i=1}^{n}a_{ni}^{2}E\bigl(X_{ni}'-EX_{ni}'\bigr)^{2}\Biggr)^{\frac{r}{2}} +\sum_{i=1}^{n}\vert a_{ni}\vert ^{r}E\bigl\vert X_{ni}'-EX_{ni}'\bigr\vert ^{r}\Biggr\} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl\{ \Biggl(\sum_{i=1}^{n}a_{ni}^{2}E\bigl(X_{ni}'\bigr)^{2}\Biggr)^{\frac{r}{2}} +\sum_{i=1}^{n}\vert a_{ni}\vert ^{r}E\bigl\vert X_{ni}'\bigr\vert ^{r}\Biggr\} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}E \vert X_{i}\vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr)\Biggr)^{\frac{r}{2}} \\ &{}+C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n}E \vert X_{i}\vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\ =:& CJ_{1}+CJ_{2}. \end{aligned}$$
(3.6)

For the case \(p \geq2\), from Lemma 3.1, Markov’s inequality, and the condition (2.1), we have

$$\begin{aligned} J_{1} \leq &C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n} \bigl(E\vert X \vert ^{2}I\bigl(\vert X\vert \leq c_{n} \bigr)+c_{n}^{2}P\bigl(\vert X\vert > c_{n} \bigr) \bigr) \Biggr)^{\frac{r}{2}} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \Biggl(\sum_{i=1}^{n} \bigl(E\vert X \vert ^{2}I\bigl(\vert X\vert \leq c_{n}\bigr)+E\vert X\vert ^{2}I\bigl(\vert X\vert > c_{n}\bigr) \bigr) \Biggr)^{\frac {r}{2}} \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r}n^{\frac{r}{2}}< \infty. \end{aligned}$$
(3.7)

Since \(r\ge p\), it follows by Lemma 3.1 again, (3.5), and the condition (2.1) that

$$\begin{aligned} J_{2} =& \sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n}E \vert X_{i} \vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-r} \sum_{i=1}^{n} \bigl[E\vert X\vert ^{r}I\bigl(\vert X\vert \leq c_{n}\bigr)+c_{n}^{r}P \bigl(\vert X\vert > c_{n}\bigr)\bigr] \\ =& C\sum_{n=1}^{\infty}nb_{n}c_{n}^{-r}E \vert X\vert ^{r}I\bigl(\vert X\vert \leq c_{n} \bigr)+C\sum_{n=1}^{\infty}nb_{n}P\bigl( \vert X\vert > c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}nb_{n}c_{n}^{-r} \sum_{k=1}^{n}E\vert X\vert ^{r}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)+C \\ =& C\sum_{k=1}^{\infty}E \vert X\vert ^{r}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)\sum_{n=k}^{\infty}nb_{n}c_{n}^{-r}+C \\ \leq& C\sum_{k=1}^{\infty}E \vert X\vert ^{p}I\bigl(c_{k-1}< \vert X\vert \leq c_{k} \bigr)+C \\ \leq& CE\vert X\vert ^{p}+C< \infty. \end{aligned}$$
(3.8)

For the case \(1\le p<2\), we have \(r\ge2\), and we may take \(r=2\) in the Rosenthal type inequality. Arguing as in (3.8), we get

$$\begin{aligned} J \leq& C\sum_{n=1}^{\infty}b_{n}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \le&CE|X|^{p}+C< \infty. \end{aligned}$$

Hence from the above discussions the claim (2.2) holds. □

Proof of Theorem 2.2

For \(1\leq i\leq n\) and \(n\geq1\), define \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). From the conditions (2.7) and (2.8), \(EX_{i}=0\), and \(|a_{ni}|\leq C\), we have

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq \sum_{i=1}^{n}c_{n}^{-1} \vert a_{ni}\vert E\vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(\vert X_{i}\vert )}{\Psi _{i}(c_{n})}I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}\rightarrow0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$

Hence, for any \(\varepsilon>0\), we have for all n large enough,

$$ c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert < \frac{\varepsilon}{2}. $$
(3.9)

From (3.9), it follows that

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}X_{i} \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}X_{ni}' \Biggr\vert >\varepsilon c_{n} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)+C\sum_{n=1}^{\infty}P \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad =: CI+CJ. \end{aligned}$$
(3.10)

Now it suffices to control the terms I and J. For the term I, by the condition (2.8), we can get

$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(|X_{i}|)}{\Psi_{i}(c_{n})} \\& \quad = \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. \end{aligned}$$
(3.11)

For the term J, using the Markov inequality, the Rosenthal type inequality for \(r=2\), and the conditions (2.7) and (2.8), we have

$$\begin{aligned} J \leq& \biggl(\frac{2}{\varepsilon}\biggr)^{2}\sum _{n=1}^{\infty }c_{n}^{-2}E \Biggl(\max_{1\leq j \leq n}\Biggl\vert \sum _{i=1}^{j}a_{ni}\bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert ^{2}\Biggr) \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}a_{ni}^{2}E \bigl(X_{ni}'-EX_{ni}' \bigr)^{2} \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}\vert a_{ni}\vert ^{2}E \bigl\vert X_{ni}'\bigr\vert ^{2} \\ \leq& C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\ \leq& C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\ \leq& C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty. \end{aligned}$$
(3.12)

Hence the proof of Theorem 2.2 is completed. □

Proof of Theorem 2.3

For \(1\leq i\leq n\) and \(n\geq1\), define \(X_{ni}'=X_{i}I(|X_{i}|\leq c_{n})\). Similar to the proof of Theorem 2.2, it suffices to show that, for any \(\varepsilon>0\),

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \rightarrow0 \quad \mbox{as } n\rightarrow \infty, \end{aligned}$$
(3.13)
$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr)< \infty, \end{aligned}$$
(3.14)

and

$$ \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr)< \infty. $$
(3.15)

First, it follows from the conditions (2.10), (2.11), \(EX_{i}=0\), and \(|a_{ni}|\leq C\) that

$$\begin{aligned}& c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{ni}' \Biggr\vert \\& \quad = c_{n}^{-1}\max_{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\Biggr\vert \\& \quad \leq c_{n}^{-1}\max_{1\leq j \leq n}\sum _{i=1}^{j} \bigl\vert a_{ni}EX_{i}I \bigl(\vert X_{i}\vert >c_{n}\bigr)\bigr\vert \\& \quad \leq \sum_{i=1}^{n}c_{n}^{-1}|a_{ni}|E \vert X_{i}\vert I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n} \frac{E\vert X_{i}\vert ^{p}I(\vert X_{i}\vert >c_{n})}{c_{n}^{p}} \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(\vert X_{i}\vert )}{\Psi _{i}(c_{n})}I\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq C\sum_{i=1}^{n}E \frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}\rightarrow0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$
(3.16)

Second, by the condition (2.10), \(\Psi_{n}(t)\) is increasing in \(t>0\). Therefore, by the condition (2.11),

$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{i=1}^{n}P\bigl(\vert X_{i}\vert >c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(|X_{i}|)}{\Psi_{i}(c_{n})} \\& \quad = \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi_{i}(X_{i})}{\Psi _{i}(c_{n})}< \infty. \end{aligned}$$
(3.17)

Finally, for \(1\leq p < q\leq2\), by the Markov inequality and the Rosenthal type inequality for \(r= 2\) and the conditions (2.10), (2.11), we have

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}c_{n}^{-2} \sum_{i=1}^{n}E \vert X_{i} \vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}\frac{E\vert X_{i} \vert ^{q}I(\vert X_{i}\vert \leq c_{n})}{c_{n}^{q}} \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq \sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty. \end{aligned}$$
(3.18)

The proof of this theorem is completed. □

Proof of Corollary 2.1

From (3.16) and (3.17) in the proof of Theorem 2.3, which remain valid under the condition (2.10) with \(1\leq p< q\) and \(q>2\), it suffices to show that (3.15) holds. By Markov's inequality, the Rosenthal type inequality, and the conditions (2.10), (2.11), and (2.13), we have, for \(r=q>2\),

$$\begin{aligned}& \sum_{n=1}^{\infty}P \Biggl(\max _{1\leq j \leq n}\Biggl\vert \sum_{i=1}^{j}a_{ni} \bigl(X_{ni}'-EX_{ni}' \bigr)\Biggr\vert >\frac {\varepsilon c_{n}}{2} \Biggr) \\& \quad \leq C\sum_{n=1}^{\infty}c_{n}^{-r} \Biggl(\sum_{i=1}^{n}E \vert X_{i}\vert ^{2}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \Biggr)^{\frac{r}{2}} +C\sum _{n=1}^{\infty}c_{n}^{-r}\sum _{i=1}^{n}E\vert X_{i} \vert ^{r}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}c_{n}^{-q} \sum_{i=1}^{n}E \vert X_{i} \vert ^{q}I\bigl(\vert X_{i}\vert \leq c_{n} \bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(\vert X_{i}\vert )}{\Psi_{i}(c_{n})}I\bigl(\vert X_{i}\vert \leq c_{n}\bigr) \\& \quad \leq C+C\sum_{n=1}^{\infty}\sum _{i=1}^{n}E\frac{\Psi _{i}(X_{i})}{\Psi_{i}(c_{n})}< \infty, \end{aligned}$$
(3.19)

which completes the proof. □

4 Conclusions

In this paper we have established some new results on the complete convergence of weighted sums of random variables satisfying the Rosenthal type inequality. These results extend several known results in the literature.