1 Introduction

We first recall some definitions of dependence structures.

Definition 1.1

A finite family of random variables \(\{ {{X}_{i}},1\le i\le n \}\) is called negatively associated (NA) if for any disjoint subsets A and B of \(\{ 1,2,\ldots,n \}\), and any real coordinatewise non-decreasing functions \({{f}_{1}}\) on \({{\mathbb{R}}^{A}}\) and \({{f}_{2}}\) on \({{\mathbb{R}}^{B}}\),

$$ \operatorname{Cov} \bigl( {{f}_{1}} ( {{X}_{i}},i\in A ),{{f}_{2}} ( {{X}_{j}},j\in B ) \bigr)\le0, $$
(1.1)

whenever this covariance exists. An infinite family of random variables \(\{ {{X}_{n}},n\ge1 \}\) is NA if every finite subfamily is NA.
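
A classical example, pointed out by Joag-Dev and Proschan [1], is the multinomial distribution: if \(( {{X}_{1}},\ldots,{{X}_{k}} )\) records the cell counts of n independent trials with cell probabilities \({{p}_{1}},\ldots,{{p}_{k}}\), then for any \(i\ne j\),

$$ \operatorname{Cov} ( {{X}_{i}},{{X}_{j}} )=-n{{p}_{i}}{{p}_{j}}\le0, $$

and in fact the whole family \(\{ {{X}_{1}},\ldots,{{X}_{k}} \}\) is NA: intuitively, a large value of one cell count forces the remaining counts to be small.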

For two nonempty disjoint sets S and T of positive integers, let \(\sigma ( S )\) and \(\sigma ( T )\) be the σ-fields generated by \(\{ {{X}_{i}},i\in S \}\) and \(\{ {{X}_{i}},i\in T \}\), respectively. Let \(\operatorname{dist} ( S,T )=\min \{ \vert j-k \vert :j\in S,k\in T \}\).

Definition 1.2

A sequence of random variables \(\{ {{X}_{n}},n\ge1 \}\) is called ρ̃-mixing (or \({{\rho }^{*}}\)-mixing) if

$$ \tilde{\rho} ( s )=\sup \bigl\{ \rho ( S,T ):S,T\subset \mathbb{N}, \operatorname{dist} ( S,T )\ge s \bigr\} \to0\quad \text{as } s\to\infty, $$
(1.2)

where

$$ \rho ( S,T )=\sup \biggl\{ \frac{\vert EXY-EXEY \vert }{\sqrt{\operatorname{Var}X}\cdot\sqrt{\operatorname{Var}Y}}:X\in {{L}_{2}} \bigl( \sigma(S) \bigr),Y\in{{L}_{2}} \bigl( \sigma(T) \bigr) \biggr\} . $$

Definition 1.3

A sequence of random variables \(\{ {{X}_{n}},n\ge1 \}\) is said to be asymptotically negatively associated (ANA) if

$$ {{\rho}^{-}} ( s )=\sup \bigl\{ {{\rho}^{-}} ( S,T ):S,T\subset \mathbb{N},\operatorname{dist} ( S,T )\ge s \bigr\} \to 0 \quad \text{as }s\to\infty, $$
(1.3)

where

$$ {{\rho}^{-}} ( S,T )=0\vee\sup \biggl\{ \frac{\operatorname {Cov} ( {{f}_{1}} ( {{X}_{i}},i\in S ),{{f}_{2}} ( {{X}_{j}},j\in T ) )}{{{ ( \operatorname{Var}{{f}_{1}} ( {{X}_{i}},i\in S ) )}^{1/2}}{{ (\operatorname {Var}{{f}_{2}} ( {{X}_{j}},j\in T ) )}^{1/2}}} \biggr\} , $$

where the supremum is taken over all coordinatewise non-decreasing functions \({{f}_{1}}\) on \({{\mathbb{R}}^{S}}\) and \({{f}_{2}}\) on \({{\mathbb{R}}^{T}}\).

An array of random variables \(\{ {{X}_{ni}},1\leq i\leq n,n\ge1 \}\) is called an array of rowwise ANA random variables if, for every \(n\ge1\), \(\{ {{X}_{ni}},1\leq i\leq n \}\) is a sequence of ANA random variables.

The concept of NA was introduced by Joag-Dev and Proschan [1], the concept of ρ̃-mixing was introduced by Bradley [2], and the concept of ANA was introduced by Zhang and Wang [3]. Since the supremum defining \({{\rho}^{-}} ( S,T )\) is taken over the smaller class of coordinatewise non-decreasing functions, it is easily seen that \({{\rho}^{-}} ( s )\le\tilde{\rho} ( s )\); moreover, a sequence of ANA random variables is NA if and only if \({{\rho }^{-}} ( 1 )=0\). Hence the class of ANA sequences is very wide: it contains both NA sequences and ρ̃-mixing sequences.

Since the notion of ANA random variables was introduced, it has found many applications; we refer the reader to [3–13], among others.

The concept of complete convergence was first given by Hsu and Robbins [14]. A sequence of random variables \(\{ {{X}_{n}},n\ge1 \} \) is said to converge completely to a constant λ if for all \(\varepsilon>0\),

$$ \sum_{n=1}^{\infty}{P \bigl( \vert {{X}_{n}}-\lambda \vert >\varepsilon \bigr)}< \infty. $$

In view of the Borel-Cantelli lemma, complete convergence implies that \({{X}_{n}}\to\lambda\) almost surely. Therefore, complete convergence is a very important tool for establishing the almost sure convergence of sums of random variables.
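
Indeed, setting \({{A}_{n}}= \{ \vert {{X}_{n}}-\lambda \vert >\varepsilon \}\), the Borel-Cantelli lemma gives

$$ \sum_{n=1}^{\infty}{P ( {{A}_{n}} )}< \infty\quad \Longrightarrow\quad P ( {{A}_{n}}\text{ i.o.} )=0, $$

so that \(\vert {{X}_{n}}-\lambda \vert \le\varepsilon\) eventually with probability one; letting \(\varepsilon\downarrow0\) along a countable sequence yields \({{X}_{n}}\to\lambda\) almost surely. Note that this direction requires no independence assumption on \(\{ {{X}_{n}},n\ge1 \}\).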

Let \(\{ {{X}_{n}},n\ge1 \}\) be a sequence of random variables and \({{a}_{n}}>0\), \({{b}_{n}}>0\), \(q>0\). If for all \(\varepsilon\ge0\),

$$ \sum_{n=1}^{\infty}{{{a}_{n}}E \bigl( b_{n}^{-1}\vert {{X}_{n}} \vert -\varepsilon \bigr)_{+}^{q}}< \infty, $$

then \(\{ {{X}_{n}},n\ge1 \}\) is said to satisfy complete moment convergence; this concept was introduced by Chow [15].
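
Since \(E{{W}^{q}}=\int_{0}^{\infty}{P ( W>{{t}^{1/q}} )\,dt}\) for any nonnegative random variable W, the condition above can equivalently be written as

$$ \sum_{n=1}^{\infty}{{{a}_{n}} \int_{0}^{\infty}{P \bigl( b_{n}^{-1}\vert {{X}_{n}} \vert >\varepsilon+{{t}^{1/q}} \bigr)\,dt}}< \infty, $$

which is the form exploited in Remark 2.2 and in the proof of Theorem 2.2 below.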

Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise NA random variables, and let \(\{ {{a}_{n}},n\ge1 \} \) be a sequence of positive real numbers with \({{a}_{n}}\uparrow\infty \). Let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) be a sequence of positive, even functions such that

$$ \frac{{{\psi}_{n}} ( \vert t \vert )}{\vert t \vert }\uparrow\quad \text{and}\quad \frac{{{\psi}_{n}} ( \vert t \vert )}{{{\vert t \vert }^{p}}} \downarrow \quad \text{as }\vert t \vert \uparrow $$
(1.4)

for some nonnegative integer p. Consider the following conditions:

$$\begin{aligned}& E{{X}_{ni}}=0,\quad 1\le i\le n, n\ge1, \end{aligned}$$
(1.5)
$$\begin{aligned}& \sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty, \end{aligned}$$
(1.6)
$$\begin{aligned}& \sum_{n=1}^{\infty}{{{ \Biggl( \sum_{i=1}^{n}{E{{ \biggl( \frac{\vert {{X}_{ni}} \vert }{{{a}_{n}}} \biggr)}^{r}}} \Biggr)}^{s}}}< \infty, \end{aligned}$$
(1.7)

where \(0< r\le2\) and \(s>0\).
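
For instance, for any fixed \(\beta\) with \(1\le\beta\le p\), the power functions \({{\psi}_{n}} ( t )={{\vert t \vert }^{\beta}}\) satisfy (1.4), since

$$ \frac{{{\psi}_{n}} ( \vert t \vert )}{\vert t \vert }={{\vert t \vert }^{\beta-1}}\uparrow\quad \text{and}\quad \frac{{{\psi}_{n}} ( \vert t \vert )}{{{\vert t \vert }^{p}}}={{\vert t \vert }^{\beta-p}}\downarrow\quad \text{as } \vert t \vert \uparrow, $$

and for this choice condition (1.6) reduces to the moment condition \(\sum_{n=1}^{\infty}{\sum_{i=1}^{n}{a_{n}^{-\beta}E{{\vert {{X}_{ni}} \vert }^{\beta}}}}< \infty\). The same functions with \(q\le\beta\le p\) satisfy condition (2.1) of Section 2.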

Gan and Chen [16] obtained the following complete convergence theorems for the NA case.

Theorem A

Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise NA random variables, and let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) satisfy (1.4) for some integer \(1< p\le2\). Then (1.5) and (1.6) imply

$$ \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert \to0 \quad \textit{completely}. $$
(1.8)

Theorem B

Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise NA random variables, and let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) satisfy (1.4) for some integer \(p>2\). Then (1.5), (1.6), and (1.7) imply (1.8).

Zhu [17] obtained the corresponding result for the \({{\rho}^{*}}\)-mixing case.

Theorem C

Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise \({{\rho}^{*}}\)-mixing random variables, and let \(\psi ( t )\) be a positive, even function satisfying (1.4) for some integer \(p\ge2\). Then (1.5), (1.6), and

$$ \sum_{n=1}^{\infty}{{{ \Biggl( \sum_{i=1}^{n}{E{{ \biggl( \frac{{{X}_{ni}}}{{{a}_{n}}} \biggr)}^{2}}} \Biggr)}^{v/2}}}< \infty \quad \textit{for } v \ge p, $$
(1.9)

imply (1.8).

Inspired by the theorems above, in this work we not only extend Theorems A, B, and C to ANA random variables, but also obtain much stronger conclusions under more general conditions. The goal of this paper is to study complete convergence, complete moment convergence, and mean convergence for arrays of rowwise ANA random variables.

Throughout this paper, let \(I ( A )\) be the indicator function of the set A. The symbol C always stands for a generic positive constant, which may vary from one place to another, and \({{a}_{n}}=O ( {{b}_{n}} )\) stands for \({{a}_{n}}\le C{{b}_{n}}\).

2 Main results

Now, the main results are presented in this section. The proofs will be given in the next section.

Theorem 2.1

Let N be a positive integer, \(M\ge2\), and \(0\le s<{{ ( \frac{1}{6M} )}^{M/2}}\). Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise ANA random variables with \({{\rho}^{-}} ( \mathrm{N} )\le s\) in each row, and let \(\{ {{a}_{n}},n\ge1 \}\) be a sequence of positive real numbers with \({{a}_{n}}\uparrow\infty\). Let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) be a sequence of positive, even functions such that

$$ \frac{{{\psi}_{n}} ( \vert t \vert )}{{{\vert t \vert }^{q}}}\uparrow\quad \textit{and} \quad \frac{{{\psi}_{n}} ( \vert t \vert )}{{{\vert t \vert }^{p}}}\downarrow\quad \textit{as } \vert t \vert \uparrow $$
(2.1)

for some \(1\le q< p\).

(1) If \(1< p\le2\), then conditions (1.5) and (1.6) imply

    $$ \sum_{n=1}^{\infty}{P \Biggl( \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert > \varepsilon \Biggr)< \infty} \quad \textit{for all }\varepsilon>0. $$
    (2.2)
(2) If \(p>2\), then conditions (1.5), (1.6), and (1.9) imply (2.2).

Theorem 2.2

Let N be a positive integer, \(M\ge2\), and \(0\le s<{{ ( \frac{1}{6M} )}^{M/2}}\). Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise ANA random variables with \({{\rho}^{-}} ( \mathrm{N} )\le s\) in each row, and let \(\{ {{a}_{n}},n\ge1 \}\) be a sequence of positive real numbers with \({{a}_{n}}\uparrow\infty\). Let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) be a sequence of positive, even functions satisfying (2.1) for some \(1\le q< p\).

(1) If \(1< p\le2\), then conditions (1.5) and (1.6) imply

    $$ \sum_{n=1}^{\infty}{a_{n}^{-q}}E \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert - \varepsilon{{a}_{n}} \Biggr)_{+}^{q}< \infty \quad \textit{for all } \varepsilon>0. $$
    (2.3)
(2) If \(p>2\), then conditions (1.5), (1.6), and (1.9) imply (2.3).

Theorem 2.3

Let N be a positive integer, \(M\ge2\), and \(0\le s<{{ ( \frac{1}{6M} )}^{M/2}}\). Let \(\{ {{X}_{ni}},1\le i\le n,n\ge1 \}\) be an array of rowwise ANA random variables with \({{\rho}^{-}} ( \mathrm{N} )\le s\) in each row, and let \(\{ {{a}_{n}},n\ge1 \}\) be a sequence of positive real numbers with \({{a}_{n}}\uparrow\infty\). Let \(\{ {{\psi}_{n}} ( t ),n\ge1 \}\) be a sequence of positive, even functions satisfying (2.1) for some \(1\le q< p\).

(1) If \(1< p\le2\), then condition (1.5) and

    $$ \sum_{i=1}^{n}{ \frac{E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}\to0 \quad \textit{as }n\to \infty $$
    (2.4)

    imply

    $$ \lim_{n\to\infty} E{{ \Biggl( \frac {1}{{{a}_{n}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert \Biggr)}^{q}}=0. $$
    (2.5)
(2) If \(p>2\), then conditions (1.5), (2.4), and

    $$ \sum_{i=1}^{n}{ \frac{E{{\vert {{X}_{ni}} \vert }^{2}}I ( \vert {{X}_{ni}} \vert \le{{a}_{n}} )}{a_{n}^{2}}}\to0\quad \textit{as }n\to\infty $$
    (2.6)

    imply (2.5).

Remark 2.1

Since NA random variables and ρ̃-mixing random variables are two special cases of ANA random variables, Theorem 2.1 is an extension and improvement of Theorems A and B for NA random variables and of Theorem C for ρ̃-mixing random variables. In addition, in this work we consider the case \(1\le q< p\), which has a wider scope than the case \(q=1\) treated in Gan and Chen [16] and Zhu [17].

Remark 2.2

Under the conditions of Theorem 2.2, one has

$$\begin{aligned} \infty >&\sum_{n=1}^{\infty}{a_{n}^{-q}}E \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert - \varepsilon{{a}_{n}} \Biggr)_{+}^{q} \\ =&\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{0}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert -\varepsilon{{a}_{n}}>{{t}^{1/q}} \Biggr)\,dt}} \\ \ge& C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{0}^{{{\varepsilon }^{q}}a_{n}^{q}}{P \Biggl(\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{a}_{n}}\varepsilon +{{t}^{1/q}} \Biggr)\,dt}} \\ \ge& C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{0}^{{{\varepsilon }^{q}}a_{n}^{q}}{P \Biggl(\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >2{{a}_{n}}\varepsilon \Biggr)\,dt}} \\ \ge& C{{\varepsilon}^{q}}\sum_{n=1}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >2 \varepsilon{{a}_{n}} \Biggr)}\quad \text{for all } \varepsilon>0. \end{aligned}$$
(2.7)

Hence, from (2.7), one sees clearly that complete moment convergence implies complete convergence. Compared with the corresponding results of Gan and Chen [16] and Zhu [17], it is worth pointing out that Theorem 2.2 is much stronger, while its conditions are more general and weaker.

3 Proofs

To prove the main results, the following lemmas are needed.

Lemma 3.1

Wang and Lu [7]

Let \(\{{{X}_{n}},n\ge1 \}\) be a sequence of ANA random variables, and let \(\{ {{f}_{n}},n\ge1 \}\) be a sequence of real functions, all of which are monotone non-decreasing (or all monotone non-increasing). Then \(\{ {{f}_{n}} ( {{X}_{n}} ),n\ge1 \}\) is still a sequence of ANA random variables.

Lemma 3.2

Wang and Lu [7]

For a positive integer \(\mathrm{N}\ge1\), real numbers \(M\ge2\) and \(0\le s<{{ ( \frac {1}{6M} )}^{M/2}}\), let \(\{ {{X}_{n}},n\ge1 \}\) be a sequence of ANA random variables with \({{\rho}^{-}} ( \mathrm{N} )\le s\), \(E{{X}_{n}}=0\) and \(E{{\vert {{X}_{n}} \vert }^{M}}<\infty\) for every \(n\ge1\). Then there exists a positive constant \(C=C ( M,\mathrm{N},s )\) such that

$$ E \Biggl(\max_{1\le j\le n} {{\Biggl\vert \sum _{i=1}^{j}{{{X}_{i}}} \Biggr\vert }^{M}} \Biggr)\le C \Biggl( \sum_{i=1}^{n}{E{{ \vert {{X}_{i}} \vert }^{M}}}+{{ \Biggl( \sum _{i=1}^{n}{EX_{i}^{2}} \Biggr)}^{M/2}} \Biggr). $$
(3.1)

In particular, for \(M=2\),

$$ E \Biggl(\max_{1\le j\le n} {{\Biggl\vert \sum _{i=1}^{j}{{{X}_{i}}} \Biggr\vert }^{2}} \Biggr)\le C\sum_{i=1}^{n}{EX_{i}^{2}}. $$

Proof of Theorem 2.1

For any \(1\le i\le n\), \(n\ge1\), define

$$\begin{aligned}& {{Y}_{ni}}=-{{a}_{n}}I ( {{X}_{ni}}< -{{a}_{n}} )+{{X}_{ni}}I \bigl( \vert {{X}_{ni}} \vert \le{{a}_{n}} \bigr) +{{a}_{n}}I ( {{X}_{ni}}>{{a}_{n}} ), \\& {{Z}_{ni}}={{X}_{ni}}-{{Y}_{ni}}= ( {{X}_{ni}}+{{a}_{n}} )I ( {{X}_{ni}}< -{{a}_{n}} )+ ( {{X}_{ni}}-{{a}_{n}} )I ( {{X}_{ni}}>{{a}_{n}} ). \end{aligned}$$
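
Observe that \({{Y}_{ni}}\) is simply the truncation of \({{X}_{ni}}\) at level \({{a}_{n}}\), namely

$$ {{Y}_{ni}}= ( -{{a}_{n}} )\vee ( {{X}_{ni}}\wedge{{a}_{n}} ), $$

which is a non-decreasing function of \({{X}_{ni}}\); this observation is what permits the application of Lemma 3.1 below.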

It is easy to check that, for all \(\varepsilon>0\),

$$\begin{aligned} P \Biggl( \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >\varepsilon \Biggr) \le& P \Biggl( \frac{1}{{{a}_{n}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >\varepsilon \Biggr)+P \Bigl(\max_{1\le i\le n} \vert {{X}_{ni}} \vert >{{a}_{n}} \Bigr) \\ \le& P \Biggl( \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\varepsilon-\frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert \Biggr) \\ &{}+P \Bigl(\max_{1\le i\le n} \vert {{X}_{ni}} \vert >{{a}_{n}} \Bigr). \end{aligned}$$
(3.2)

First, we show that

$$ \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert \to0 \quad \text{as }n\to\infty. $$
(3.3)

Since \(E{{X}_{ni}}=0\) for \(1\le i\le n\), \(n\ge1\), one has \(E{{Y}_{ni}}=-E{{Z}_{ni}}\). If \({{X}_{ni}}>{{a}_{n}}\), then \(0<{{Z}_{ni}}={{X}_{ni}}-{{a}_{n}}<{{X}_{ni}}\); if \({{X}_{ni}}<-{{a}_{n}}\), then \({{X}_{ni}}<{{Z}_{ni}}={{X}_{ni}}+{{a}_{n}}\le 0\). Hence \(\vert {{Z}_{ni}} \vert \le \vert {{X}_{ni}} \vert I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )\). Then, from conditions (2.1) and (1.6), one has

$$\begin{aligned} \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert =&\frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{E{{Z}_{ni}}} \Biggr\vert \\ \le& \frac{1}{{{a}_{n}}}\sum_{i=1}^{n}{E \vert {{Z}_{ni}} \vert } \\ \le& C\sum_{i=1}^{n}{\frac{E\vert {{X}_{ni}} \vert I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{{{a}_{n}}}} \\ \le& C\sum_{i=1}^{n}{\frac{E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}} \to0\quad \text{as }n\to\infty. \end{aligned}$$
(3.4)
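
Here the last inequality in (3.4) follows from (2.1): since \(q\ge1\) and \({{\psi}_{i}} ( \vert t \vert )/{{\vert t \vert }^{q}}\) is non-decreasing, on the event \(\{ \vert {{X}_{ni}} \vert >{{a}_{n}} \}\) one has

$$ \frac{\vert {{X}_{ni}} \vert }{{{a}_{n}}}\le{{ \biggl( \frac{\vert {{X}_{ni}} \vert }{{{a}_{n}}} \biggr)}^{q}}\le\frac{{{\psi}_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}. $$

This elementary bound will be used repeatedly in the sequel without further comment.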

Hence, for n large enough,

$$\begin{aligned} P \Biggl( \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >\varepsilon \Biggr) \le& P \Biggl( \frac{1}{{{a}_{n}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\frac{\varepsilon}{2} \Biggr) \\ &{}+P \Bigl(\max_{1\le i\le n} \vert {{X}_{ni}} \vert >{{a}_{n}} \Bigr). \end{aligned}$$

To prove (2.2), it suffices to show that

$$\begin{aligned}& {{I}_{1}}\triangleq\sum_{n=1}^{\infty}{P \Biggl( \frac {1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\frac {\varepsilon}{2} \Biggr)}< \infty, \end{aligned}$$
(3.5)
$$\begin{aligned}& {{I}_{2}}\triangleq\sum_{n=1}^{\infty}{P \Bigl(\max_{1\le i\le n} \vert {{X}_{ni}} \vert >{{a}_{n}} \Bigr)}< \infty. \end{aligned}$$
(3.6)

By Lemma 3.1 (applied to the non-decreasing truncations above), it follows that \(\{ {{Y}_{ni}}-E{{Y}_{ni}},1\le i\le n,n\ge1 \}\) is still an array of rowwise ANA random variables with zero mean. For \({{I}_{1}}\), note that \(\vert {{Y}_{ni}} \vert \le{{a}_{n}}\).

(1) If \(1\le q< p\le2\), by the Markov inequality, Lemma 3.2 (for \(M=2\)), (2.1), and (1.6), one has

$$\begin{aligned} {{I}_{1}} \le& C\sum_{n=1}^{\infty}{ \frac{1}{a_{n}^{2}}E \Biggl( \max_{1\le j\le n} {{\Biggl\vert \sum _{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert }^{2}} \Biggr)} \\ \le& C\sum_{n=1}^{\infty}{\frac{1}{a_{n}^{2}} \sum_{i=1}^{n}{E{{\vert {{Y}_{ni}}-E{{Y}_{ni}} \vert }^{2}}}} \\ \le& C\sum_{n=1}^{\infty}{\frac{1}{a_{n}^{2}} \sum_{i=1}^{n}{EY_{ni}^{2}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{ \vert {{Y}_{ni}} \vert }^{p}}}{a_{n}^{p}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{Y}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned}$$
(3.7)

(2) If \(1\le q< p\) and \(p>2\), by the Markov inequality, Lemma 3.2 (for \(M>p>2\)), (2.1), (1.6), and (1.9), one also has

$$\begin{aligned} {{I}_{1}} \le& C\sum_{n=1}^{\infty}{ \frac{1}{a_{n}^{M}}E \Biggl( \max_{1\le j\le n} {{\Biggl\vert \sum _{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert }^{M}} \Biggr)} \\ \le& C\sum_{n=1}^{\infty}{\frac{1}{a_{n}^{M}} \Biggl( \sum_{i=1}^{n}{E{{\vert {{Y}_{ni}}-E{{Y}_{ni}} \vert }^{M}}}+{{ \Biggl( \sum_{i=1}^{n}{E{{\vert {{Y}_{ni}}-E{{Y}_{ni}} \vert }^{2}}} \Biggr)}^{M/2}} \Biggr)} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{ \vert {{Y}_{ni}} \vert }^{M}}}{a_{n}^{M}}}}+C\sum _{n=1}^{\infty }{{{ \Biggl( \sum _{i=1}^{n}{\frac{E{{\vert {{Y}_{ni}} \vert }^{2}}}{a_{n}^{2}}} \Biggr)}^{M/2}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{Y}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}+C\sum _{n=1}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{\frac{E{{\vert {{Y}_{ni}} \vert }^{2}}}{a_{n}^{2}}} \Biggr)}^{M/2}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}+C\sum_{n=1}^{\infty}{{{ \Biggl( \sum_{i=1}^{n}{\frac{E{{\vert {{X}_{ni}} \vert }^{2}}}{a_{n}^{2}}} \Biggr)}^{M/2}}}< \infty. \end{aligned}$$
(3.8)

Finally, for \({{I}_{2}}\), by the union bound, the Markov inequality, (2.1), and (1.6), one has

$$\begin{aligned} \begin{aligned}[b] {{I}_{2}}&\le C\sum_{n=1}^{\infty}{ \sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)}} \\ &=C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{EI \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)}} \\ &\le C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{ \vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}}} \\ &\le C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned} \end{aligned}$$
(3.9)

The proof of Theorem 2.1 is completed. □

Proof of Theorem 2.2

For all \(\varepsilon>0\), observe that

$$\begin{aligned} \sum_{n=1}^{\infty}{a_{n}^{-q}}E \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert - \varepsilon{{a}_{n}} \Biggr)_{+}^{q} =&\sum _{n=1}^{\infty }{a_{n}^{-q} \int_{0}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert -\varepsilon {{a}_{n}}>{{t}^{1/q}} \Biggr) \,dt}} \\ =&\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{0}^{a_{n}^{q}}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >\varepsilon{{a}_{n}}+{{t}^{1/q}} \Biggr)\,dt}} \\ &{}+\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >\varepsilon{{a}_{n}}+{{t}^{1/q}} \Biggr)\,dt}} \\ \le& \sum_{n=1}^{\infty}{P \Biggl( \max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >\varepsilon{{a}_{n}} \Biggr)} \\ &{}+\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt}} \\ \triangleq& {{J}_{1}}+{{J}_{2}}. \end{aligned}$$
(3.10)

By Theorem 2.1, one has \({{J}_{1}}<\infty\). To prove (2.3), it suffices to show that \({{J}_{2}}<\infty\). For any \(1\le i\le n\), \(n\ge1\), define (with a slight abuse of notation, now truncating at level \({{t}^{1/q}}\) instead of \({{a}_{n}}\))

$$\begin{aligned}& {{Y}_{ni}}=-{{t}^{1/q}}I \bigl( {{X}_{ni}}< -{{t}^{1/q}} \bigr)+{{X}_{ni}}I \bigl( \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)+{{t}^{1/q}}I \bigl( {{X}_{ni}}>{{t}^{1/q}} \bigr), \\& {{Z}_{ni}}={{X}_{ni}}-{{Y}_{ni}}= \bigl( {{X}_{ni}}+{{t}^{1/q}} \bigr)I \bigl( {{X}_{ni}}< -{{t}^{1/q}} \bigr)+ \bigl( {{X}_{ni}}-{{t}^{1/q}} \bigr)I \bigl( {{X}_{ni}}>{{t}^{1/q}} \bigr). \end{aligned}$$

It is easy to check that, for all \(\varepsilon>0\),

$$\begin{aligned} P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr) \le& P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)+P \Biggl( \bigcup _{i=1}^{n}{ \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \Biggr) \\ \le& P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)+\sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)}. \end{aligned}$$
(3.11)

Hence,

$$\begin{aligned} {{J}_{2}} =&\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt}} \\ \le& \sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt}}} \\ &{}+\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt}} \\ \triangleq& {{J}_{21}}+{{J}_{22}}. \end{aligned}$$
(3.12)

For \({{J}_{21}}\), by conditions (2.1) and (1.6), one has

$$\begin{aligned} {{J}_{21}} =&\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int_{a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt}}} \\ \le& \sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{0}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)>{{t}^{1/q}} \bigr)\,dt}}} \\ \le& \sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{E{{\vert {{X}_{ni}} \vert }^{q}}I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)}} \\ \le& \sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned}$$
(3.13)

For \({{J}_{22}}\), we will first show that

$$ \sup_{t\ge a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert \to0\quad \text{as } n\to \infty. $$
(3.14)

Similar to the proof of (3.4), by conditions (1.5), (1.6), and (2.1), one has

$$\begin{aligned} \sup_{t\ge a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert =&\sup_{t\ge a_{n}^{q}} \frac{1}{{{t}^{1/q}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Z}_{ni}}} \Biggr\vert \\ \le& C\sup_{t\ge a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\sum _{i=1}^{n}{E\vert {{Z}_{ni}} \vert } \\ \le& C\sup_{t\ge a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\sum _{i=1}^{n}{E\vert {{X}_{ni}} \vert I \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \\ \le& C\sum_{i=1}^{n}{\frac{E\vert {{X}_{ni}} \vert I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{{{a}_{n}}}} \\ \le& C\sum_{i=1}^{n}{\frac{E{{\vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}} \\ \le& C\sum_{i=1}^{n}{\frac{E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}} \to0\quad \text{as }n\to\infty. \end{aligned}$$
(3.15)

Hence, when n is sufficiently large, one has, for all \(t\ge a_{n}^{q}\),

$$ \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert \le\frac{{{t}^{1/q}}}{2}, $$

which implies

$$ P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum _{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\le P \Biggl(\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\frac{{{t}^{1/q}}}{2} \Biggr). $$
(3.16)

To prove \({{J}_{22}}<\infty\), we consider the following two cases. Let \({{d}_{n}}= [ {{a}_{n}} ]+1\).

(1) If \(1\le q< p\le2\), by (3.16), the Markov inequality, the \({{c}_{r}}\) inequality, and Lemma 3.2, one has

$$\begin{aligned} {{J}_{22}} \le& \sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\frac{{{t}^{1/q}}}{2} \Biggr)\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{t}^{-2/q}}\sum _{i=1}^{n}{E{{ ( {{Y}_{ni}}-E{{Y}_{ni}} )}^{2}}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int_{a_{n}^{q}}^{\infty }{EY_{ni}^{2}{{t}^{-2/q}} \,dt}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int _{a_{n}^{q}}^{\infty}{EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le {{d}_{n}} \bigr){{t}^{-2/q}}\,dt}} \\ &{}+C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int _{a_{n}^{q}}^{\infty}{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr){{t}^{-2/q}}\,dt}} \\ &{}+C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int _{a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt}} \\ \triangleq& {{J}_{221}}+{{J}_{222}}+{{J}_{223}}. \end{aligned}$$
(3.17)

For \({{J}_{221}}\), by \(1\le q< p\le2\), (2.1), and (1.6), one has

$$\begin{aligned} {{J}_{221}} =&C\sum_{n=1}^{\infty}{ \sum_{i=1}^{n}{a_{n}^{-q}} \int_{a_{n}^{q}}^{\infty}{EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le{{d}_{n}} \bigr){{t}^{-2/q}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac {EX_{ni}^{2}I ( \vert {{X}_{ni}} \vert \le{{d}_{n}} )}{a_{n}^{2}}}} \\ =&C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac {EX_{ni}^{2}I ( \vert {{X}_{ni}} \vert \le{{a}_{n}} )}{a_{n}^{2}}}}+C\sum _{n=1}^{\infty}{\sum_{i=1}^{n}{ \frac {EX_{ni}^{2}I ( {{a}_{n}}< \vert {{X}_{ni}} \vert \le{{d}_{n}} )}{a_{n}^{2}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{ \vert {{X}_{ni}} \vert }^{p}}I ( \vert {{X}_{ni}} \vert \le {{a}_{n}} )}{a_{n}^{p}}}} \\ &{}+C\sum _{n=1}^{\infty}{\sum_{i=1}^{n}{{{ \biggl( \frac{{{a}_{n}}+1}{{{a}_{n}}} \biggr)}^{2}}\frac {E{{\vert {{X}_{ni}} \vert }^{2}}I ( \vert {{X}_{ni}} \vert \le {{d}_{n}} )}{d_{n}^{2}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}}+C\sum _{n=1}^{\infty}{\sum_{i=1}^{n}{ \frac{E{{\vert {{X}_{ni}} \vert }^{p}}I ( \vert {{X}_{ni}} \vert \le{{d}_{n}} )}{d_{n}^{p}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}}+C\sum _{n=1}^{\infty}{\sum_{i=1}^{n}{ \frac{E{{\psi }_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{d}_{n}} )}}} \\ \le& 2C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac {E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned}$$
(3.18)

For \({{J}_{222}}\), note that the indicator \(I ( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} )\) vanishes for \(t\le d_{n}^{q}\), so that

$$ \sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int _{a_{n}^{q}}^{d_{n}^{q}}{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr){{t}^{-2/q}}\,dt}}=0, $$

which implies

$$ {{J}_{222}}=C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}} \int_{d_{n}^{q}}^{\infty}{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr){{t}^{-2/q}}\,dt}}. $$

Substituting \(t={{x}^{q}}\) and using (2.1), (1.6), and \(1\le q<2\), one has

$$\begin{aligned} {{J}_{222}} =&C\sum_{n=1}^{\infty}{ \sum_{i=1}^{n}{a_{n}^{-q}q} \int_{{{d}_{n}}}^{\infty}{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le x \bigr){{x}^{q-3}}\,dx}} \\ =&C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}\sum _{m={{d}_{n}}}^{\infty}{ \int_{m}^{m+1}{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le x \bigr){{x}^{q-3}}\,dx}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}\sum _{m={{d}_{n}}}^{\infty }{EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le m+1 \bigr){{m}^{q-3}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}\sum _{m={{d}_{n}}}^{\infty }{{{m}^{q-3}}\sum _{j={{d}_{n}}}^{m}{EX_{ni}^{2}I \bigl( j< \vert {{X}_{ni}} \vert \le j+1 \bigr)}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}\sum _{j={{d}_{n}}}^{\infty }{EX_{ni}^{2}I \bigl( j< \vert {{X}_{ni}} \vert \le j+1 \bigr)\sum _{m=j}^{\infty}{{{m}^{q-3}}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}\sum _{j={{d}_{n}}}^{\infty }{{{j}^{q-2}}EX_{ni}^{2}I \bigl( j< \vert {{X}_{ni}} \vert \le j+1 \bigr)}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{a_{n}^{-q}}E{{\vert {{X}_{ni}} \vert }^{q}}I \bigl( \vert {{X}_{ni}} \vert >{{d}_{n}} \bigr)} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{ \vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned}$$
(3.19)

For \({{J}_{223}}\), by an argument similar to that in the proof of \({{J}_{21}}<\infty\), one can prove \({{J}_{223}}<\infty\). Therefore, one can obtain \({{J}_{22}}<\infty\) for \(1\le q< p\le2\).

(2) If \(1\le q< p\) and \(p>2\), by (3.16), the Markov inequality, Lemma 3.2, and the \({{c}_{r}}\) inequality, one has

$$\begin{aligned} \begin{aligned}[b] {{J}_{22}}&\le \sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert >\frac{{{t}^{1/q}}}{2} \Biggr)\,dt}} \\ &\le C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/q}}E{{ \Biggl( \max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{ ( {{Y}_{ni}}-E{{Y}_{ni}} )} \Biggr\vert \Biggr)}^{p}}\,dt}} \\ &\le C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/q}} \Biggl( \sum _{i=1}^{n}{E{{\vert {{Y}_{ni}} \vert }^{p}}+} {{ \Biggl( \sum_{i=1}^{n}{EY_{ni}^{2}} \Biggr)}^{p/2}} \Biggr)\,dt}} \\ &\le C\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{a_{n}^{q}}^{\infty}{E{{ \vert {{Y}_{ni}} \vert }^{p}} {{t}^{-p/q}}\,dt}}}+C\sum _{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EY_{ni}^{2}} \Biggr)}^{p/2}}\,dt}} \\ &\triangleq {{K}_{1}}+{{K}_{2}}. \end{aligned} \end{aligned}$$
(3.20)

For \({{K}_{1}}\), one has

$$\begin{aligned} {{K}_{1}} \le& C\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int_{a_{n}^{q}}^{\infty}{E{{ \vert {{X}_{ni}} \vert }^{p}} {{t}^{-p/q}}I \bigl( \vert {{X}_{ni}} \vert \le{{d}_{n}} \bigr)\,dt}}} \\ &{}+C\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{a_{n}^{q}}^{\infty}{E{{ \vert {{X}_{ni}} \vert }^{p}} {{t}^{-p/q}}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le {{t}^{1/q}} \bigr)\,dt}}} \\ &{}+C\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt}}} \\ \triangleq& {{K}_{11}}+{{K}_{12}}+{{K}_{13}}. \end{aligned}$$
(3.21)

By arguments similar to those in the proofs of \({{J}_{221}}<\infty\) and \({{J}_{222}}<\infty\) (replacing the exponent 2 by p), one has \({{K}_{11}}<\infty\) and \({{K}_{12}}<\infty\). Similarly, from the proof of \({{J}_{21}}<\infty\), one can obtain \({{K}_{13}}<\infty\).

For \({{K}_{2}}\), since \(p>2\), one has

$$\begin{aligned} {{K}_{2}} =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EY_{ni}^{2}} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le{{a}_{n}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ &{}+C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( {{a}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ &{}+C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{ \Biggl( \sum _{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \triangleq& {{K}_{21}}+{{K}_{22}}+{{K}_{23}}. \end{aligned}$$
(3.22)

For \({{K}_{21}}\), by \(p>q\), \(p>2\), and (1.9), one has

$$\begin{aligned} {{K}_{21}} =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le{{a}_{n}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{\frac {EX_{ni}^{2}I ( \vert {{X}_{ni}} \vert \le{{a}_{n}} )}{a_{n}^{2}}} \Biggr)}^{p/2}}}< \infty. \end{aligned}$$
(3.23)

For \({{K}_{22}}\), we will consider the following two cases:

(1) If \(1\le q\le2\) and \(p>2\), then by (2.1) and (1.6), one has

$$\begin{aligned} {{K}_{22}} =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{t}^{-p/q}} {{ \Biggl( \sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( {{a}_{n}}< \vert {{X}_{ni}} \vert \le {{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{ \Biggl( {{t}^{-2/q}}\sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( {{a}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q} \int_{a_{n}^{q}}^{\infty }{{{ \Biggl( {{t}^{-1}}\sum _{i=1}^{n}{E{{\vert {{X}_{ni}} \vert }^{q}}I \bigl( {{a}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q}{{ \Biggl( \sum_{i=1}^{n}{E{{\vert {{X}_{ni}} \vert }^{q}}I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)} \Biggr)}^{p/2}} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{\frac {E{{\vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}} \Biggr)}^{p/2}}} \\ \le& C{{ \Biggl( \sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac {E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}} \Biggr)}^{p/2}}< \infty. \end{aligned}$$
(3.24)

(2) If \(2< q< p\), then by (2.1) and (1.6) again, one has

$$\begin{aligned} {{K}_{22}} =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{ \Biggl( {{t}^{-2/q}}\sum _{i=1}^{n}{EX_{ni}^{2}I \bigl( {{a}_{n}}< \vert {{X}_{ni}} \vert \le {{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q}{{ \Biggl( \sum_{i=1}^{n}{E{{\vert {{X}_{ni}} \vert }^{2}}I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)} \Biggr)}^{p/2}} \int_{a_{n}^{q}}^{\infty }{{{t}^{-p/q}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{\frac {E{{\vert {{X}_{ni}} \vert }^{2}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{2}}} \Biggr)}^{p/2}}} \\ \le& C\sum_{n=1}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{\frac {E{{\vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}} \Biggr)}^{p/2}}} \\ \le& C{{ \Biggl( \sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac {E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}} \Biggr)}^{p/2}}< \infty. \end{aligned}$$
(3.25)

For \({{K}_{23}}\), by (2.1), it follows that \({{\psi}_{i}} ( \vert t \vert )\uparrow\) as \(\vert t \vert \uparrow\). By (1.6), one has

$$ \sup_{t\ge a_{n}^{q}} \sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)}\le\sum _{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)}\le\sum _{i=1}^{n}{\frac{E{{\psi}_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}\to0 \quad \text{as } n \to\infty. $$

Hence, when n is sufficiently large, for \(t\ge a_{n}^{q}\), one has

$$ \sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)}< 1. $$
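
Consequently, since \({{x}^{p/2}}\le x\) for \(0\le x\le1\) and \(p/2>1\), for such n and all \(t\ge a_{n}^{q}\),

$$ {{ \Biggl( \sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\le\sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)}, $$

which is precisely the bound used in the first step of (3.26) below.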

By (3.13), it follows that

$$\begin{aligned} {{K}_{23}} =&C\sum_{n=1}^{\infty}{a_{n}^{-q} \int _{a_{n}^{q}}^{\infty}{{{ \Biggl( \sum _{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \Biggr)}^{p/2}}\,dt}} \\ \le& C\sum_{n=1}^{\infty}{a_{n}^{-q} \sum_{i=1}^{n}{ \int _{a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt}}} \\ \le& C\sum_{n=1}^{\infty}{\sum _{i=1}^{n}{\frac{E{{\psi }_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}}< \infty. \end{aligned}$$
(3.26)

The proof of Theorem 2.2 is completed. □

Proof of Theorem 2.3

Following the notation in the proof of Theorem 2.2, we first prove (2.5) for the case \(1< p\le2\). By (3.11), for all \(\varepsilon>0\), one has

$$\begin{aligned} E{{ \Biggl( \frac{1}{{{a}_{n}}}\max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert \Biggr)}^{q}} =&\frac {1}{a_{n}^{q}} \int_{0}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt} \\ =&\frac{1}{a_{n}^{q}} \int_{0}^{\varepsilon a_{n}^{q}}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt} \\ &{}+\frac{1}{a_{n}^{q}} \int_{\varepsilon a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{X}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt} \\ \le& \varepsilon+\frac{1}{a_{n}^{q}} \int_{\varepsilon a_{n}^{q}}^{\infty}{P \Biggl( \max_{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{{{Y}_{ni}}} \Biggr\vert >{{t}^{1/q}} \Biggr)\,dt} \\ &{}+\frac{1}{a_{n}^{q}} \int_{\varepsilon a_{n}^{q}}^{\infty}{\sum_{i=1}^{n}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)}\,dt} \\ \triangleq& \varepsilon+{{L}_{1}}+{{L}_{2}}. \end{aligned}$$
(3.27)

Without loss of generality, one may assume \(0<\varepsilon<1\). For \({{L}_{2}}\), by the Markov inequality, (2.1), and (2.4), one has

$$\begin{aligned} {{L}_{2}} \le& \sum_{i=1}^{n}{ \frac{1}{a_{n}^{q}} \int _{\varepsilon a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert I \bigl( {{\varepsilon}^{1/q}}{{a}_{n}}< \vert {{X}_{ni}} \vert \le{{a}_{n}} \bigr)>{{t}^{1/q}} \bigr)\,dt}} \\ &{}+\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}} \int_{\varepsilon a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)>{{t}^{1/q}} \bigr)\,dt}} \\ \le& \sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}E{{ \vert {{X}_{ni}} \vert }^{p}}I \bigl( {{\varepsilon}^{1/q}}{{a}_{n}}< \vert {{X}_{ni}} \vert \le {{a}_{n}} \bigr) \int_{\varepsilon a_{n}^{q}}^{\infty }{{{t}^{-p/q}}\,dt}} \\ &{}+\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}} \int_{0}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)>{{t}^{1/q}} \bigr)\,dt}} \\ \le& C{{\varepsilon}^{1-\frac{p}{q}}}\sum_{i=1}^{n}{ \frac {E{{\vert {{X}_{ni}} \vert }^{p}}I ( \vert {{X}_{ni}} \vert \le {{a}_{n}} )}{a_{n}^{p}}}+\sum_{i=1}^{n}{ \frac {1}{a_{n}^{q}}E{{\vert {{X}_{ni}} \vert }^{q}}I \bigl( \vert {{X}_{ni}} \vert >{{a}_{n}} \bigr)} \\ \le& C\sum_{i=1}^{n}{\frac{E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}} \to0 \quad \text{as }n\to\infty. \end{aligned}$$
(3.28)

Similar to the proof of (3.15), by conditions (2.4), (2.1), and (1.5), one has

$$\begin{aligned} \sup_{t\ge\varepsilon a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Y}_{ni}}} \Biggr\vert =&\sup_{t\ge\varepsilon a_{n}^{q}} \frac{1}{{{t}^{1/q}}}\max _{1\le j\le n} \Biggl\vert \sum_{i=1}^{j}{E{{Z}_{ni}}} \Biggr\vert \\ \le& C\sup_{t\ge\varepsilon a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\sum _{i=1}^{n}{E\vert {{Z}_{ni}} \vert } \\ \le& C\sup_{t\ge\varepsilon a_{n}^{q}} \frac {1}{{{t}^{1/q}}}\sum _{i=1}^{n}{E\vert {{X}_{ni}} \vert I \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)} \\ \le& C{{\varepsilon}^{-1/q}}\sum_{i=1}^{n}{ \frac{E{{\vert {{X}_{ni}} \vert }^{q}}I ( \vert {{X}_{ni}} \vert >{{a}_{n}} )}{a_{n}^{q}}} \\ &{}+C{{\varepsilon}^{-p/q}}\sum_{i=1}^{n}{ \frac{E{{\vert {{X}_{ni}} \vert }^{p}}I ( {{\varepsilon}^{1/q}}{{a}_{n}}< \vert {{X}_{ni}} \vert \le{{a}_{n}} )}{a_{n}^{p}}} \\ \le& C \bigl( {{\varepsilon}^{-1/q}}+{{\varepsilon}^{-p/q}} \bigr)\sum_{i=1}^{n}{\frac{E{{\psi}_{i}} ( {{X}_{ni}} )}{{{\psi}_{i}} ( {{a}_{n}} )}}\to0 \quad \text{as }n\to \infty. \end{aligned}$$
(3.29)

Hence, when n is sufficiently large, (3.16) holds uniformly for \(t\ge\varepsilon a_{n}^{q}\).

For \({{L}_{1}}\), let \({{d}_{n}}= [ {{a}_{n}} ]+1\). By (3.16), the Markov inequality, Lemma 3.2, and the \({{c}_{r}}\) inequality, one has

$$\begin{aligned} {{L}_{1}} \le& C\sum_{i=1}^{n}{ \frac{1}{a_{n}^{q}}} \int _{\varepsilon a_{n}^{q}}^{\infty}{{{t}^{-2/q}}E{{ ( {{Y}_{ni}}-E{{Y}_{ni}} )}^{2}}\,dt} \\ \le& C\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}} \int_{\varepsilon a_{n}^{q}}^{\infty}{{{t}^{-2/q}}EY_{ni}^{2} \,dt} \\ \le& C\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}} \int_{\varepsilon a_{n}^{q}}^{\infty}{{{t}^{-2/q}}EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le{{d}_{n}} \bigr)\,dt} \\ &{}+C\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}} \int_{\varepsilon a_{n}^{q}}^{\infty}{{{t}^{-2/q}}EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)\,dt} \\ &{}+C\sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}} \int_{\varepsilon a_{n}^{q}}^{\infty}{P \bigl( \vert {{X}_{ni}} \vert >{{t}^{1/q}} \bigr)\,dt} \\ \triangleq& {{L}_{11}}+{{L}_{12}}+{{L}_{13}}. \end{aligned}$$
(3.30)

By (3.28), one has \({{L}_{13}}\to0\). For \({{L}_{11}}\), by an argument similar to that in the proof of \({{J}_{221}}<\infty\) and (2.4), one can obtain

$$ {{L}_{11}}=C\sum_{i=1}^{n}{ \frac{1}{a_{n}^{q}}} \int _{\varepsilon a_{n}^{q}}^{\infty}{{{t}^{-2/q}}EX_{ni}^{2}I \bigl( \vert {{X}_{ni}} \vert \le{{d}_{n}} \bigr)\,dt} \le C\sum_{i=1}^{n}{\frac{E{{\psi}_{i}} ( \vert {{X}_{ni}} \vert )}{{{\psi}_{i}} ( {{a}_{n}} )}}\to0 \quad \text{as } n\to \infty. $$

For \({{L}_{12}}\), note that

$$ \sum_{i=1}^{n}{\frac{1}{a_{n}^{q}}} \int_{\varepsilon a_{n}^{q}}^{d_{n}^{q}}{{{t}^{-2/q}}EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)\,dt}=0, $$

which implies

$$ {{L}_{12}}=C\sum_{i=1}^{n}{ \frac{1}{a_{n}^{q}}} \int _{d_{n}^{q}}^{\infty}{{{t}^{-2/q}}EX_{ni}^{2}I \bigl( {{d}_{n}}< \vert {{X}_{ni}} \vert \le{{t}^{1/q}} \bigr)\,dt}. $$

By an argument similar to the proof of \({{J}_{222}}<\infty\), together with (2.4), one also has \({{L}_{12}}\to0\) as \(n\to\infty\).

The proof of (2.5) for the case of \(p>2\) is similar to that of \(1\le q< p\) and \(p>2\) in Theorem 2.2, so we omit the details. The proof of Theorem 2.3 is completed. □