1 Introduction and main results

Let \(\{\Omega,\mathfrak{F},P\}\) be a complete probability space. All random variables we deal with are defined on \(\{\Omega,\mathfrak{F},P\}\). The concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins [1] as follows. A sequence \(\{U_{n}, n\ge1\}\) of random variables converges completely to the constant θ if

$$\sum_{n=1}^{\infty}P\bigl(|U_{n}-\theta|> \epsilon\bigr)< \infty \quad\mbox{for all } \epsilon> 0. $$

Moreover, they proved that the sequence of arithmetic means of independent identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. This result has been generalized and extended in several directions; see, for example, [2–10].
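The Hsu–Robbins theorem can be illustrated numerically. The sketch below (an illustration under assumed parameters, not part of the original argument) computes the series terms \(P(|S_{n}/n-\theta|>\epsilon)\) exactly for Bernoulli(1/2) summands with \(\theta=1/2\) and \(\epsilon=0.2\); the partial sums stabilize quickly, consistent with convergence of the full series.

```python
from math import comb

def tail_prob(n, eps=0.2, p=0.5):
    """Exact P(|S_n/n - p| > eps) for S_n ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1)
               if abs(k / n - p) > eps)

# Partial sums of the Hsu-Robbins series for increasing cutoffs N;
# they increase but the increments become negligible.
partials = [sum(tail_prob(n) for n in range(1, N + 1)) for N in (50, 100, 200)]
```

Using the exact binomial tail rather than simulation keeps the check deterministic.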

Chow [11] first investigated complete moment convergence, which is more precise than complete convergence. He obtained the following result: Let \(p>1/\alpha\) and \(1/2<\alpha\le1\). Let \(\{X, X_{n},n\ge1\}\) be a sequence of i.i.d. random variables. Assume that \(EX=0\) and \(E\{|X|^{p}+|X|\log(1+|X|)\}<\infty\). Then

$$\sum_{n=1}^{\infty}n^{(p-1)\alpha-2}E \Biggl\{ \Biggl\vert \sum_{k=1}^{n} X_{k}\Biggr\vert -\epsilon n^{\alpha}\Biggr\} _{+}< \infty \quad\mbox{for all } \epsilon>0, $$

where \(x_{+}=\max\{0, x\}\). Chow’s result has been generalized and extended in several directions; see, for example, [12–18].

Definition 1.1

A sequence of random variables \(\{ X_{n}, n\ge1\} \) is said to be extended negatively dependent (END) if there exists a constant \(M\ge1\) such that for each \(n \ge2\),

$$P(X_{1}\le x_{1}, \ldots, X_{n}\le x_{n})\le M\prod_{i=1}^{n} P(X_{i}\le x_{i}) $$

and

$$P(X_{1}> x_{1}, \ldots, X_{n}> x_{n})\le M\prod_{i=1}^{n} P(X_{i}> x_{i}) $$

hold for every sequence \(\{x_{1}, \ldots, x_{n}\}\) of real numbers.

The concept was introduced by Liu [19]. When \(M=1\), the notion of END random variables reduces to that of negatively dependent (ND) random variables, which was introduced by Alam and Saxena [20], Block et al. [21], and Joag-Dev and Proschan [22]. As Liu [19] observed, the END structure is substantially more comprehensive than the ND structure in that it can reflect not only a negative dependence structure but also, to some extent, a positive one. Liu [19] pointed out that END random variables can be negatively or positively dependent and provided some interesting examples to support this idea. Joag-Dev and Proschan [22] also pointed out that negatively associated (NA) random variables must be ND while ND random variables are not necessarily NA; thus NA random variables are END. A great number of articles on ND random variables have appeared in the literature. For further research on END random variables, see [5, 7–9, 23–28].
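As a small illustration of how \(M>1\) broadens the class (a hypothetical example, not taken from the paper or from [19]), the sketch below computes the smallest M satisfying both defining inequalities for a mildly positively dependent pair on \(\{0,1\}^{2}\) with uniform marginals: the pair is END with \(M=1.2\), although ND (the case \(M=1\)) fails.

```python
from itertools import product

# Hypothetical joint pmf of (X1, X2) on {0,1}^2 with uniform marginals;
# P(1,1) = 0.3 > 0.25, so the pair is mildly positively dependent.
pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.3}

def smallest_M(pmf, grid):
    """Smallest M >= 1 satisfying both END inequalities over the grid."""
    M = 1.0
    for x1, x2 in product(grid, repeat=2):
        le1 = sum(p for (a, b), p in pmf.items() if a <= x1)
        le2 = sum(p for (a, b), p in pmf.items() if b <= x2)
        joint_le = sum(p for (a, b), p in pmf.items() if a <= x1 and b <= x2)
        joint_gt = sum(p for (a, b), p in pmf.items() if a > x1 and b > x2)
        if le1 * le2 > 0:
            M = max(M, joint_le / (le1 * le2))
        if (1 - le1) * (1 - le2) > 0:
            M = max(M, joint_gt / ((1 - le1) * (1 - le2)))
    return M

M = smallest_M(pmf, grid=[-1, 0, 1])  # grid covers all relevant thresholds
```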

The aim of this paper is to extend and improve Chow’s result from the i.i.d. case to extended negatively dependent (END) random variables. Some new sufficient conditions for complete moment convergence of product sums of sequences of END random variables are obtained.

Definition 1.2

A sequence of random variables \(\{ X_{n}, n\ge1\} \) is said to be stochastically dominated by a random variable X in the Cesàro sense if there exists a constant \(D>0\) such that for all \(x>0\) and \(n\ge1\),

$$\sum_{i=1}^{n} P\bigl(|X_{i}|>x\bigr)\le Dn P\bigl(|X|>x\bigr). $$

In this case we write \(\{X_{n},n\ge1\}\prec X\).
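For a concrete (assumed) instance of Definition 1.2: if \(X_{i}\) is uniform on \([0, 1+1/i]\) and X is uniform on \([0,2]\), then each tail \(P(X_{i}>x)\) is bounded by \(P(X>x)\), so domination holds with \(D=1\). A minimal sketch checking the defining inequality on a grid:

```python
def tail_Xi(i, x):
    # P(X_i > x) for X_i ~ Uniform[0, 1 + 1/i]
    return max(0.0, 1 - x / (1 + 1 / i))

def tail_X(x):
    # P(X > x) for X ~ Uniform[0, 2]
    return max(0.0, 1 - x / 2)

D = 1.0
dominated = all(
    sum(tail_Xi(i, x) for i in range(1, n + 1)) <= D * n * tail_X(x) + 1e-12
    for n in (1, 5, 50)
    for x in (0.1, 0.5, 1.0, 1.5, 1.9)
)
```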

Now we state the main results; some lemmas will be given in Section 2, and the proofs of the main results will be given in Section 3.

Theorem 1.1

Let \(\alpha>1/2\), \(p>1/\alpha\), and let m be a positive integer. Let \(\{X_{n},n\ge1\}\) be a sequence of END random variables with \(\{X_{n},n\ge1\}\prec X\). If \(\alpha\le1\), assume additionally that \(EX_{n}=0\) for all \(n\ge1\). Assume that

$$ E|X|^{p}< \infty. $$
(1.1)

Then the following statements hold:

$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\epsilon n^{m\alpha} \Biggr)< \infty \quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.2)
$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\sup_{k\ge n} k^{-m\alpha }\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\epsilon \Biggr)< \infty \quad\textit{for all } \epsilon>0. \end{aligned}$$
(1.3)

Theorem 1.2

Let \(q>0\), \(\alpha>1/2\), \(p>1/\alpha\), and let m, v be positive integers such that \(1\le v\le m\). Let \(\{X_{n},n\ge1\}\) be a sequence of END random variables with \(\{X_{n},n\ge1\}\prec X\). If \(\alpha\le1\), assume additionally that \(EX_{n}=0\) for all \(n\ge1\). Assume that

$$ \textstyle\begin{cases} E|X|^{p}< \infty,& mq< p,\\ E|X|^{p}\log(1+|X|)< \infty,& mq=p,\\ E|X|^{mq}< \infty,& mq>p. \end{cases} $$
(1.4)

Then the following statements hold:

$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 } E \Biggl\{ \max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k} \prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon n^{m\alpha} \Biggr\} _{+}^{q}< \infty\quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.5)
$$\begin{aligned}& \sum_{n=m}^{\infty}n^{\alpha p-2 } E \Biggl\{ \sup_{ k \ge n} k^{-m\alpha }\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon \Biggr\} _{+}^{q}< \infty \quad\textit{for all } \epsilon>0, \end{aligned}$$
(1.6)

where \(x_{+}^{q}=(x_{+})^{q}\).

Throughout this paper, C represents a positive constant not depending on n, which may change from one place to another, and the symbol ♯A denotes the number of elements of the set A.

2 Lemmas

In order to prove our main result, we need the following lemmas.

Lemma 2.1

(Liu [19])

Let \(X_{1},X_{2},\ldots,X_{n}\) be END random variables. Assume that \(f_{1},f_{2},\ldots,f_{n}\) are Borel functions all of which are monotone increasing (or monotone decreasing). Then \(f_{1}(X_{1}),f_{2}(X_{2}),\ldots,f_{n}(X_{n})\) are END random variables.

Lemma 2.2

(Shen [26])

For any \(s\ge2\), there is a positive constant \(C_{s}\) depending only on s such that if \(\{X_{n}, n\ge1\}\) is a sequence of END random variables with \(EX_{n}=0\) for every \(n\ge1\), then for all \(n\ge1\),

$$E\Biggl\vert \sum_{j=1}^{n}X_{j} \Biggr\vert ^{s} \le C_{s} \Biggl\{ \sum _{j=1}^{n} E|X_{j}|^{s}+ \Biggl(\sum_{j=1}^{n }E|X_{j}|^{2} \Biggr)^{s/2} \Biggr\} . $$

By Lemma 2.2 and the same argument as Theorem 2.3.1 in Stout [29], the following lemma holds.

Lemma 2.3

For any \(s\ge2\), there is a positive constant \(C_{s}\) depending only on s such that if \(\{X_{n}, n\ge1\}\) is a sequence of END random variables with \(EX_{n}=0\) for every \(n\ge1\), then for all \(n\ge1\),

$$E\max_{1\le k\le n}\Biggl\vert \sum_{j=1}^{k}X_{j} \Biggr\vert ^{s} \le C_{s} \bigl(\log (4n) \bigr)^{s} \Biggl\{ \sum_{j=1}^{n} E|X_{j}|^{s}+ \Biggl(\sum_{j=1}^{n}E|X_{j}|^{2} \Biggr)^{s/2} \Biggr\} . $$

Lemma 2.4

(Kuczmaszewska [4])

Let s, x be positive constants. Let \(\{X_{n},n\ge1\}\) be a sequence of random variables with \(\{X_{n},n\ge1\}\prec X\).

  1. (i)

    If \(E|X|^{s}<\infty\), then \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} \le CE|X|^{s} \);

  2. (ii)

    \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} I(|X_{j}|\le x)\le C \{E|X|^{s} I(|X|\le x)+x^{s} P(|X|>x) \}\);

  3. (iii)

    \(\frac{1}{n}\sum_{j=1}^{n} E|X_{j}|^{s} I(|X_{j}|>x) \le C E|X|^{s} I(|X|>x) \).

Lemma 2.5

(Wang et al. [30])

Let m, n be positive integers such that \(1\le m\le n\) and \(\{x_{n}, n\ge1\}\) be a sequence of real numbers. Then

$$\sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le n } \prod_{k=1}^{m} x_{i_{k}}=\sum_{\sum _{k=1}^{m}r_{k}s_{k}=m} A(m,r_{k},s_{k}:k=1, \ldots,m)\prod_{k=1}^{m}\Biggl(\sum _{j=1}^{n} x_{j}^{r_{k}} \Biggr)^{s_{k}}, $$

where \(A(m,r_{k},s_{k}:k=1,\ldots,m)\) are constants and \(r_{k}\), \(s_{k}\) are positive integers depending only on m.
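For small m, Lemma 2.5 reduces to the classical expansion of elementary symmetric polynomials in power sums \(s_{r}=\sum_{j} x_{j}^{r}\): \(e_{2}=(s_{1}^{2}-s_{2})/2\) and \(e_{3}=(s_{1}^{3}-3s_{1}s_{2}+2s_{3})/6\), the \(A(m,r_{k},s_{k})\) being the rational constants in front of each power-sum product. A quick numerical sanity check (illustrative, with arbitrary sample values):

```python
from itertools import combinations

x = [1.5, -2.0, 0.5, 3.0, -1.0]  # arbitrary sample values
s1, s2, s3 = (sum(v**r for v in x) for r in (1, 2, 3))  # power sums

# Product sums over increasing indices (left side of Lemma 2.5, m = 2, 3)
e2 = sum(a * b for a, b in combinations(x, 2))
e3 = sum(a * b * c for a, b, c in combinations(x, 3))
```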

Lemma 2.6

Let α, β be positive constants such that \(\alpha+\beta=1\). Let X, Y be random variables. Then for all \(\varepsilon>0\),

$$\bigl(|X+Y|>\varepsilon\bigr)\subseteq\bigl(|X|> \alpha\varepsilon\bigr) \cup\bigl(|Y|>\beta \varepsilon\bigr),\qquad \bigl(|XY|>\varepsilon\bigr)\subseteq\bigl(|X|>\varepsilon^{\alpha}\bigr)\cup \bigl(|Y|>\varepsilon^{\beta}\bigr). $$

Proof

Suppose \(\omega\notin (|X|> \alpha\varepsilon )\cup(|Y|>\beta\varepsilon)\); then \(\omega\notin (|X|> \alpha \varepsilon)\) and \(\omega\notin (|Y|>\beta\varepsilon)\). Since \(|X(\omega)+Y(\omega)|\le|X(\omega)|+|Y(\omega)|\le \alpha\varepsilon +\beta\varepsilon= \varepsilon\), we have \(\omega\notin (|X+Y|>\varepsilon)\). Thus \((|X+Y|>\varepsilon)\subseteq(|X|> \alpha \varepsilon)\cup(|Y|>\beta\varepsilon)\) holds. Similarly, \((|XY|>\varepsilon)\subseteq(|X|>\varepsilon^{\alpha})\cup (|Y|>\varepsilon^{\beta})\) is true. □

Lemma 2.7

Let \(\alpha>0\), \(p>0\), \(q>0\), \(0<\gamma<1\), m be a positive integer. Let \(\{X_{n},n\ge1\}\) be a sequence of random variables. If \(s>\max\{p,mq\}/(1-\gamma)\), then the following statements hold:

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha }}^{\infty}t^{-(1-\gamma)s/(mq)} \Biggl( \sum_{j=1}^{n} P \bigl(|X_{j}|>t^{\gamma /(mq)}\bigr) \Biggr)\,dt< \infty, \\ & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl( \sum_{j=1}^{n}E|X_{j}|^{s}I \bigl(|X_{j}|\le t^{\gamma/(mq)}\bigr) \Biggr)\,dt < \infty. \end{aligned}$$

Proof

Since \(s>\max\{p,mq\}/(1-\gamma)\), we obtain

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)} \Biggl( \sum_{j=1}^{n} P \bigl(|X_{j}|>t^{\gamma/(mq)}\bigr) \Biggr)\,dt \\ & \quad\le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\bigl(\log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)}\,dt \\ & \quad= \sum_{n=m}^{\infty}n^{\alpha p-(1-\gamma)s\alpha-1 } \bigl(\log(4n)\bigr)^{s} \\ &\quad < \infty \end{aligned}$$

and

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\bigl( \log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-s/(mq)} \Biggl( \sum_{j=1}^{n}E|X_{j}|^{s}I \bigl(|X_{j}|\le t^{\gamma/(mq)}\bigr) \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\bigl(\log(4n)\bigr)^{s}\int_{n^{mq \alpha}}^{\infty}t^{-(1-\gamma)s/(mq)}\,dt < \infty. \end{aligned}$$

 □

Lemma 2.8

Let \(\alpha>0\), \(p>0\), \(q>0\), \(0<\gamma<1\), m, v be positive integers such that \(1\le v\le m\). Let \(\{X_{n},n\ge1\}\) be a sequence of random variables with \(\{X_{n},n\ge1\} \prec X\). Moreover, additionally assume that for \(\alpha\le1\), \(EX_{n}=0\) for all \(n\ge1\). If (1.4) holds, then the following statements hold:

$$\begin{aligned} & \sum_{n=m}^{\infty}n^{\alpha( p-m q)-2 }\int _{n^{mq \alpha}}^{\infty}\Biggl( \sum _{j=1}^{n} P\bigl(|X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt < \infty, \\ & \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n}E|X_{j}|^{vs}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr) \Biggr)\,dt< \infty\\ &\quad\textit{for all } s>\max\{p,mq\}/v. \end{aligned}$$

Proof

By the mean-value theorem, a standard computation and (1.4), we have

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-m q)-2 }\int _{n^{mq \alpha }}^{\infty}\Biggl( \sum _{j=1}^{n} P\bigl(|X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt \\ &\quad \le C\sum_{n=m}^{\infty}n^{\alpha( p-m q)- 1}\int_{n^{mq \alpha }}^{\infty}P \bigl(|X|>t^{1/(mq)}\bigr)\,dt \\ &\quad = C\sum_{n=m}^{\infty} n^{\alpha( p-m q)-1 } \sum_{k=n}^{\infty}\int_{k^{mq \alpha}}^{(k+1)^{mq \alpha}} P\bigl(|X|> t^{1/(mq)}\bigr)\,dt \\ &\quad \le C\sum_{n=1}^{\infty} n^{\alpha( p-m q)-1 } \sum_{k=n}^{\infty}k^{mq \alpha-1} P\bigl(|X|> k^{\alpha}\bigr) \\ &\quad = C\sum_{k=1}^{\infty}k^{mq \alpha-1} P\bigl(|X|> k^{\alpha}\bigr) \sum _{n=1}^{k} n^{\alpha( p-m q)-1 } \\ &\quad \le \textstyle\begin{cases} C\sum_{k=1}^{\infty}k^{\alpha p-1} P(|X|> k^{\alpha}) , & \mbox{if } mq< p ,\\ C\sum_{k=1}^{\infty}k^{\alpha p-1}\log(1+k) P(|X|> k^{\alpha}) ,& \mbox{if } mq=p ,\\ C\sum_{k=1}^{\infty}k^{mq \alpha-1} P(|X|> k^{\alpha}) ,&\mbox{if } mq>p \end{cases}\displaystyle \\ &\quad \le \textstyle\begin{cases} C E|X|^{p},& \mbox{if } mq< p ,\\ CE|X|^{p}\log(1+|X|),& \mbox{if } mq=p,\\ CE|X|^{mq},& \mbox{if } mq>p \end{cases}\displaystyle \\ & \quad< \infty. \end{aligned}$$

Since \(s>\max\{p,mq\}/v\), by Lemma 2.4(ii), the mean-value theorem, a standard computation, (1.4), and the argument above, we also have

$$\begin{aligned} &\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n}E|X_{j}|^{vs}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr) \Biggr)\,dt \\ &\quad \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-1 }\int_{n^{mq \alpha }}^{\infty}\bigl( t^{-vs/(mq)} E|X|^{vs}I\bigl(|X|\le t^{1/(mq)}\bigr)+P \bigl(|X|>t^{1/(mq)}\bigr) \bigr)\,dt \\ &\quad \le C\sum_{n=1}^{\infty} n^{\alpha( p-mq)-1} \sum_{k=n}^{\infty}\int _{k^{mq \alpha}}^{(k+1)^{mq \alpha}} x^{-v s/(mq)} E|X|^{vs}I \bigl(|X|\le x^{1/(mq)}\bigr)\,dx+C \\ & \quad\le C\sum_{n=1}^{\infty} n^{\alpha( p-mq)-1} \sum_{k=n}^{\infty}k^{mq \alpha-vs\alpha-1} E|X|^{v s}I\bigl(|X|\le(k+1)^{\alpha}\bigr)+C \\ &\quad \le C\sum_{k=1}^{\infty}k^{mq \alpha-vs\alpha-1} E|X|^{v s}I\bigl(|X|\le (k+1)^{\alpha}\bigr) \sum _{n=1}^{k} n^{\alpha( p-mq)-1}+C \\ & \quad\le \textstyle\begin{cases} C\sum_{k=1}^{\infty} k^{\alpha p-v s\alpha-1} E|X|^{v s} I(|X|\le k^{\alpha})+C ,& \mbox{if } mq< p ,\\ C\sum_{k=1}^{\infty} k^{\alpha p-v s \alpha-1}\log(1+k) E|X|^{v s}I(|X|\le k^{\alpha})+C,&\mbox{if } mq=p ,\\ C\sum_{k=1}^{\infty} k^{mq \alpha-vs\alpha-1} E|X|^{v s}I(|X|\le k^{\alpha})+C,& \mbox{if } mq>p \end{cases}\displaystyle \\ &\quad \le \textstyle\begin{cases} C E|X|^{p}+C,& \mbox{if } mq< p ,\\ CE|X|^{p}\log(1+|X|)+C,& \mbox{if } mq=p,\\ CE|X|^{mq}+C,& \mbox{if } mq>p \end{cases}\displaystyle \\ &\quad < \infty. \end{aligned}$$

 □

3 Proofs

Proof of Theorem 1.1

First, we prove (1.2). By Lemma 2.5, Lemma 2.6, and the Jensen inequality (see Ragusa and Tachikawa [31]), in order to prove (1.2), it suffices to show that

$$ \sum_{n=m}^{\infty}n^{\alpha p-2} P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{j=1}^{k} X_{j}\Biggr\vert >\epsilon n^{\alpha} \Biggr)< \infty,\quad \forall \varepsilon>0, $$
(3.1)

and

$$ \sum_{n=m}^{\infty}n^{\alpha p-2}P \Biggl( \max_{m\le k\le n}\sum_{j=1}^{k} X_{j}^{2}>\epsilon n^{2\alpha} \Biggr)< \infty, \quad\forall \varepsilon>0. $$
(3.2)

Utilizing a similar method to the proof of Theorem 2.1 of Qiu et al. [6], we can prove (3.1). Now we prove (3.2). Since \(X_{n}^{2}=X_{n}^{2}I(X_{n}<0)+X_{n}^{2}I(X_{n}\ge0)\), without loss of generality we may assume that \(X_{n}\ge0\), \(n\ge1\). Note that \(\{X_{n}^{2},n\ge 1\}\prec X^{2}\), \(E(X^{2})^{p/2}= E|X|^{p}<\infty\), and \(2\alpha>1\); hence (3.2) holds by Lemma 2.1 and (3.1). Therefore, (1.2) holds.

Next, we prove (1.3). For any fixed positive integer m, there exists a positive integer \(i_{0}\) such that \(2^{i_{0}-1}\le m<2^{i_{0}}\). Thus, by (1.2) and \(\alpha p>1\) we have

$$\begin{aligned} &\sum_{n=2^{i_{0}}}^{\infty} n^{\alpha p-2} P \Biggl( \sup_{ k\ge n}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ &\quad = \sum_{i=i_{0}}^{\infty} \sum _{n=2^{i}}^{2^{i+1}-1}n^{\alpha p-2} P \Biggl(\sup _{ k\ge n}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ &\quad \le C\sum_{i=i_{0}}^{\infty} 2^{i(\alpha p-1)} \sum_{l=i}^{\infty}P \Biggl( \max_{ 2^{l}\le k< 2^{l+1}}k^{-m\alpha}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots < i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon \Biggr) \\ & \quad\le C\sum_{l=i_{0}}^{\infty} 2^{l(\alpha p-1)} P \Biggl(\max_{ 2^{l}\le k< 2^{l+1}}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon2^{ml\alpha} \Biggr) \\ &\quad \le C\sum_{l=i_{0}}^{\infty} 2^{l(\alpha p-1)} P \Biggl(\max_{m\le k< 2^{l+1}}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon2^{ml\alpha} \Biggr) \\ & \quad\le C\sum_{n=m}^{\infty} n^{\alpha p-2} P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon_{0} n^{m\alpha} \Biggr) < \infty, \end{aligned}$$

where \(\varepsilon_{0}=2^{-m\alpha}\varepsilon\). Therefore, (1.3) holds. □

Proof of Theorem 1.2

First, we prove (1.5). Denote \(h(n)=n^{\alpha( p-mq)-2 }\). Note that for all \(\varepsilon>0\),

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n) E \Biggl\{ \max _{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k} \prod_{j=1}^{m} X_{i_{j}}\Biggr\vert -\epsilon n^{m\alpha} \Biggr\} _{+}^{q} \\ &\quad= \sum_{n=m}^{\infty}h(n)\int _{0}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\quad= \sum_{n=m}^{\infty}h(n)\int _{0}^{n^{m q \alpha}} P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\qquad{} + \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert - \varepsilon n^{m\alpha}>t^{1/q} \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}n^{\alpha p-2 } P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >\varepsilon n^{m\alpha} \Biggr) \\ &\qquad{} + \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k \le n} \Biggl\vert \sum_{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod _{j=1}^{m} X_{i_{j}}\Biggr\vert >t^{1/q} \Biggr)\,dt. \end{aligned}$$

By Lemma 2.5 and Lemma 2.6, we have

$$\begin{aligned} &P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{1\le i_{1}< i_{2}< \cdots< i_{m}\le k}\prod_{j=1}^{m} X_{i_{j}}\Biggr\vert >t^{1/q} \Biggr) \\ &\quad\le\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m} P \Biggl(\max _{m\le k\le n}\Biggl\vert \prod_{j=1}^{m} \Biggl(\sum_{i=1}^{k} X_{i}^{r_{j}} \Biggr)^{s_{j}}\Biggr\vert > \bigl(b A(m,r_{j},s_{j}:j=1, \ldots,m) \bigr)^{-1}t^{1/q} \Biggr) \\ &\quad \le\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m}\sum ^{m}_{j=1} P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{i=1}^{k} X_{i}^{r_{j}}\Biggr\vert ^{s_{j}}> \bigl(bA(m,r_{j},s_{j}:j=1,\ldots,m) \bigr)^{-r_{j}s_{j}/m}t^{r_{j}s_{j}/(mq)} \Biggr) \\ &\quad =\sum_{\sum_{j=1}^{m}r_{j}s_{j}=m}\sum ^{m}_{j=1} P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{i=1}^{k} X_{i}^{r_{j}}\Biggr\vert > \bigl(bA(m,r_{j},s_{j}:j=1, \ldots ,m) \bigr)^{-r_{j}/m}t^{r_{j}/(mq)} \Biggr), \end{aligned}$$

where \(b=\sharp\{(r_{j},s_{j}:j=1,\ldots,m):\sum_{j=1}^{m}r_{j}s_{j}=m\}\). Obviously, b is a constant depending only on m. Therefore, in order to prove (1.5), by Theorem 1.1 and the above inequality, it is enough to show that for all integers \(v: 1\le v\le m\) we have

$$ \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{v} \Biggr\vert >\varepsilon t^{v/(mq)} \Biggr)\,dt< \infty\quad\mbox{for all }\varepsilon>0. $$
(3.3)

We first prove that for any fixed positive integer \(n\ge m\), all \(\varepsilon>0\), and \(1\le v\le m\),

$$\begin{aligned} M_{n}&:=\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl( |X_{j}|^{v}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr)+t^{v/(mq)}I \bigl(|X_{j}|>t^{1/(mq)}\bigr) \bigr)> \varepsilon t^{v/(mq)} \Biggr)\,dt \\ &< \infty. \end{aligned}$$
(3.4)

If \(\max\{p, mq\}< v\), take \(s=1\); then by Lemma 2.8 we have

$$\begin{aligned} M_{n} \le{}& \int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum_{j=1}^{n} |X_{j}|^{v}I \bigl(|X_{j}|\le t^{1/(mq)}\bigr)> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ &{} +\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{v/(mq)}I\bigl(|X_{j}|>t^{1/(mq)} \bigr)> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ \le{}& 2 \varepsilon^{-1} \int_{n^{mq \alpha}}^{\infty}t^{-v/(mq)} \Biggl(\sum_{j=1}^{n} E |X_{j}|^{v}I\bigl(|X_{j}|\le t^{1/(mq)} \bigr) \Biggr)\,dt \\ &{}+\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( |X_{j}|>t^{1/(mq)} \bigr) \Biggr)\,dt \\ < {}& \infty. \end{aligned}$$

If \(\max\{p, mq\}\ge v\), by (1.4), the \(C_{r}\) inequality, and Lemma 2.4(i) we have

$$\begin{aligned} M_{n}& \le \int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum_{j=1}^{n} |X_{j}|^{v}> \varepsilon t^{v/(mq)} \Biggr)\,dt \\ & \le\int_{0}^{\infty}P \Biggl( \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} \Biggr)^{mq/v}>\varepsilon^{mq/v} t \Biggr)\,dt = CE \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} \Biggr)^{mq/v} \\ & \le \textstyle\begin{cases} C\sum_{j=1}^{n} E|X_{j}|^{mq},& \mbox{if } mq\le v ,\\ Cn^{mq/v-1}\sum_{j=1}^{n} E|X_{j}|^{mq},& \mbox{if } mq>v \end{cases}\displaystyle \\ & \le \textstyle\begin{cases} Cn E|X|^{mq},& \mbox{if } mq\le v ,\\ Cn^{mq/v} E|X|^{mq},& \mbox{if } mq>v \end{cases}\displaystyle \\ & < \infty. \end{aligned}$$

Therefore, (3.4) holds. To prove (3.3), we consider two cases.

Case 1: \(2\le v\le m\). We note that

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{v} \Biggr\vert >\varepsilon t^{\frac {v}{mq}} \Biggr)\,dt \\ &\quad\le \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} |X_{j}|^{v} > \varepsilon t^{\frac{v}{mq}} \Biggr)\,dt \end{aligned}$$
(3.5)

and

$$|X_{j}|^{v}=|X_{j}|^{v}I(X_{j}< 0)+|X_{j}|^{v}I(X_{j} \ge0). $$

Therefore, without loss of generality, we assume that \(X_{j}\ge0\), \(j\ge 1\). Note that

$$X_{j}^{v}= X_{j}^{v}I \bigl(X_{j}\le t^{1/(mq)}\bigr)+t^{v/(mq)}I \bigl(X_{j}>t^{1/(mq)}\bigr)+\bigl(X_{j}^{v}-t^{v/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)}\bigr),\quad j\ge1. $$

Denote \(Y_{j}^{(v,t)}=X_{j}^{v}I(X_{j}\le t^{1/(mq)})+t^{v/(mq)}I(X_{j}>t^{1/(mq)})\); hence we have

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} X_{j}^{v} > \varepsilon t^{v/(mq)} \Biggr)\,dt \\ & \quad\le \sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl(X_{j}^{v}-t^{v/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)} \bigr) >\varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ & \qquad{} +\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} Y_{j}^{(v,t)}> \varepsilon t^{v/(mq)}/2 \Biggr)\,dt \\ &\quad = I_{1}^{(v)}+I_{2}^{(v)}. \end{aligned}$$

By Lemma 2.8 we have

$$I_{1}^{(v)} \le\sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( X_{j}>t^{1/(mq)} \bigr) \Biggr)\,dt < \infty. $$

For \(I_{2}^{(v)}\), first, we prove

$$ D_{n}:=\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} EY_{j}^{(v,t)}\to0, \quad n\to\infty. $$
(3.6)

We consider two cases. The first one is \(p< v\). Using Lemma 2.4(i), (1.4), and \(\alpha p>1\), we get

$$ \begin{aligned}[b] D_{n}&\le\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} t^{(v-p)/(mq)} E X_{j}^{p} \le Cn \sup_{t\ge n^{mq\alpha}}t^{-p/(mq)} E |X|^{p} \\ &\le Cn^{1-\alpha p}\to0,\quad n\to\infty. \end{aligned} $$
(3.7)

Let us now consider the second case: \(p\ge v\). Note that \(E|X|^{v}<\infty\) by (1.4); thus we also get, by Lemma 2.4(i) and \(\alpha>1/2\),

$$ D_{n} \le\sup_{t\ge n^{mq\alpha}}t^{-v/(mq)}\sum _{j=1}^{n} E X_{j}^{v} \le Cn \sup_{t\ge n^{mq\alpha}}t^{-v/(mq)} E |X|^{v} \le Cn^{1-v\alpha}\to0,\quad n\to\infty. $$
(3.8)

Therefore, (3.6) holds. In order to prove \(I_{2}^{(v)}<\infty\), by (3.4) and (3.6), it is enough to show that

$$ I_{2}^{(v*)} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum _{j=1}^{n} \bigl(Y_{j}^{(v,t)}-EY_{j}^{(v,t)} \bigr)>\varepsilon t^{v/(mq)}/4 \Biggr)\,dt < \infty. $$
(3.9)

Take s such that \(s>\max\{2, p,mq,2mq/p,2(\alpha p-1)/(2v\alpha-1)\} \), using the Markov inequality, Lemma 2.1, Lemma 2.2, and the Jensen inequality, we have

$$\begin{aligned} I_{2}^{(v*)} \le{}& C\sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n}E \bigl(Y_{j}^{(v,t)} \bigr)^{s}+ \Biggl(\sum_{j=1}^{n}E \bigl(Y_{j}^{(v,t)} \bigr)^{2} \Biggr)^{s/2} \Biggr\} \,dt \\ \le{}& C\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n} \bigl(EX_{j}^{vs}I \bigl(X_{j}\le t^{1/(mq)}\bigr)+t^{vs/(mq)} P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr) \Biggr\} \,dt \\ &{} + C\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}t^{-vs/(mq)} \Biggl\{ \sum _{j=1}^{n} \bigl(EX_{j}^{2v}I \bigl(X_{j}\le t^{1/(mq)}\bigr)\\ &{}+t^{2v/(mq)} P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr) \Biggr\} ^{s/2}\,dt \\ ={}& CI_{21}^{(v*)}+CI_{22}^{(v*)}. \end{aligned}$$

By Lemma 2.8, we get \(I_{21}^{(v*)}<\infty\). For \(I_{22}^{(v*)}\), in the case \(p< 2v\), we have by Lemma 2.4(i) and (1.4)

$$\begin{aligned} I_{22}^{(v*)} & \le \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl(\sum _{j=1}^{n} t^{(2v-p)/(mq)}E|X_{j}|^{p} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \int_{n^{mq \alpha }}^{\infty}t^{-ps/(2mq)} \bigl(E|X|^{p} \bigr)^{s/2}\,dt \\ & = C\sum_{n=m}^{\infty}n^{\alpha p-2- (\alpha p-1)s/2 }< \infty. \end{aligned}$$

In the case \(p\ge 2v\), note that \(E|X|^{2v}<\infty\) by (1.4); thus we get by Lemma 2.4(i)

$$\begin{aligned} I_{22}^{(v*)} & \le \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \Biggl( \sum _{j=1}^{n} E|X_{j}|^{2v} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \int_{n^{mq \alpha }}^{\infty}t^{-vs/(mq)} \bigl(E|X|^{2v} \bigr)^{s/2}\,dt \le C\sum_{n=m}^{\infty}n^{\alpha p-2-(v\alpha-1/2)s }< \infty. \end{aligned}$$

Therefore, \(I_{2}^{(v*)}<\infty\), hence

$$\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} X_{j}^{v} > \varepsilon t^{v/(mq)} \Biggr)\,dt< \infty. $$

Thus (3.3) holds by (3.5) in the case \(2\le v\le m\).

Case 2: \(v=1\). We choose γ such that \(1/(\alpha p)<\gamma<1\). For all \(j\ge1\) and \(t>0\), let

$$\begin{aligned} &X_{j}^{(t,1)}=-t^{\gamma/(mq)}I\bigl(X_{j}< -t^{\gamma/(mq)} \bigr)+X_{j} I\bigl(|X_{j}|\le t^{\gamma/(mq)} \bigr)+t^{\gamma/(mq)}I\bigl(X_{j}>t^{\gamma/(mq)}\bigr), \\ &X_{j}^{(t,2)}=\bigl(X_{j}-t^{\gamma/(mq)} \bigr)I\bigl(t^{\gamma/(mq)}< X_{j}\le t^{1/(mq)} \bigr)+t^{1/(mq)}I\bigl(X_{j}>t^{1/(mq)}\bigr), \\ &X_{j}^{(t,3)}=\bigl(X_{j}-t^{\gamma/(mq)}-t^{1/(mq)} \bigr)I\bigl(X_{j}>t^{1/(mq)}\bigr), \\ &X_{j}^{(t,4)}=\bigl(X_{j}+t^{\gamma/(mq)} \bigr)I\bigl(-t^{1/(mq)}\le X_{j}< -t^{\gamma /(mq)} \bigr)-t^{1/(mq)}I\bigl(X_{j}< -t^{1/(mq)}\bigr), \\ &X_{j}^{(t,5)}=\bigl(X_{j}+t^{\gamma/(mq)}+t^{1/(mq)} \bigr)I\bigl(X_{j}< -t^{1/(mq)}\bigr). \end{aligned}$$

Hence \(X_{j}=\sum_{l=1}^{5} X_{j}^{(t,l)}\). Note that

$$\begin{aligned} &\sum_{n=m}^{\infty}h(n)\int _{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j} \Biggr\vert >\varepsilon t^{1/(mq)} \Biggr)\,dt \\ &\quad \le\sum_{l=1}^{5} \sum _{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl(\max_{m\le k\le n}\Biggl\vert \sum _{j=1}^{k}X_{j}^{(t,l)}\Biggr\vert >\varepsilon t^{1/(mq)}/5 \Biggr)\,dt = \sum_{l=1}^{5} J_{l}. \end{aligned}$$

In order to prove (3.3) in the case \(v=1\), it suffices to show that \(J_{l}<\infty\) for \(1\le l\le5\).
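The five-part truncation above can be sanity-checked numerically. The sketch below (illustrative; it writes \(a=t^{\gamma/(mq)}\) and \(b=t^{1/(mq)}\), so \(0<a<b\), and assumes the centering in \(X_{j}^{(t,4)}\) is \(t^{\gamma/(mq)}\), symmetric to \(X_{j}^{(t,2)}\)) verifies that the pieces recombine to \(X_{j}\) pointwise across all five regions:

```python
def five_pieces(x, a, b):
    """The five truncation pieces of x at levels 0 < a < b."""
    p1 = -a * (x < -a) + x * (abs(x) <= a) + a * (x > a)
    p2 = (x - a) * (a < x <= b) + b * (x > b)
    p3 = (x - a - b) * (x > b)
    p4 = (x + a) * (-b <= x < -a) - b * (x < -b)
    p5 = (x + a + b) * (x < -b)
    return (p1, p2, p3, p4, p5)

# Sample points covering every region, including the boundaries, for a=1, b=3
recombined_ok = all(
    abs(sum(five_pieces(x, 1.0, 3.0)) - x) < 1e-12
    for x in (-5.0, -3.0, -2.0, -1.0, -0.4, 0.0, 0.7, 1.0, 1.5, 3.0, 4.0)
)
```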

For \(J_{1}\), we first prove that for any fixed positive integer \(n\ge m\),

$$ \int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{\gamma /(mq)}I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr) > \varepsilon t^{1/(mq)} \Biggr)\,dt< \infty \quad\mbox{for all } \varepsilon>0. $$
(3.10)

Choose \(n_{0}>n\) such that \(nn_{0}^{(\gamma-1)\alpha}<\varepsilon/2\); hence, by Lemma 2.8 we have

$$\begin{aligned} &\int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} t^{\gamma /(mq)}I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr) > \varepsilon t^{1/(mq)} \Biggr)\,dt \\ &\quad \le \int_{n^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} \bigl( t^{(\gamma -1)/(mq)}I \bigl(t^{\gamma/(mq)}< |X_{j}|\le t^{1/(mq)}\bigr) +I \bigl(|X_{j}|> t^{1/(mq)}\bigr) \bigr)> \varepsilon \Biggr)\,dt \\ &\quad \le \int_{n^{mq \alpha}}^{n_{0}^{mq \alpha}}\,dt +\int _{n_{0}^{mq \alpha }}^{\infty}P \Biggl(nn_{0}^{(\gamma-1)\alpha}+ \sum_{j=1}^{n} I\bigl(|X_{j}|> t^{1/(mq)}\bigr) > \varepsilon \Biggr)\,dt \\ &\quad \le C +\int_{n_{0}^{mq \alpha}}^{\infty}P \Biggl(\sum _{j=1}^{n} I\bigl(|X_{j}|> t^{1/(mq)}\bigr) > \varepsilon/2 \Biggr)\,dt \\ &\quad \le C +\int_{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl(|X_{j}|> t^{1/(mq)}\bigr) \Biggr)\,dt \\ &\quad < \infty. \end{aligned}$$

Therefore, (3.10) holds. Now we prove that

$$ E_{n}:=\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\max _{1\le k\le n}\Biggl\vert \sum_{j=1}^{k}EX_{j}^{(t,1)} \Biggr\vert \to0,\quad n\to\infty. $$
(3.11)

We consider three cases. In the case \(\alpha\le 1\), since \(\alpha p>1\) we get \(p>1\). Thus, by \(EX_{j}=0\), \(j\ge1\), Lemma 2.4(iii), and \(1/(\alpha p)<\gamma<1\), we have

$$\begin{aligned} E_{n} & \le \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n} \bigl(E|X_{j}|I \bigl(|X_{j}|>t^{\gamma/(mq)}\bigr)+t^{\gamma/(mq)}I \bigl(|X_{j}|>t^{\gamma /(mq)}\bigr) \bigr) \\ & \le 2\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}E|X_{j}|I\bigl(|X_{j}|>t^{\gamma/(mq)} \bigr)\le Cn\sup_{t\ge n^{mq \alpha }}t^{-1/(mq)}E|X|I\bigl(|X|> t^{\gamma/(mq)}\bigr) \\ & \le Cn^{1-\alpha}E|X| I\bigl(|X|> n^{\alpha\gamma}\bigr) \le Cn^{1-\alpha p \gamma-\alpha(1-\gamma)}E|X|^{p}\to0, \quad n\to\infty. \end{aligned}$$

In the case \(\alpha>1\) and \(p< 1\), utilizing a similar method to the proof of (3.7) we also obtain

$$E_{n} \le Cn ^{1-\alpha p \gamma-(1-\gamma)\alpha}\to0,\quad n\to\infty. $$

In the case \(\alpha>1\) and \(p\ge1\), utilizing a similar method to the proof of (3.8) we have

$$E_{n}\le Cn^{1-\alpha}\to0, \quad n\to\infty. $$

Hence, (3.11) holds. In order to prove that \(J_{1}<\infty\), by (3.4), (3.10), and (3.11), it is enough to prove that

$$J_{1}^{*} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl(\max _{m\le k\le n} \Biggl\vert \sum_{j=1}^{k} X_{j}^{(t,1)}-EX_{j}^{(t,1)}\Biggr\vert > \varepsilon t^{1/(mq)}/10 \Biggr)\,dt < \infty. $$

Take s such that \(s>\max\{2, \frac{mq}{1-\gamma},\frac{p}{1-\gamma },\frac{2mq}{2(1-\gamma)+p\gamma},\frac{2(\alpha p-1)}{2\alpha(1-\gamma )+\alpha p\gamma-1},\frac{2(\alpha p-1)}{2\alpha-1}\}\), by the Markov inequality, Lemma 2.1, Lemma 2.3, and the Jensen inequality, we have

$$\begin{aligned} J_{1}^{*} \le{}& C\sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl\{ \sum_{j=1}^{n}E \bigl\vert X_{j}^{(t,1)}\bigr\vert ^{s}+ \Biggl( \sum_{j=1}^{n}E \bigl(X_{j}^{(t,1)} \bigr)^{2} \Biggr)^{s/2} \Biggr\} \,dt \\ \le{}& C\sum_{n=m}^{\infty}h(n) \bigl( \log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{\frac{-s}{mq}} \Biggl\{ \sum_{j=1}^{n} \bigl(E|X_{j}|^{s}I\bigl(|X_{j}|\le t^{\frac{\gamma}{mq}}\bigr)+ t^{\frac{\gamma s}{mq}} P\bigl(|X_{j}|>t^{\frac{\gamma}{mq}} \bigr) \bigr) \Biggr\} \,dt \\ &{} + C\sum_{n=m}^{\infty}h(n) \bigl( \log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{\frac{-s}{mq}} \Biggl\{ \sum_{j=1}^{n} \bigl(E|X_{j}|^{2}I\bigl(|X_{j}|\le t^{\frac{\gamma}{mq}}\bigr)\\ &{}+ t^{\frac{2\gamma}{mq}} P\bigl(|X_{j}|>t^{\frac{\gamma}{mq}} \bigr) \bigr) \Biggr\} ^{\frac{s}{2}}\,dt \\ ={}& CJ_{11}^{*} +CJ_{12}^{*} . \end{aligned}$$

By Lemma 2.7, we have \(J_{11}^{*} < \infty\). For \(J_{12}^{*}\), in the case \(0< p< 2\) we have by Lemma 2.4(i) and (1.4)

$$\begin{aligned} J_{12}^{*} & \le \sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha }}^{\infty}t^{-s/(mq)} \Biggl(\sum_{j=1}^{n} t^{(2-p)\gamma/(mq)}E |X_{j}|^{p} \Biggr)^{s/2}\,dt \\ &\le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha}}^{\infty}t^{-[2-(2-p)\gamma]s/(2mq)} \bigl(E|X|^{p} \bigr)^{s/2}\,dt \\ &\le C\sum_{n=m}^{\infty}n^{\alpha p -2 -[\alpha(1-\gamma)+(\alpha p\gamma-1)/2] s} \bigl(\log(4n)\bigr)^{s} < \infty. \end{aligned}$$

In the case \(p\ge 2\), note that \(E|X|^{2}<\infty\) in this case; thus we have by Lemma 2.4(i)

$$\begin{aligned} J_{12}^{*} & \le \sum_{n=m}^{\infty}h(n) \bigl(\log(4n)\bigr)^{s}\int_{n^{m q \alpha }}^{\infty}t^{-s/(mq)} \Biggl(\sum_{j=1}^{n}EX_{j}^{2} \Biggr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha( p-mq)-2+s/2 } \bigl(\log(4n)\bigr)^{s} \int_{n^{mq \alpha}}^{\infty}t^{-s/(mq)} \bigl(E|X|^{2} \bigr)^{s/2}\,dt \\ & \le C\sum_{n=m}^{\infty}n^{\alpha p-2-(\alpha-1/2) s } \bigl(\log(4n)\bigr)^{s} < \infty. \end{aligned}$$

Thus, \(J_{1}<\infty\).

For \(J_{2}\), we first prove

$$ F_{n}:= \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}EX_{j}^{(t,2)} \to0, \quad n \to\infty. $$
(3.12)

We consider two cases. In the case \(p\ge1\), we have by Lemma 2.4(iii)

$$\begin{aligned} 0 & \le F_{n}\le \sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n} \bigl\{ EX_{j}I \bigl(X_{j}>t^{\gamma/(mq)}\bigr)+t^{1/(mq)}P \bigl(X_{j}>t^{1/(mq)}\bigr) \bigr\} \\ & \le 2\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}\sum _{j=1}^{n}EX_{j} I\bigl(X_{j}>t^{\gamma/(mq)} \bigr) \\ & \le Cn\sup_{t\ge n^{mq \alpha}}t^{-1/(mq)}E|X|I\bigl(|X|>t^{\gamma /(mq)} \bigr) \\ & \le Cn^{1-\alpha p\gamma-\alpha(1-\gamma)} E|X|^{p} \to0,\quad n\to \infty. \end{aligned}$$

In the case \(0< p<1\), utilizing a similar method to the proof of (3.7) we have

$$0\le F_{n} \le Cn^{1-\alpha p } \to0,\quad n\to\infty. $$

Therefore, (3.12) holds. By (3.4) and (3.12), in order to prove \(J_{2}<\infty\), it is enough to show that

$$J_{2}^{*} = \sum_{n=m}^{\infty}h(n)\int_{n^{mq \alpha}}^{\infty}P \Biggl( \sum _{j=1}^{n} \bigl(X_{j}^{(t,2)}-EX_{j}^{(t,2)} \bigr)>\varepsilon t^{1/(mq)}/10 \Biggr)\,dt < \infty. $$

Take s such that \(s>\max\{2, p,mq,2mq/p,(\alpha p-1)/(\alpha-1/2)\}\), utilizing a similar method to the proof of (3.9) we also have \(J_{2}^{*} <\infty\), hence \(J_{2}<\infty\).

For \(J_{3}\), we get by Lemma 2.8

$$\begin{aligned} J_{3} & \le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int_{n^{mq \alpha }}^{\infty}P \Biggl( \bigcup_{j=1}^{n}\bigl(X_{j}^{(t,3)}\neq0 \bigr) \Biggr)\,dt \\ & \le \sum_{n=m}^{\infty}n^{\alpha( p-mq)-2 }\int _{n^{mq \alpha}}^{\infty}\Biggl(\sum _{j=1}^{n} P\bigl( X_{j}>t^{1/(mq)} \bigr) \Biggr)\,dt \\ & < \infty. \end{aligned}$$

Similar to the proof of \(J_{2}<\infty\) we have \(J_{4}<\infty\). Similar to the proof of \(J_{3}<\infty\) we have \(J_{5}<\infty\). Therefore, (1.5) holds.

Similar to the proof of (1.3), we obtain (1.6) by (1.5). □