1 Introduction

Let \(\{X_{ni},1\leq i\leq n,n\geq1\}\) be an array of random variables defined on a fixed probability space \((\Omega,\mathcal{F},P)\) and \(\{b_{ni},1\leq i\leq n,n\geq1\}\) be an array of real numbers. As is well known, the limiting behavior of the maximum of weighted sums \(\max_{1\leq m\leq n}\sum_{i=1}^{m} b_{ni}X_{ni}\) is very useful in many probabilistic derivations and stochastic models. Several versions are available in the literature for independent random variables under suitable moment conditions. While the independent case is classical, the treatment of dependent variables is more recent.

One of the dependence structures that has attracted the interest of probabilists and statisticians is negative association. The concept of negatively associated random variables was introduced by Alam and Saxena [1] and carefully studied by Joag-Dev and Proschan [2].

A finite family of random variables \(\{X_{i},1\leq i\leq n\}\) is said to be negatively associated (NA, in short) if for every pair of disjoint subsets \(A,B \subset\{1,2,\ldots, n\}\),

$$\operatorname{Cov}\bigl(f(X_{i},i\in A),g(X_{j},j\in B) \bigr)\leq0, $$

whenever \(f\) and \(g\) are coordinatewise nondecreasing functions such that this covariance exists. An infinite family of random variables is negatively associated if every finite subfamily is negatively associated.
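As a concrete illustration (added here, not part of the original argument), the coordinates of a multinomial vector form a classical NA family (see Joag-Dev and Proschan [2]): raising one count forces the others down. A minimal NumPy check of the covariance inequality with \(f\) and \(g\) taken as identity maps:

```python
import numpy as np

# Multinomial counts are a classical NA family (Joag-Dev and Proschan [2]).
# With f and g the identity maps (coordinatewise nondecreasing), the NA
# inequality Cov(f(N1), g(N2)) <= 0 can be checked against the exact value
# Cov(N1, N2) = -n * p1 * p2.
rng = np.random.default_rng(0)
n, p = 100, np.array([0.3, 0.5, 0.2])
counts = rng.multinomial(n, p, size=200_000)        # rows are (N1, N2, N3)

emp_cov = np.cov(counts[:, 0], counts[:, 1])[0, 1]
print(f"empirical  Cov(N1, N2) = {emp_cov:.2f}")    # close to -15
print(f"theoretical -n*p1*p2   = {-n * p[0] * p[1]:.2f}")
```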

The next dependence notion is negatively superadditive dependence, which is weaker than negative association. The concept of negatively superadditive dependent random variables was introduced by Hu [3] as follows.

Definition 1.1

(cf. Kemperman [4])

A function \(\phi :\mathbb{R}^{n}\rightarrow\mathbb{R}\) is called superadditive if \(\phi(\mathbf{x} \vee\mathbf{y})+\phi(\mathbf{x} \wedge\mathbf{y})\geq\phi(\mathbf{x})+\phi(\mathbf{y})\) for all \(\mathbf{x},\mathbf{y}\in \mathbb{R}^{n}\), where ∨ stands for the componentwise maximum and ∧ stands for the componentwise minimum.
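For example, \(\phi(x_{1},x_{2})=x_{1}x_{2}\) is superadditive, whereas \(-x_{1}x_{2}\) is not. A small randomized test of Definition 1.1 (our sketch, assuming NumPy; it probes the inequality at random points rather than proving it):

```python
import numpy as np

def looks_superadditive(phi, dim=2, trials=100_000, seed=1):
    """Randomized probe of phi(x v y) + phi(x ^ y) >= phi(x) + phi(y)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=(trials, dim))
    y = rng.normal(size=(trials, dim))
    lhs = phi(np.maximum(x, y)) + phi(np.minimum(x, y))   # x v y and x ^ y
    rhs = phi(x) + phi(y)
    return bool(np.all(lhs >= rhs - 1e-12))               # roundoff slack

prod = lambda z: z[:, 0] * z[:, 1]
print(looks_superadditive(prod))                # True:  x1*x2 passes
print(looks_superadditive(lambda z: -prod(z)))  # False: -x1*x2 fails
```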

Definition 1.2

(cf. Hu [3])

A random vector \(\mathbf{X}=(X_{1},X_{2},\ldots,X_{n})\) is said to be negatively superadditive dependent (NSD) if

$$ E\phi(X_{1},X_{2},\ldots,X_{n})\leq E\phi\bigl(X_{1}^{\ast},X_{2}^{\ast}, \ldots,X_{n}^{\ast}\bigr), $$
(1.1)

where \(X_{1}^{\ast},X_{2}^{\ast},\ldots,X_{n}^{\ast}\) are independent such that \(X_{i}^{\ast}\) and \(X_{i}\) have the same distribution for each i and ϕ is a superadditive function such that the expectations in (1.1) exist.

A sequence \(\{X_{n},n\geq1\}\) of random variables is said to be NSD if for all \(n\geq1\), \((X_{1},X_{2}, \ldots,X_{n})\) is NSD.

An array \(\{X_{ni}, i\geq1, n\geq1\}\) of random variables is said to be rowwise NSD if for all \(n\geq1\), \(\{X_{ni}, i\geq1\}\) is NSD.
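To see Definition 1.2 at work (an added illustration), recall that a bivariate normal vector with negative correlation is NA (Joag-Dev and Proschan [2]) and hence NSD (Christofides and Vaggelatou [5]). For the superadditive function \(\phi(x_{1},x_{2})=x_{1}x_{2}\), inequality (1.1) reduces to \(EX_{1}X_{2}\leq EX_{1}^{\ast}X_{2}^{\ast}=EX_{1}\,EX_{2}=0\), which a short simulation confirms:

```python
import numpy as np

# X = (X1, X2): bivariate normal with correlation rho < 0  =>  NA, hence NSD.
# (X1*, X2*): independent copies with the same standard normal marginals.
# For phi(x1, x2) = x1*x2, (1.1) says E[X1*X2] <= E[X1*] * E[X2*] = 0.
rng = np.random.default_rng(2)
rho = -0.6
X = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]],
                            size=500_000)

print(f"E[phi(X1, X2)]   ~ {np.mean(X[:, 0] * X[:, 1]):+.4f}")  # ~ -0.6 <= 0
print("E[phi(X1*, X2*)] =  0.0000 (independent copies)")
```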

The concept of NSD random variables, introduced by Hu [3], is based on the class of superadditive functions. Hu [3] gave an example showing that NSD does not imply NA and posed the open problem of whether NA implies NSD; he also provided some basic properties and three structural theorems for NSD. Christofides and Vaggelatou [5] solved this open problem by showing that NA does imply NSD. The NSD structure therefore extends negative association; it is sometimes more tractable and can be used to obtain many important probability inequalities. Eghbal et al. [6] derived two maximal inequalities and a strong law of large numbers for quadratic forms of NSD random variables under the assumption that \(\{X_{i},i\geq1\}\) is a sequence of nonnegative NSD random variables with \(EX_{i}^{r}<\infty\) for all \(i\geq1\) and some \(r>1\). Shen et al. [7] established strong limit theorems for NSD random variables. Wang et al. [8] investigated complete convergence for arrays of rowwise NSD random variables and applied it to a nonparametric regression model. Wang et al. [9] obtained complete convergence for weighted sums of NSD random variables with an application to the EV regression model. The main purpose of this work is to further study complete convergence for weighted sums of arrays of rowwise NSD random variables that are not necessarily identically distributed, which generalizes and improves some known results.

Definition 1.3

A sequence of random variables \(\{U_{n},n\geq1\}\) is said to converge completely to a constant a if for any \(\varepsilon>0\),

$$\sum_{n=1}^{\infty}P\bigl(\vert U_{n}-a\vert >\varepsilon\bigr) < \infty. $$

In this case, we write \(U_{n}\rightarrow a\) completely. This notion was first introduced by Hsu and Robbins [10].
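For a quick numerical feel (our addition), consider i.i.d. mean-zero variables bounded in \([-1,1]\): Hoeffding's inequality gives \(P(|\bar{X}_{n}|>\varepsilon)\leq 2e^{-n\varepsilon^{2}/2}\), so the series in Definition 1.3 is dominated by a convergent geometric series and the sample mean converges completely to 0:

```python
import numpy as np

# Complete convergence of sample means of i.i.d. mean-zero variables in
# [-1, 1]: Hoeffding's bound P(|mean_n| > eps) <= 2*exp(-n*eps^2/2) is
# summable, so the series in Definition 1.3 is finite.
eps = 0.5
n = np.arange(1, 2001)
bound = 2.0 * np.exp(-n * eps**2 / 2.0)

r = np.exp(-eps**2 / 2.0)                          # geometric ratio
print(f"partial sum (n <= 2000)  : {bound.sum():.6f}")
print(f"geometric series 2r/(1-r): {2.0 * r / (1.0 - r):.6f}")  # they agree
```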

Definition 1.4

Let \(\{Z_{n},n\geq1\}\) be a sequence of random variables and \(a_{n}>0\), \(b_{n}>0\), \(q>0\). If

$$\sum_{n=1}^{\infty}a_{n} E\bigl\{ b_{n}^{-1}|Z_{n}|-\varepsilon \bigr\} _{+}^{q} < \infty \quad \mbox{for all } \varepsilon>0, $$

then the above is called complete moment convergence, a notion introduced by Chow [11].
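The bridge between Definitions 1.3 and 1.4, used repeatedly below (for instance, at the start of the proof of Theorem 3.1), is the standard tail-integral identity: for any nonnegative random variable \(W\) and \(q>0\),

$$EW^{q}= \int_{0}^{\infty}P\bigl(W^{q}>t\bigr)\,dt= \int_{0}^{\infty}P\bigl(W>t^{1/q}\bigr)\,dt. $$

Applying it with \(W=\{b_{n}^{-1}|Z_{n}|-\varepsilon\}_{+}\) shows that complete moment convergence controls a weighted tail integral and, in particular, implies complete convergence (see Remark 3.2).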

Let \(\{X_{nk},k\geq1,n\geq1\}\) be an array of rowwise NSD random variables, \(\{a_{n},n\geq1\}\) be a sequence of positive real numbers such that \(a_{n}\uparrow\infty\), and \(\{\Psi_{k}(t), k\geq1\}\) be a sequence of positive even functions such that

$$ \frac{\Psi_{k}(|t|)}{|t|^{q}}\uparrow \quad \mbox{and} \quad \frac{\Psi _{k}(|t|)}{|t|^{p}} \downarrow \quad \mbox{as } |t|\uparrow $$
(1.2)

for some \(1\leq q< p \) and each \(k\geq1\). To prove our results, we will use the following conditions:

$$\begin{aligned}& EX_{nk}=0, \quad k\geq1, n\geq1, \end{aligned}$$
(1.3)
$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{k=1}^{n} E\frac{\Psi_{k}(X_{nk})}{\Psi _{k}(a_{n})}< \infty, \end{aligned}$$
(1.4)
$$\begin{aligned}& \sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n} E \biggl(\frac{X_{nk}}{a_{n}} \biggr)^{2} \Biggr)^{v/2}< \infty, \end{aligned}$$
(1.5)

where \(v\geq p\) is a positive integer.

The following functions \(\Psi_{k}(t)\) satisfy assumption (1.2): \(\Psi_{k}(t)=|t|^{\beta}\) for some \(q<\beta<p\), or \(\Psi_{k}(t)=|t|^{q} \log(1+|t|^{p-q})\), \(t\in(-\infty,+\infty)\). Note that, being even, these functions are not monotone on the whole line \((-\infty,+\infty)\); nevertheless, it is simple to show that, under condition (1.2), \(\Psi_{k}(t)\) is increasing for \(t>0\). Indeed, for \(t>0\), \(\Psi_{k}(t)=\frac{\Psi_{k}(t)}{|t|^{q}}\cdot |t|^{q}\) is a product of nonnegative nondecreasing factors, since \(|t|^{q}\uparrow\) as \(|t|\uparrow\); hence \(\Psi_{k}(t)\uparrow\).
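A quick numerical check of (1.2) for the second example (our sketch; the values \(q=1.5\), \(p=3\) are arbitrary choices):

```python
import numpy as np

# Check (1.2) for Psi(t) = |t|^q * log(1 + |t|^(p - q)) on a grid of t > 0:
# Psi(t)/t^q should be nondecreasing and Psi(t)/t^p nonincreasing.
q, p = 1.5, 3.0
t = np.linspace(0.01, 100.0, 100_000)
psi = t**q * np.log1p(t**(p - q))

ratio_q = psi / t**q    # = log(1 + t^(p-q)):           increasing in t
ratio_p = psi / t**p    # = log(1 + t^(p-q)) / t^(p-q): decreasing in t
print(bool(np.all(np.diff(ratio_q) >= -1e-12)))   # True
print(bool(np.all(np.diff(ratio_p) <= 1e-12)))    # True
```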

Recently, Shen et al. [7] obtained the following complete convergence result for weighted sums of NSD random variables.

Theorem A

Let \(\{X_{n},n\geq1\}\) be a sequence of NSD random variables. Assume that \(\{g_{n}(x),n\geq1\}\) is a sequence of even functions defined on ℝ, positive and nondecreasing on the half-line \(x>0\). Suppose that one of the following conditions is satisfied for every \(n\geq1\):

  1. (i)

    for some \(0< r\leq1\), \(x^{r}/g_{n}(x)\) is a nondecreasing function of x on the half-line \(x>0\);

  2. (ii)

    for some \(1< r\leq2\), \(x/g_{n}(x)\) and \(g_{n}(x)/x^{r}\) are nonincreasing functions of x on the half-line \(x>0\), and \(EX_{n}=0\).

For any positive sequence \(\{a_{n},n\geq1\}\) with \(a_{n}\uparrow\infty\), if we assume that

$$ \sum_{n=1}^{\infty}\frac{Eg_{n}(X_{n})}{g_{n}(a_{n})}< \infty, $$
(1.6)

then \(\sum_{n=1}^{\infty}\frac{X_{n}}{a_{n}}\) converges almost surely, and therefore, by Kronecker's lemma, \(\lim_{n\rightarrow\infty}\frac{1}{a_{n}}\sum_{i=1}^{n} X_{i}=0\) a.s.

For more details about this type of complete convergence, we refer the reader to Wu [12, 13], Gan and Chen [14], Yang [15], Shao [16], Wu [17, 18], Chen and Sung [19], and so on. The purpose of this paper is to extend Theorem A to complete moment convergence, which is a more general version of complete convergence. Throughout this work, the symbol C stands for a generic positive constant, which may vary from one place to another.

2 Preliminary lemmas

In this section, we state some lemmas that will be used to prove our main results.

Lemma 2.1

(cf. Hu [3])

If \((X_{1},X_{2},\ldots,X_{n})\) is NSD and \(g_{1},g_{2},\ldots,g_{n}\) are nondecreasing functions, then \((g_{1}(X_{1}),g_{2}(X_{2}),\ldots,g_{n}(X_{n}))\) is NSD.

Lemma 2.2

(cf. Wang et al. [8])

Let \(p>1\). Let \(\{X_{n},n\geq1\}\) be a sequence of NSD random variables with \(EX_{i}=0\) and \(E|X_{i}|^{p}<\infty\) for each \(i\geq1\). Then, for all \(n\geq1\),

$$ E \Biggl(\max_{1\leq k \leq n}\Biggl\vert \sum _{i=1}^{k}X_{i}\Biggr\vert ^{p} \Biggr)\leq 2^{3-p}\sum_{i=1}^{n}E \vert X_{i}\vert ^{p}\quad \textit{for } 1< p\leq2 $$
(2.1)

and

$$ E \Biggl(\max_{1\leq k \leq n}\Biggl\vert \sum _{i=1}^{k}X_{i}\Biggr\vert ^{p} \Biggr)\leq2 \biggl(\frac{15p}{\ln p} \biggr)^{p} \Biggl[\sum _{i=1}^{n}E\vert X_{i}\vert ^{p}+ \Biggl(\sum_{i=1}^{n}E X_{i}^{2} \Biggr)^{p/2} \Biggr] \quad \textit{for } p>2. $$
(2.2)
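Independent mean-zero random variables are NA, hence NSD, so (2.1) can be sanity-checked by simulation. A minimal Monte Carlo sketch (ours; the choices \(n=50\), \(p=1.5\), and standard normal summands are arbitrary):

```python
import numpy as np

# Monte Carlo check of (2.1) for independent (hence NSD) mean-zero X_i:
#   E max_{k<=n} |S_k|^p  <=  2^(3-p) * sum_i E|X_i|^p,   for 1 < p <= 2.
rng = np.random.default_rng(3)
n, p, reps = 50, 1.5, 20_000
X = rng.standard_normal((reps, n))
S = np.cumsum(X, axis=1)                               # partial sums S_k

lhs = np.mean(np.max(np.abs(S), axis=1) ** p)
e_abs_p = np.mean(np.abs(rng.standard_normal(1_000_000)) ** p)  # E|X_1|^p
rhs = 2 ** (3 - p) * n * e_abs_p
print(f"LHS ~ {lhs:.1f}  <=  RHS ~ {rhs:.1f}")         # holds with room
```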

Lemma 2.3

Let \(\{X_{nk},k\geq1, n\geq1\}\) be an array of rowwise NSD random variables, and let \(\{a_{n},n\geq1\}\) be a sequence of positive real numbers such that \(a_{n}\uparrow\infty\). Also, let \(\{\Psi _{k}(t),k\geq1\}\) be a sequence of positive even functions satisfying (1.2) for some \(1\leq q< p\). Then (1.4) implies the following statements:

  1. (i)

    for \(r\geq1\), \(0< u\leq q\),

    $$ \sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E|X_{nk}|^{u}I(|X_{nk}|>a_{n})}{{a^{u}_{n}}} \Biggr)^{r}< \infty; $$
    (2.3)
  2. (ii)

    for \(v\geq p\),

    $$ \sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X_{nk}|^{v}I(|X_{nk}|\leq a_{n})}{{a^{v}_{n}}}< \infty. $$
    (2.4)

Proof

From (1.2) we have the pointwise bounds \(\frac{|x|^{q}}{a_{n}^{q}}\leq\frac{\Psi_{k}(x)}{\Psi_{k}(a_{n})}\) for \(|x|>a_{n}\) (since \(\Psi_{k}(|t|)/|t|^{q}\uparrow\)) and \(\frac{|x|^{p}}{a_{n}^{p}}\leq\frac{\Psi_{k}(x)}{\Psi_{k}(a_{n})}\) for \(|x|\leq a_{n}\) (since \(\Psi_{k}(|t|)/|t|^{p}\downarrow\)). Combining these with (1.4), we get

$$\begin{aligned}& \sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E|X_{nk}|^{u}I(|X_{nk}|>a_{n})}{{a^{u}_{n}}} \Biggr)^{r} \\& \quad \leq \sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E|X_{nk}|^{q}I(|X_{nk}|>a_{n})}{{a^{q}_{n}}} \Biggr)^{r} \\& \quad \leq \sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}E\frac{\Psi_{k}(X_{nk})}{\Psi _{k}(a_{n})} \Biggr)^{r} \\& \quad \leq \Biggl(\sum_{n=1}^{\infty}\sum _{k=1}^{n}E\frac{\Psi_{k}(X_{nk})}{\Psi _{k}(a_{n})} \Biggr)^{r}< \infty \end{aligned}$$

and

$$\begin{aligned}& \sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X_{nk}|^{v}I(|X_{nk}|\leq a_{n})}{{a^{v}_{n}}} \\& \quad \leq \sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X_{nk}|^{p}I(|X_{nk}|\leq a_{n})}{{a^{p}_{n}}} \\& \quad \leq \sum_{n=1}^{\infty}\sum _{k=1}^{n}E\frac{\Psi_{k}(X_{nk})}{\Psi _{k}(a_{n})}< \infty, \end{aligned}$$

where \(r\geq1\), \(0< u\leq q\), and \(v\geq p\); in the first chain we also used the elementary inequality \(\sum_{n}c_{n}^{r}\leq (\sum_{n}c_{n} )^{r}\) for \(c_{n}\geq0\) and \(r\geq1\). The proof is complete. □

3 Main results and their proofs

Theorem 3.1

Let \(\{X_{nk},k\geq1, n\geq1\}\) be an array of rowwise NSD random variables, and let \(\{a_{n},n\geq1\}\) be a sequence of positive real numbers such that \(a_{n}\uparrow\infty\). Also, let \(\{\Psi _{k}(t),k\geq1\}\) be a sequence of positive even functions satisfying (1.2) for some \(1\leq q< p\leq2\). Then, under conditions (1.3) and (1.4), we have

$$ \sum_{n=1}^{\infty}a^{-q}_{n} E \Biggl\{ \max_{1\leq j\leq n} \Biggl\vert \sum_{k=1}^{j} X_{nk} \Biggr\vert -\varepsilon a_{n} \Biggr\} _{+}^{q}< \infty, \quad \forall \varepsilon>0. $$
(3.1)

Proof

For \(n\geq1\), denote \(M_{n}(X)= \max_{1\leq j\leq n}\vert \sum_{k=1}^{j} X_{nk}\vert \). It is easy to check that

$$\begin{aligned}& \sum_{n=1}^{\infty}{a^{-q}_{n}}E \bigl\{ M_{n}(X)-\varepsilon a_{n}\bigr\} _{+}^{q} \\& \quad = \sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{0}^{\infty}P\bigl\{ M_{n}(X)- \varepsilon a_{n}>t^{1/q}\bigr\} \,dt \\& \quad = \sum_{n=1}^{\infty}{a^{-q}_{n}} \biggl(\int_{0}^{{a^{q}_{n}}}P\bigl\{ M_{n}(X)> \varepsilon a_{n}+t^{1/q}\bigr\} \,dt+ \int_{{a^{q}_{n}}}^{\infty}P\bigl\{ M_{n}(X)>\varepsilon a_{n}+t^{1/q}\bigr\} \,dt \biggr) \\& \quad \leq \sum_{n=1}^{\infty}P\bigl\{ M_{n}(X)>\varepsilon a_{n}\bigr\} + \sum _{n=1}^{\infty}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}P\bigl\{ M_{n}(X)>t^{1/q} \bigr\} \,dt\doteq I_{1}+I_{2}. \end{aligned}$$

To prove (3.1), it suffices to show that \(I_{1}<\infty\) and \(I_{2}<\infty \). We verify these in turn, beginning with \(I_{1}<\infty\).

For all \(n\geq1\), define

$$X^{(n)}_{k}=X_{nk}I\bigl(\vert X_{nk} \vert \leq a_{n}\bigr), \qquad T^{(n)}_{j}= \frac{1}{a_{n}}\sum_{k=1}^{j} \bigl(X^{(n)}_{k}-EX^{(n)}_{k}\bigr), $$

then, for all \(\varepsilon>0\), it is easy to see that

$$\begin{aligned} \begin{aligned}[b] &P \Biggl(\max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum_{k=1}^{j}X_{nk} \Biggr\vert >\varepsilon \Biggr) \\ &\quad \leq P \Bigl(\max_{1\leq k\leq n}|X_{nk}|>a_{n} \Bigr)+P \Biggl(\max_{1\leq j\leq n}\bigl\vert T^{(n)}_{j} \bigr\vert >\varepsilon-\max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum_{k=1}^{j}EX^{(n)}_{k} \Biggr\vert \Biggr). \end{aligned} \end{aligned}$$
(3.2)

By (1.2), (1.3), (1.4) and Lemma 2.3, we have

$$\begin{aligned}& \max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum _{k=1}^{j}EX^{(n)}_{k} \Biggr\vert \\& \quad = \max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum _{k=1}^{j}EX_{nk}I\bigl(\vert X_{nk}\vert \leq a_{n}\bigr)\Biggr\vert \\& \quad = \max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum _{k=1}^{j}EX_{nk}I\bigl(\vert X_{nk}\vert >a_{n}\bigr) \Biggr\vert \\& \quad \leq \sum_{k=1}^{n}\frac{E|X_{nk}|I(\vert X_{nk}\vert >a_{n})}{a_{n}} \rightarrow 0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$
(3.3)

From (3.2) and (3.3), it follows that for n large enough,

$$ P \Biggl(\max_{1\leq j\leq n}\Biggl\vert \frac{1}{a_{n}}\sum _{k=1}^{j}X_{nk}\Biggr\vert > \varepsilon \Biggr) \leq\sum_{k=1}^{n}P\bigl( \vert X_{nk}\vert >a_{n}\bigr)+P \biggl(\max _{1\leq j\leq n}\bigl\vert T^{(n)}_{j}\bigr\vert > \frac{\varepsilon}{2} \biggr). $$

Hence we only need to prove that

$$\begin{aligned}& I\doteq\sum_{n=1}^{\infty}\sum _{k=1}^{n}P\bigl(\vert X_{nk} \vert >a_{n}\bigr)< \infty, \end{aligned}$$
(3.4)
$$\begin{aligned}& \mathit{II}\doteq\sum_{n=1}^{\infty}P \biggl( \max_{1\leq j\leq n}\bigl\vert T^{(n)}_{j}\bigr\vert >\frac{\varepsilon}{2} \biggr)< \infty. \end{aligned}$$
(3.5)

For I, it follows by Lemma 2.3 that

$$I=\sum_{n=1}^{\infty}\sum _{k=1}^{n}EI\bigl(\vert X_{nk}\vert >a_{n}\bigr) \leq\sum_{n=1}^{\infty}\sum_{k=1}^{n}\frac {E|X_{nk}|^{q}I(|X_{nk}|>a_{n})}{{a^{q}_{n}}}< \infty. $$

For II, take \(r\geq2\). Since \(p\leq2\) and thus \(r\geq p\), we have by Markov’s inequality, Lemmas 2.1 and 2.2, the \(C_{r}\)-inequality, and Lemma 2.3 that

$$\begin{aligned} \mathit{II} \leq&\sum_{n=1}^{\infty}\biggl( \frac{\varepsilon}{2} \biggr)^{-r}E\max_{1\leq j\leq n}\bigl\vert T^{(n)}_{j}\bigr\vert ^{r} \\ \leq&C\sum_{n=1}^{\infty}\biggl( \frac{\varepsilon}{2} \biggr)^{-r}\frac {1}{{a^{r}_{n}}} \Biggl[\sum _{k=1}^{n}E\bigl\vert X^{(n)}_{k} \bigr\vert ^{r} + \Biggl(\sum_{k=1}^{n}E \bigl\vert X^{(n)}_{k}\bigr\vert ^{2} \Biggr)^{r/2} \Biggr] \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X^{(n)}_{k}|^{r}}{a^{r}_{n}} +C\sum _{n=1}^{\infty}\Biggl(\sum_{k=1}^{n} \frac {E|X^{(n)}_{k}|^{2}}{a^{2}_{n}} \Biggr)^{r/2} \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X_{nk}|^{p}I(|X_{nk}|\leq a_{n})}{a^{p}_{n}} +C\sum _{n=1}^{\infty}\Biggl(\sum_{k=1}^{n} \frac{E|X_{nk}|^{p}I(|X_{nk}|\leq a_{n})}{a^{p}_{n}} \Biggr)^{r/2} \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E|X_{nk}|^{p}I(|X_{nk}|\leq a_{n})}{a^{p}_{n}} +C \Biggl(\sum _{n=1}^{\infty}\sum_{k=1}^{n} \frac{E|X_{nk}|^{p}I(|X_{nk}|\leq a_{n})}{a^{p}_{n}} \Biggr)^{r/2}< \infty . \end{aligned}$$

Next we prove that \(I_{2}<\infty\). For fixed \(t>0\), denote \(Y_{nk}=X_{nk}I(|X_{nk}|\leq t^{1/q})\), \(Z_{nk}=X_{nk}-Y_{nk}\), and \(M_{n}(Y)=\max_{1\leq j\leq n}|\sum_{k=1}^{j}Y_{nk}|\). Obviously,

$$P\bigl\{ M_{n}(X)>t^{1/q}\bigr\} \leq\sum _{k=1}^{n}P\bigl\{ |X_{nk}|>t^{1/q} \bigr\} +P\bigl\{ M_{n}(Y)>t^{1/q}\bigr\} . $$

Hence,

$$\begin{aligned} I_{2} \leq&\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}P\bigl\{ |X_{nk}|>t^{1/q} \bigr\} \, dt +\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}P\bigl\{ M_{n}(Y)>t^{1/q} \bigr\} \, dt \\ \doteq& I_{3}+I_{4}. \end{aligned}$$

For \(I_{3}\), by Lemma 2.3, we have

$$\begin{aligned} I_{3} =&\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}P\bigl\{ |X_{nk}|I\bigl( \vert X_{nk}\vert >a_{n}\bigr)>t^{1/q}\bigr\} \, dt \\ \leq&\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{0}^{\infty}P\bigl\{ |X_{nk}|I\bigl(\vert X_{nk}\vert >a_{n}\bigr)>t^{1/q}\bigr\} \, dt \\ =&\sum_{n=1}^{\infty}\sum _{k=1}^{n} \frac {E|X_{nk}|^{q}I(\vert X_{nk}\vert >a_{n})}{{a^{q}_{n}}}< \infty. \end{aligned}$$

Now let us prove that \(I_{4}<\infty\). Firstly, it follows by (1.3) and Lemma 2.3 that

$$\begin{aligned}& \max_{t\geq{a^{q}_{n}}} \max_{1\leq j\leq n}t^{-1/q}\Biggl\vert \sum_{k=1}^{j}EY_{nk} \Biggr\vert \\& \quad = \max_{t\geq{a^{q}_{n}}} \max_{1\leq j\leq n}t^{-1/q} \Biggl\vert \sum_{k=1}^{j}EZ_{nk} \Biggr\vert \\& \quad \leq \max_{t\geq{a^{q}_{n}}} t^{-1/q}\sum _{k=1}^{n}E|X_{nk}|I\bigl(\vert X_{nk}\vert >t^{1/q}\bigr) \\& \quad \leq \sum_{k=1}^{n} {a^{-1}_{n}}E|X_{nk}|I\bigl(\vert X_{nk}\vert >a_{n}\bigr) \\& \quad \leq \sum_{k=1}^{n} \frac{E|X_{nk}|^{q} I(\vert X_{nk}\vert >a_{n})}{{a^{q}_{n}}}\rightarrow0 \quad \mbox{as } n\rightarrow\infty. \end{aligned}$$

Therefore, for n sufficiently large,

$$\max_{1\leq j\leq n} \Biggl\vert \sum_{k=1}^{j}EY_{nk} \Biggr\vert \leq \frac{t^{1/q}}{2} , \quad t\geq{a^{q}_{n}}. $$

Then, for n sufficiently large,

$$ P\bigl\{ M_{n}(Y)>t^{1/q}\bigr\} \leq P \Biggl\{ \max_{1\leq j\leq n}\Biggl\vert \sum_{k=1}^{j}(Y_{nk}-EY_{nk}) \Biggr\vert >\frac {t^{1/q}}{2} \Biggr\} , \quad t\geq{a^{q}_{n}}. $$
(3.6)

Let \(d_{n}=[a_{n}]+1\). By (3.6), Markov’s inequality, Lemmas 2.1 and 2.2, and the \(C_{r}\)-inequality, we can see that

$$\begin{aligned} I_{4} \leq&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-2/q}E\Biggl(\max _{1\leq j\leq n}\Biggl\vert \sum_{k=1}^{j}(Y_{nk}-E Y_{nk})\Biggr\vert \Biggr)^{2} \,dt \\ \leq& C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-2/q}\sum _{k=1}^{n}E(Y_{nk}-E Y_{nk})^{2} \,dt \\ \leq& C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}t^{-2/q}E {Y^{2}_{nk}} \,dt \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}t^{-2/q}E {X^{2}_{nk}}I \bigl(\vert X_{nk}\vert \leq d_{n}\bigr) \,dt \\ &{}+C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{{d^{q}_{n}}}^{\infty}t^{-2/q}E {X^{2}_{nk}}I \bigl(d_{n}< |X_{nk}|\leq t^{1/q}\bigr) \,dt \\ \doteq&I_{41}+I_{42}. \end{aligned}$$

For \(I_{41}\), since \(q<2\), we have

$$\begin{aligned} I_{41} =& C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}} E{X^{2}_{nk}}I\bigl(\vert X_{nk}\vert \leq d_{n}\bigr)\int_{{a^{q}_{n}}}^{\infty}t^{-2/q}\,dt \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n} \frac{E{X^{2}_{nk}}I(|X_{nk}|\leq d_{n})}{{a^{2}_{n}}} \\ =&C\sum_{n=1}^{\infty}\sum _{k=1}^{n} \frac{E{X^{2}_{nk}}I(|X_{nk}|\leq a_{n})}{{a^{2}_{n}}}+C\sum _{n=1}^{\infty}\sum_{k=1}^{n} \frac {E{X^{2}_{nk}}I(a_{n}< |X_{nk}|\leq d_{n})}{{a^{2}_{n}}} \\ \doteq&I'_{41}+I''_{41}. \end{aligned}$$

Since \(p\leq2\), Lemma 2.3 implies \(I'_{41}<\infty\). Now we prove that \(I''_{41}<\infty\). Since \(q<2\) and \((a_{n}+1)/a_{n}\rightarrow1\) as \(n\rightarrow\infty\), by Lemma 2.3 we have

$$\begin{aligned} I''_{41} \leq&C\sum _{n=1}^{\infty}\sum_{k=1}^{n} \frac{{d_{n}}^{2-q}}{{a^{2}_{n}}}E|X_{nk}|^{q}I\bigl(a_{n}< \vert X_{nk}\vert \leq d_{n}\bigr) \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n} \biggl(\frac{a_{n}+1}{a_{n}} \biggr)^{2-q}\frac {E|X_{nk}|^{q}I(|X_{nk}|>a_{n})}{{a^{q}_{n}}} \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n} \frac {E|X_{nk}|^{q}I(|X_{nk}|>a_{n})}{{a^{q}_{n}}} <\infty. \end{aligned}$$

Let \(t=u^{q}\) in \(I_{42}\). Note that for \(q<2\),

$$\begin{aligned}& \int_{d_{n}}^{\infty}u^{q-3}E{X^{2}_{nk}} I\bigl(d_{n}< \vert X_{nk}\vert \leq u\bigr)\,du \\& \quad = \int_{d_{n}}^{\infty}u^{q-3}E{X^{2}_{nk}} I\bigl(\vert X_{nk}\vert >d_{n}\bigr)\cdot I\bigl(\vert X_{nk}\vert \leq u\bigr)\,du \\& \quad = E \biggl[{X^{2}_{nk}} I\bigl(\vert X_{nk}\vert >d_{n}\bigr)\int_{\vert X_{nk}\vert }^{\infty}u^{q-3}I\bigl(\vert X_{nk}\vert \leq u\bigr)\,du \biggr] \\& \quad = E \biggl[{X^{2} _{nk}}I\bigl(\vert X_{nk}\vert >d_{n}\bigr)\int_{\vert X_{nk}\vert }^{\infty}u^{q-3}\,du \biggr] \\& \quad \leq CE\vert X_{nk}\vert ^{q}I\bigl(\vert X_{nk}\vert >d_{n}\bigr). \end{aligned}$$

Then, by Lemma 2.3 and \(d_{n}>a_{n}\), we have

$$\begin{aligned} I_{42} =&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}} \int_{d_{n}}^{\infty}u^{q-3}E{X^{2}_{nk}} I\bigl(d_{n}< \vert X_{nk}\vert \leq u\bigr)\, du \\ \leq&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}E|X_{nk}|^{q}I \bigl(\vert X_{nk}\vert >a_{n}\bigr)<\infty. \end{aligned}$$

This completes the proof of Theorem 3.1. □

Theorem 3.2

Let \(\{X_{nk}, k\geq1, n\geq1\} \) be an array of rowwise NSD random variables, and let \(\{a_{n},n\geq1\}\) be a sequence of positive real numbers such that \(a_{n}\uparrow\infty\). Also, let \(\{\Psi _{k}(t),k\geq1\}\) be a sequence of positive even functions satisfying (1.2) for some \(1\leq q< p\) with \(p>2\). Then conditions (1.3)-(1.5) imply (3.1).

Proof

Using the same notation and a similar argument as in the proof of Theorem 3.1, we can easily show that \(I_{1}<\infty\) and \(I_{3}<\infty\), and that (3.2) and (3.3) hold. To complete the proof, it remains to show that \(I_{4}<\infty\).

Take \(\delta=v\), where \(v\geq p>2\) is the integer from condition (1.5), and let \(d_{n}=[a_{n}]+1\). By (3.6), Markov’s inequality, Lemmas 2.1 and 2.2, and the \(C_{r}\)-inequality, we can get

$$\begin{aligned} I_{4} \leq& C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta /q}E\max _{1\leq j\leq n}\Biggl\vert \sum_{k=1}^{j}(Y_{nk}-E Y_{nk})\Biggr\vert ^{\delta}\,dt \\ \leq&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta/q} \Biggl[\sum _{k=1}^{n}E|Y_{nk}|^{\delta}+ \Biggl( \sum_{k=1}^{n}E{Y^{2}_{nk}} \Biggr)^{\delta/2} \Biggr]\,dt \\ =&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}t^{-\delta/q}E|Y_{nk}|^{\delta}\,dt +C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta/q} \Biggl(\sum _{k=1}^{n}E{Y^{2}_{nk}} \Biggr)^{\delta/2}\,dt \\ \doteq&I_{43}+I_{44}. \end{aligned}$$

For \(I_{43}\), we have

$$\begin{aligned} I_{43} =&C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta/q}E|X_{nk}|^{\delta}I\bigl(\vert X_{nk}\vert \leq d_{n}\bigr)\,dt \\ &{}+C\sum_{n=1}^{\infty}\sum _{k=1}^{n}{a^{-q}_{n}}\int _{{d^{q}_{n}}}^{\infty}t^{-\delta/q}E|X_{nk}|^{\delta}I\bigl(d_{n}< |X_{nk}|\leq t^{1/q}\bigr)\,dt \\ \doteq& I'_{43}+I''_{43}. \end{aligned}$$

By arguments similar to those for \(I_{41}<\infty\) and \(I_{42}<\infty\) (with the exponent 2 replaced by δ), we get \(I'_{43}<\infty\) and \(I''_{43}<\infty\).

For \(I_{44}\), since \(\delta>2\), we can see that

$$\begin{aligned} I_{44} =&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta /q} \Biggl(\sum _{k=1}^{n} E{X^{2}_{nk}}I\bigl( \vert X_{nk}\vert \leq a_{n}\bigr) \\ &\Biggl.\Biggl.{}+\sum _{k=1}^{n} E{X^{2}_{nk}}I \bigl(a_{n}< |X_{nk}|\leq t^{1/q}\bigr) \Biggr)\Biggr.^{\delta/2}\,dt \\ \leq&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta /q} \Biggl(\sum _{k=1}^{n} E{X^{2}_{nk}}I\bigl( \vert X_{nk}\vert \leq a_{n}\bigr) \Biggr)^{\delta/2} \,dt \\ &{}+C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}\Biggl(t^{-2/q}\sum _{k=1}^{n} E{X^{2}_{nk}}I \bigl(a_{n}<|X_{nk}|\leq t^{1/q}\bigr) \Biggr)^{\delta/2}\,dt \\ \doteq& I'_{44}+I''_{44}. \end{aligned}$$

Since \(\delta\geq p>q\) (so that \(\int_{a^{q}_{n}}^{\infty}t^{-\delta/q}\,dt=Ca_{n}^{q-\delta}\)), it follows from (1.5) that

$$\begin{aligned} I'_{44} =&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \Biggl(\sum_{k=1}^{n} E{X^{2}_{nk}}I \bigl(\vert X_{nk}\vert \leq a_{n}\bigr) \Biggr)^{\delta/2}\int_{{a^{q}_{n}}}^{\infty}t^{-\delta/q}\,dt \\ \leq&C\sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E{X^{2}_{nk}}I(|X_{nk}|\leq a_{n})}{{a^{2}_{n}}} \Biggr)^{\delta/2} \\ \leq&C\sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {EX^{2}_{nk}}{{a^{2}_{n}}} \Biggr)^{\delta/2}< \infty. \end{aligned}$$

Next we prove that \(I''_{44}<\infty\). First, consider the case \(1\leq q\leq2\). Since \(\delta>2\), by Lemma 2.3 we have

$$\begin{aligned} I''_{44} \leq&C\sum _{n=1}^{\infty}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}\Biggl(t^{-1}\sum _{k=1}^{n}E|X_{nk}|^{q}I \bigl(a_{n}< |X_{nk}|\leq t^{1/q}\bigr) \Biggr)^{\delta/2}\,dt \\ \leq&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \int_{{a^{q}_{n}}}^{\infty}\Biggl(t^{-1}\sum _{k=1}^{n}E|X_{nk}|^{q}I\bigl( \vert X_{nk}\vert >a_{n}\bigr) \Biggr)^{\delta/2}\,dt \\ =&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \Biggl(\sum_{k=1}^{n}E|X_{nk}|^{q}I \bigl(\vert X_{nk}\vert >a_{n}\bigr) \Biggr)^{\delta/2} \int_{{a^{q}_{n}}}^{\infty}t^{-\delta/2}\,dt \\ \leq&C\sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E|X_{nk}|^{q}I(|X_{nk}|>a_{n})}{{a^{q}_{n}}} \Biggr)^{\delta/2}< \infty. \end{aligned}$$

Finally, we prove that \(I''_{44}<\infty\) in the case \(2< q<p\). Since \(\delta>q\) and \(\delta>2\), we have by Lemma 2.3 that

$$\begin{aligned} I''_{44} \leq&C\sum _{n=1}^{\infty}{a^{-q}_{n}}\int _{{a^{q}_{n}}}^{\infty}\Biggl(t^{-2/q}\sum _{k=1}^{n}E{X^{2}_{nk}}I\bigl( \vert X_{nk}\vert >a_{n}\bigr) \Biggr)^{\delta/2}\,dt \\ =&C\sum_{n=1}^{\infty}{a^{-q}_{n}} \Biggl(\sum_{k=1}^{n}E{X^{2}_{nk}}I \bigl(\vert X_{nk}\vert >a_{n}\bigr) \Biggr)^{\delta/2} \int_{a^{q}_{n}}^{\infty}t^{-\delta/q}\,dt \\ \leq&C\sum_{n=1}^{\infty}\Biggl(\sum _{k=1}^{n}\frac {E{X^{2}_{nk}}I(|X_{nk}|>a_{n})}{{a^{2}_{n}}} \Biggr)^{\delta/2} \\ < & \infty. \end{aligned}$$

Thus \(I_{4}<\infty\), and the desired result follows. The proof is complete. □

Corollary 3.1

Let \(\{X_{nk},k\geq1, n\geq1\}\) be an array of rowwise NSD random variables with mean zero. If for some \(\alpha>0\) and \(v\geq2\),

$$ \max_{1\leq k\leq n}E|X_{nk}|^{v}=O \bigl(n^{\alpha}\bigr), $$
(3.7)

where \(q\geq1\) and \(\frac{v}{q}-\alpha>\max\{\frac{v}{2},2\}\), then for any \(\varepsilon>0\),

$$ \sum_{n=1}^{\infty}n^{-1}E \Biggl\{ \max_{1\leq j\leq n}\Biggl\vert \sum _{k=1}^{j}X_{nk}\Biggr\vert -\varepsilon n^{\frac {1}{q}} \Biggr\} ^{q}_{+}< \infty. $$
(3.8)

Proof

Put \(\Psi_{k}(|t|)=|t|^{v}\), \(p=v+\delta\) for some \(\delta>0\), and \(a_{n}=n^{1/q}\).

Since \(v\geq2\) and \(\frac{v}{q}-\alpha>\max\{\frac{v}{2},2\}\) (so that \(v>2q>q\)), we have

$$\begin{aligned}& \frac{\Psi_{k}(|t|)}{|t|^{q}}=|t|^{v-q}\uparrow, \\& \frac{\Psi_{k}(|t|)}{|t|^{p}}= \frac{|t|^{v}}{|t|^{p}}=\frac{1}{|t|^{\delta}}\downarrow \quad \mbox{as } |t|\uparrow \infty. \end{aligned}$$

It follows by (3.7) and \(\frac{v}{q}-\alpha>2\) that

$$ \sum_{n=1}^{\infty}\sum _{k=1}^{n}\frac{E\Psi_{k}(X_{nk})}{\Psi_{k}(a_{n})} =\sum _{n=1}^{\infty}\sum_{k=1}^{n} \frac{E|X_{nk}|^{v}}{n^{\frac{v}{q}}} \leq C\sum_{n=1}^{\infty}\frac{1}{n^{\frac{v}{q}-\alpha-1}}< \infty. $$
(3.9)

Since \(v\geq2\), by Jensen’s inequality and (3.7) it follows that

$$ \sum_{k=1}^{n}\frac{E|X_{nk}|^{2}}{n^{\frac{2}{q}}} \leq\sum _{k=1}^{n}\frac{(E|X_{nk}|^{v})^{\frac{2}{v}}}{n^{\frac{2}{q}}} \leq C \frac{1}{n^{\frac{2}{q}-\frac{2\alpha}{v}-1}}. $$

Note that \(\frac{2}{q}-\frac{2\alpha}{v}-1>0\), since \(\frac{v}{q}-\alpha>\frac{v}{2}\) is equivalent to \(\frac{2}{q}-\frac{2\alpha}{v}>1\). Take \(s>p\) such that \(\frac{s}{2}(\frac{2}{q}-\frac{2\alpha}{v}-1)>1\). Therefore,

$$ \sum_{n=1}^{\infty}\Biggl[\sum _{k=1}^{n}\frac{E|X_{nk}|^{2}}{n^{\frac {2}{q}}} \Biggr]^{s/2}< \infty. $$
(3.10)

Combining Theorem 3.2 with (3.9) and (3.10), which verify conditions (1.4) and (1.5) (the latter with \(v=s\)), we obtain Corollary 3.1 immediately. □
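The exponent bookkeeping in Corollary 3.1 is easy to slip on; the toy helper below (our addition, with arbitrary sample values) checks the hypothesis \(\frac{v}{q}-\alpha>\max\{\frac{v}{2},2\}\) and reports the series exponents that make (3.9) and (3.10) converge:

```python
# Exponent check for Corollary 3.1 (the values v=4, q=1.5, alpha=0.5 are
# arbitrary sample inputs, not taken from the paper).
def check_corollary(v: float, q: float, alpha: float):
    assert v >= 2 and alpha > 0 and q >= 1
    hypothesis = v / q - alpha > max(v / 2, 2)
    exp_39 = v / q - alpha - 1            # (3.9) converges iff this exceeds 1
    exp_310 = 2 / q - 2 * alpha / v - 1   # (3.10) needs this to be positive
    return hypothesis, exp_39, exp_310

print(check_corollary(v=4, q=1.5, alpha=0.5))
# (True, 1.1666..., 0.0833...): hypothesis holds and both series converge
```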

Remark 3.1

Note that in this paper we consider the case \(1\leq q<p\), which has a wider scope than the case \(q=1\) in Shen et al. [7]. In addition, compared with sequences of NSD random variables, arrays of rowwise NSD random variables share many related properties while admitting a wider range of applications, so they are well worth studying in their own right.

Remark 3.2

Under the conditions of Theorem 3.1, we have

$$\begin{aligned} \infty >&\sum_{n=1}^{\infty}a^{-q}_{n} E \Biggl\{ \max_{1\leq j\leq n}\Biggl\vert \sum_{k=1}^{j} X_{nk} \Biggr\vert -\varepsilon a_{n} \Biggr\} _{+}^{q} \\ =&\sum_{n=1}^{\infty}a^{-q}_{n} \int_{0}^{\infty}P \Biggl\{ \max_{1\leq j\leq n} \Biggl\vert \sum_{k=1}^{j} X_{nk}\Biggr\vert -\varepsilon a_{n}>t^{1/q} \Biggr\} \, dt \\ \geq&\sum_{n=1}^{\infty}a^{-q}_{n} \int_{0}^{\varepsilon^{q} a^{q}_{n}} P \Biggl\{ \max_{1\leq j\leq n} \Biggl\vert \sum_{k=1}^{j} X_{nk}\Biggr\vert -\varepsilon a_{n}>\varepsilon a_{n} \Biggr\} \, dt \\ =&\varepsilon^{q}\sum_{n=1}^{\infty}P \Biggl\{ \max_{1\leq j\leq n}\Biggl\vert \sum _{k=1}^{j} X_{nk}\Biggr\vert >2 \varepsilon a_{n} \Biggr\} . \end{aligned}$$

Then \(\sum_{n=1}^{\infty}P \{\max_{1\leq j\leq n}\vert \sum_{k=1}^{j} X_{nk}\vert >2\varepsilon a_{n} \}<\infty\) for all \(\varepsilon>0\), that is, \(\frac{1}{a_{n}}\max_{1\leq j\leq n}\vert \sum_{k=1}^{j} X_{nk}\vert \rightarrow0\) completely as \(n\rightarrow\infty\), and the conclusion of Theorem A is obtained directly. Hence Theorem 3.1 implies Theorem A and generalizes the corresponding result.