Complete convergence and complete moment convergence for negatively associated sequences of random variables

Abstract

In this paper, we study complete convergence and complete moment convergence for negatively associated sequences of random variables with \(\mathbb{E}X=0\), \(\mathbb{E}\exp(\ln^{\alpha}|X| )<\infty\), \(\alpha>1\). As a result, we extend some complete convergence and complete moment convergence theorems for independent random variables to the case of negatively associated random variables without imposing any extra conditions. Our results generalize corresponding results obtained by Gut and Stadtmüller (Stat. Probab. Lett. 81:1486-1492, 2011) and Qiu and Chen (Stat. Probab. Lett. 91:76-82, 2014).

Introduction and main results

Definition 1.1

Random variables \(X_{1},X_{2},\ldots,X_{n}\), \(n\geq2\), are said to be negatively associated (NA) if for every pair of disjoint subsets \(A_{1}\) and \(A_{2}\) of \(\{1,2,\ldots,n\}\),

$$\operatorname{cov}\bigl(f_{1}(X_{i}; i\in A_{1}),f_{2}(X_{j}; j\in A_{2})\bigr) \leq0, $$

where \(f_{1}\) and \(f_{2}\) are coordinatewise increasing (or coordinatewise decreasing) functions such that this covariance exists. A sequence of random variables \(\{X_{i}; i\geq1\}\) is said to be NA if every finite subfamily is NA.

By Joag-Dev and Proschan (1983 [3]), we have the following lemma.

Lemma 1.2

(Joag-Dev and Proschan, 1983 [3])

Let \(\{X_{i}; i\geq1\}\) be a sequence of NA random variables.

  1. (i)

    If \(\{f_{i}; i\geq1\}\) is a sequence of nondecreasing (or nonincreasing) functions, then \(\{f_{i}(X_{i}); i\geq1\}\) is also a sequence of NA random variables.

  2. (ii)

    Increasing functions defined on disjoint subsets of a set of negatively associated random variables are negatively associated.

This definition was introduced by Joag-Dev and Proschan (1983 [3]). Statistical tests depend greatly on sampling. Random sampling without replacement from a finite population is NA, but it is not independent. NA sampling has wide applications, for example in multivariate statistical analysis and reliability theory. Because of these wide applications, the limit behavior of NA random variables has received more and more attention recently. One may refer to Joag-Dev and Proschan (1983 [3]) for fundamental properties, Newman (1984 [4]) for the central limit theorem, Matula (1992 [5]) for the three-series theorem, and Shao (2000 [6]) for moment inequalities.
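As a concrete numerical illustration of the sampling-without-replacement example (this snippet is ours, not part of the original paper; the function and variable names are hypothetical), the following Python code computes the exact covariance of two inclusion indicators by enumerating all samples and confirms that it is negative:

```python
from itertools import combinations

def inclusion_cov(N, n, i, j):
    """Exact covariance of the indicators I{i in sample} and I{j in sample}
    when a sample of size n is drawn without replacement from {0,...,N-1}."""
    samples = list(combinations(range(N), n))
    p_i = sum(i in s for s in samples) / len(samples)
    p_j = sum(j in s for s in samples) / len(samples)
    p_ij = sum(i in s and j in s for s in samples) / len(samples)
    return p_ij - p_i * p_j

# Closed form: cov = n(n-1)/(N(N-1)) - (n/N)^2, which is < 0 for 0 < n < N.
print(inclusion_cov(6, 3, 0, 1))  # negative, matching 6/30 - 1/4 = -0.05
```

The negative covariance of these (increasing) indicator functions is exactly the NA property restricted to this simple pair of coordinates.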

The concept of complete convergence of a sequence of random variables was introduced by Hsu and Robbins (1947 [7]). In view of the Borel-Cantelli lemma, complete convergence implies almost sure convergence. Chow (1988 [8]) first investigated complete moment convergence, which is more exact than complete convergence. Thus, complete convergence and complete moment convergence are two of the most important problems in probability theory. Recent results can be found in Wu (2012 [9], 2015 [10]), Xu and Tang (2014 [11]), Guo et al. (2014 [12]), Gut and Stadtmüller (2011 [1]), and Qiu and Chen (2014 [2]). In particular, Gut and Stadtmüller (2011 [1]) and Qiu and Chen (2014 [2]) obtained, respectively, complete convergence and complete moment convergence theorems for independent identically distributed sequences of random variables with \(\mathbb{E}X=0\), \(\mathbb{E}\exp(\ln^{\alpha}|X| )<\infty\), \(\alpha>1\). In this paper, based on Gut and Stadtmüller (2011 [1]) and Qiu and Chen (2014 [2]), we extend the complete convergence and complete moment convergence theorems for independent random variables to negatively associated sequences of random variables without imposing any extra conditions, thereby extending the corresponding results of Gut and Stadtmüller (2011 [1]) and Qiu and Chen (2014 [2]).

In the following, the symbol c stands for a generic positive constant which may differ from one place to another. Let \(a_{n}\ll b_{n}\) denote that there exists a constant \(c>0\) such that \(a_{n}\leq cb_{n}\) for sufficiently large n, let \(\ln x\) mean \(\ln(\max(x,\mathrm{e}))\), and let I denote an indicator function.

Theorem 1.3

Let \(\alpha>1\) and let \(\{X,X_{n};n\geq1\}\) be a sequence of identically distributed NA random variables with partial sums \(S_{n}=\sum_{i=1}^{n}X_{i}\), \(n\geq1\). Suppose that

$$ \mathbb{E}X=0,\qquad\mathbb{E}\exp\bigl(\ln^{\alpha} \vert X\vert \bigr)< \infty, $$
(1.1)

Then

$$ \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2}}P \Bigl(\max_{1\leq k\leq n}|S_{k}|>n \beta\Bigr)< \infty\quad\textit{for all } \beta>1. $$
(1.2)

Conversely, if (1.2) holds for some \(\beta>0\), then \(\mathbb{E}\exp(\ln^{\alpha}|X/(2\beta)| )<\infty\). Furthermore, if \(\beta\leq1/2\), then \(\mathbb{E}\exp(\ln^{\alpha}|X|)<\infty\); if \(\beta>1/2\), then \(\mathbb{E}\exp((1-\lambda)\ln^{\alpha}|X| )<\infty\) for any \(\lambda>0\).

Theorem 1.4

Assume that the conditions of Theorem  1.3, including (1.1), hold. Then

$$ \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}\mathbb{E} \Bigl\{ \max _{1\leq k\leq n}|S_{k}|-\beta n \Bigr\} _{+}^{q}< \infty \quad\quad\textit{for all } \beta>1 \textit{ and all } q>0. $$
(1.3)

Conversely, if (1.3) holds for some \(\beta>0\), then \(\mathbb{E}\exp(\ln^{\alpha}|X/(2\beta)| )<\infty\).

Remark 1.5

By mimicking the analogous part in the proof of Theorem 2.1 in Qiu and Chen (2014 [2]), (1.2) and (1.3) imply, respectively,

$$\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2}}P \biggl(\sup_{k\geq n} \biggl\vert \frac{S_{k}}{k}\biggr\vert >\beta\biggr)< \infty\quad \mbox{for all } \beta>1 $$

and

$$\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2}}\mathbb{E} \biggl\{ \sup _{k\geq n}\biggl\vert \frac{S_{k}}{k}\biggr\vert -\beta\biggr\} _{+}^{q}< \infty\quad\mbox{for all } \beta>1 \mbox{ and all } q>0. $$

Remark 1.6

Corresponding results of Gut and Stadtmüller (2011 [1]) and Qiu and Chen (2014 [2]) are the special cases of our Theorems 1.3 and 1.4 when \(\{X, X_{n}; n\geq1\}\) is i.i.d.

Proofs

The following two lemmas will be useful in the proofs of our theorems, and the first is due to Shao (2000 [6]).

Lemma 2.1

(Shao, 2000 [6], Theorem 3)

Let \(\{X_{i}; 1\leq i\leq n\}\) be a sequence of negatively associated random variables with zero means and finite second moments. Let \(S_{k}=\sum_{i=1}^{k}X_{i}\) and \(B_{n}=\sum_{i=1}^{n}\mathbb{E}X_{i}^{2}\). Then, for all \(y>0\), \(a>0\) and \(0<\theta<1\),

$$P \Bigl(\max_{1\leq k\leq n}S_{k}\geq y \Bigr)\leq P \Bigl( \max_{1\leq k\leq n}X_{k}>a \Bigr)+{1\over 1-\theta}\exp \biggl(-{y^{2}\theta\over 2(ay+B_{n})} \biggl\{ 1+{2\over 3}\ln\biggl(1+ {ay\over B_{n}} \biggr) \biggr\} \biggr). $$
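As a quick numerical sanity check of Shao's maximal inequality (this check is ours, not part of the original paper; independent standard normals form a trivial NA family, and all names below are hypothetical), the following Python sketch compares a Monte Carlo estimate of the left-hand side with the bound for one choice of parameters:

```python
import math
import random

def shao_bound(y, a, theta, B_n, p_max_exceed_a):
    """Right-hand side of Shao's maximal inequality (Lemma 2.1)."""
    tail = math.exp(-(y * y * theta) / (2 * (a * y + B_n))
                    * (1 + (2 / 3) * math.log(1 + a * y / B_n)))
    return p_max_exceed_a + tail / (1 - theta)

def estimate_lhs(n, y, trials=20000, seed=0):
    """Monte Carlo estimate of P(max_k S_k >= y) for i.i.d. N(0,1) summands."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s, m = 0.0, 0.0
        for _ in range(n):
            s += rng.gauss(0.0, 1.0)
            m = max(m, s)
        if m >= y:
            hits += 1
    return hits / trials

n, y, a, theta = 50, 15.0, 3.0, 0.5
B_n = float(n)            # sum of variances of N(0,1) summands
p_a = n * 0.00135         # crude union upper bound for P(max X_k > 3)
print(estimate_lhs(n, y), "<=", shao_bound(y, a, theta, B_n, p_a))
```

With these parameters the Monte Carlo estimate is roughly an order of magnitude below the bound, as expected for a one-sided exponential inequality.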

Lemma 2.2

For any random variable X and \(\alpha>0\),

$$\mathbb{E}\exp\bigl(\ln^{\alpha} \vert X\vert \bigr)< \infty\quad \Leftrightarrow\quad\sum_{n=1}^{\infty}\exp \bigl(\ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n}P\bigl(\vert X\vert > n \bigr)< \infty. $$

Proof

Let \(a_{n}\approx b_{n}\) denote that there exist constants \(c_{1}>0\) and \(c_{2}>0\) such that \(c_{1}a_{n}\leq b_{n}\leq c_{2}a_{n}\) for sufficiently large n. We have

$$\begin{aligned}& \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n}P\bigl(\vert X\vert > n\bigr) \\& \quad=\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n}\sum_{j=n}^{\infty}P\bigl(j< \vert X\vert \leq j+1\bigr) \\& \quad=\sum_{j=1}^{\infty}P\bigl(j< \vert X \vert \leq j+1\bigr)\sum_{n=1}^{j}\exp \bigl(\ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n} \\& \quad\approx\sum_{j=1}^{\infty}\exp\bigl( \ln^{\alpha}j \bigr)\mathbb{E}I\bigl(j< \vert X\vert \leq j+1\bigr) \\& \quad\approx\sum_{j=1}^{\infty}\mathbb{E}\exp \bigl(\ln^{\alpha} \vert X\vert \bigr)I\bigl(j< \vert X\vert \leq j+1\bigr) \\& \quad\approx\mathbb{E}\exp\bigl(\ln^{\alpha} \vert X\vert \bigr), \end{aligned}$$

it follows that Lemma 2.2 holds. □
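The key "≈" step in this proof is that the partial sum \(\sum_{n\leq j}\exp(\ln^{\alpha}n)\ln^{\alpha-1}n/n\) grows like a constant multiple of \(\exp(\ln^{\alpha}j)\) (the substitution \(u=\ln^{\alpha}x\) gives the constant \(1/\alpha\)). The following Python check (ours, purely illustrative, hypothetical names) verifies this asymptotic numerically:

```python
import math

def weighted_partial_sum(j, alpha):
    """Partial sum: sum_{n=2}^{j} exp(ln^alpha n) * ln^(alpha-1) n / n."""
    total = 0.0
    for n in range(2, j + 1):
        ln = math.log(n)
        total += math.exp(ln ** alpha) * ln ** (alpha - 1) / n
    return total

# Substituting u = ln^alpha x in the comparison integral shows the sum
# grows like (1/alpha) * exp(ln^alpha j), which drives the "≈" steps above.
alpha, j = 2.0, 100000
ratio = weighted_partial_sum(j, alpha) / math.exp(math.log(j) ** alpha)
print(ratio)   # close to 1/alpha = 0.5
```

The same ratio computed for other \(\alpha>1\) approaches \(1/\alpha\) as j grows, which is exactly the constant absorbed by the "≈" convention of the proof.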

Proof of Theorem 1.3

Let \(\beta>1\) be arbitrary. For \(n\geq1\), set \(b_{n}=\beta n/(10\ln^{\alpha}n)\), and for \(1\leq k\leq n\) define

$$\begin{aligned}& X'_{k}=X_{k}I\{X_{k}\leq b_{n}\}+b_{n}I\{X_{k}>b_{n}\},\qquad S'_{n}=\sum_{k=1}^{n}X'_{k}, \\& X''_{k}=(X_{k}-b_{n})I \{b_{n}< X_{k}\leq n\}, \qquad X'''_{k}=(X_{k}-b_{n})I \{X_{k}> n\}. \end{aligned}$$
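These truncated pieces reassemble \(X_{k}\) exactly in every case; a quick self-contained check (ours, with hypothetical names and arbitrary illustrative values of \(b_{n}\) and n):

```python
def truncation_parts(x, b, n):
    """Decompose x as in the proof: x = x1 + x2 + x3 with
    x1 = x*I{x<=b} + b*I{x>b}, x2 = (x-b)*I{b<x<=n}, x3 = (x-b)*I{x>n}."""
    x1 = x if x <= b else b
    x2 = (x - b) if b < x <= n else 0.0
    x3 = (x - b) if x > n else 0.0
    return x1, x2, x3

# The identity X_k = X'_k + X''_k + X'''_k holds in each of the three regimes:
for x in (-7.0, 0.3, 2.5, 40.0):
    parts = truncation_parts(x, b=1.0, n=10.0)
    assert abs(sum(parts) - x) < 1e-12
print("decomposition verified")
```

Note that only the first piece is an increasing function of x, which is why the NA closure property (Lemma 1.2(i)) is applied to \(\{X'_{k}\}\) alone.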

Obviously, \(X_{k}=X'_{k}+X''_{k}+X'''_{k}\), and \(X'_{k}\) is an increasing function of \(X_{k}\); thus, by Lemma 1.2(i), \(\{X'_{k}; k\geq1\}\) is also a sequence of NA random variables. Note that

$$\begin{aligned}& \Bigl\{ \max_{1\leq k\leq n}S_{k}>n\beta\Bigr\} \\& \quad\subseteq\Bigl\{ \max_{1\leq k\leq n}S_{k}>n\beta\mbox{ and } X_{k}\leq b_{n} \mbox{ for all } k\leq n \Bigr\} \\& \qquad{}\cup\Bigl\{ \max_{1\leq k\leq n}S_{k}>n\beta\mbox{ and } b_{n}< X_{k_{0}}\leq n \mbox{ for exactly one } k_{0}\leq n \mbox{ and} \\& \qquad X_{j}\leq b_{n} \mbox{ for all } j\neq k_{0} \Bigr\} \\& \qquad{}\cup\bigl\{ X''_{k}\neq0 \mbox{ for at least two } k\leq n\bigr\} \\& \qquad {}\cup\bigl\{ X'''_{k} \neq0 \mbox{ for at least one } k\leq n\bigr\} \\& \quad\, \hat{=}\, A_{n}\cup B_{n}\cup C_{n}\cup D_{n}. \end{aligned}$$

Therefore,

$$ P \Bigl(\max_{1\leq k\leq n}S_{k}>n\beta\Bigr)\leq P(A_{n})+P(B_{n})+P(C_{n})+P(D_{n}). $$
(2.1)

By condition (1.1), that is, \(\mathbb{E}X=0\) and \(\mathbb{E}\exp(\ln^{\alpha}|X| )<\infty\) with \(\alpha>1\), we get \(\mathbb{E}XI(X\leq b_{n})=-\mathbb{E}XI(X> b_{n})\) and \(\mathbb {E}X^{2}<\infty\). Since \(\mathbb{E}X^{2}<\infty\) implies \(\mathbb {E}X^{2}I(|X|> b_{n})\rightarrow0\) as \(n\rightarrow\infty\), setting \(\delta\,\hat{=}\,1-\beta^{-1}>0\), we have, for sufficiently large n,

$$\begin{aligned} \max_{1\leq k\leq n}\bigl\vert \mathbb{E}S'_{k} \bigr\vert \leq&\max_{1\leq k\leq n}\bigl\vert k\mathbb{E}XI(X\leq b_{n})\bigr\vert +nb_{n}\mathbb{E}I(X> b_{n}) \\ \leq& n\mathbb{E}\vert X\vert I\bigl(\vert X\vert > b_{n} \bigr)+nb^{-1}_{n}\mathbb{E}X^{2}I\bigl(\vert X \vert > b_{n}\bigr) \\ \leq&\frac{2n\mathbb{E}X^{2}I(\vert X\vert > b_{n})}{b_{n}} =\frac{20\ln ^{\alpha}n}{\beta}\mathbb{E}X^{2}I\bigl( \vert X\vert > b_{n}\bigr) \\ \leq& \beta\delta\ln^{\alpha}n, \end{aligned}$$
(2.2)

so that, taking \(y=(n-\delta\ln^{\alpha}n)\beta\), \(a=2b_{n}\), \(\theta=4/5\) in Lemma 2.1, for sufficiently large n, we get

$$\begin{aligned} P(A_{n}) \leq& P \Bigl(\max_{1\leq k\leq n}S'_{k}>n \beta\Bigr) \\ \leq& P \Bigl(\max_{1\leq k\leq n}\bigl(S'_{k}- \mathbb{E}S'_{k}\bigr)>\bigl(n-\delta\ln^{\alpha}n \bigr)\beta\Bigr) \\ \ll& \exp\biggl(-\frac{4(n-\delta\ln^{\alpha}n)^{2}\beta^{2}}{10(\frac {\beta^{2} n(n-\delta\ln^{\alpha}n)}{5\ln^{\alpha}n}+n\mathbb{E}X^{2})} \biggr) \\ =& \exp\biggl(-\frac{2\beta^{2} (1-\frac{\delta\ln^{\alpha}n}{n} )^{2}}{\beta^{2} (1-\frac{\delta\ln^{\alpha}n}{n} )+\frac{5\mathbb{E}X^{2}\ln ^{\alpha}n}{n}}\ln^{\alpha}n \biggr) \\ \leq& \exp\bigl(-\ln^{\alpha}n\bigr), \end{aligned}$$
(2.3)

where the last inequality follows from \(\frac{2\beta^{2} (1-\frac{\delta\ln^{\alpha}n}{n} )^{2}}{\beta^{2} (1-\frac{\delta\ln^{\alpha}n}{n} )+\frac{5\mathbb{E}X^{2}\ln^{\alpha}n}{n}}\rightarrow2>1\) as \(n\rightarrow\infty\). By the Markov inequality, (1.1), and the fact that \((\ln n+\ln(\beta /10)-\alpha\ln\ln n)^{\alpha}/\ln^{\alpha}n\rightarrow1>1-\delta/2\) as \(n\rightarrow\infty\) (so that, for sufficiently large n, \((\ln n+\ln(\beta /10)-\alpha\ln\ln n)^{\alpha}\geq(1-\delta/2)\ln^{\alpha}n\)), we obtain

$$\begin{aligned} P\bigl(\vert X\vert >b_{n}\bigr) \leq&\frac{\mathbb{E}\exp(\ln^{\alpha}|X|) }{\exp(\ln^{\alpha}b_{n})} \\ \ll& \frac{1}{\exp((\ln n+\ln(\beta/10)-\alpha\ln\ln n)^{\alpha})} \\ \leq&\exp\bigl(-(1-\delta/2)\ln^{\alpha}n\bigr), \end{aligned}$$
(2.4)

and, hence, since by Lemma 1.2(ii) \(\max_{1\leq k\leq n}\sum_{1\leq i\leq k, i\neq k_{0}}X'_{i}\) and \(X_{k_{0}}\) are NA random variables, we get

$$\begin{aligned} P(B_{n}) \leq& P \biggl(\exists1\leq k_{0}\leq n \mbox{ such that } \max_{1\leq k\leq n}\sum_{1\leq i\leq k, i\neq k_{0}}X'_{i}> \beta n-n, X_{k_{0}}>b_{n} \biggr) \\ \leq&\sum_{k_{0}=1}^{n} P \biggl(\max _{1\leq k\leq n}\sum_{1\leq i\leq k, i\neq k_{0}}X'_{i}> \beta n-n=\beta\delta n \biggr)P(X_{k_{0}}>b_{n}). \end{aligned}$$
(2.5)

Similar to the proof of (2.2), we have \(\max_{1\leq k\leq n}|\mathbb{E}\sum_{1\leq i\leq k, i\neq k_{0}}X'_{i}|\leq\beta\delta\ln ^{\alpha}n\), so that, taking \(y=\beta\delta(n-\ln^{\alpha}n)\), \(a=2b_{n}\), \(\theta=4/5\) in Lemma 2.1, using the fact that \(\frac{2\beta^{2}\delta (1-\frac{\ln^{\alpha}n}{n} )^{2}}{\beta^{2}\delta(1-\frac{\ln^{\alpha}n}{n} )+\frac{5\mathbb{E}X^{2}(n-1)\ln^{\alpha}n}{n^{2}}}\rightarrow2>1\) as \(n\rightarrow\infty\), for sufficiently large n, we get

$$\begin{aligned}& P \biggl(\max_{1\leq k\leq n}\sum_{1\leq i\leq k, i\neq k_{0}}X'_{i}> \beta\delta n \biggr) \\& \quad\leq P \biggl(\max_{1\leq k\leq n}\sum _{1\leq i\leq k, i\neq k_{0}}\bigl(X'_{i}- \mathbb{E}X'_{i}\bigr)>\beta\delta\bigl(n- \ln^{\alpha}n\bigr) \biggr) \\& \quad\ll\exp\biggl(-\frac{4(n-\ln^{\alpha}n)^{2}\beta^{2}\delta ^{2}}{10(\frac{\beta^{2} \delta n(n-\ln^{\alpha}n)}{5\ln^{\alpha}n}+(n-1)\mathbb{E}X^{2})} \biggr) \\& \quad= \exp\biggl(-\frac{2\beta^{2}\delta(1-\frac{\ln^{\alpha}n}{n} )^{2}}{\beta^{2}\delta(1-\frac{\ln^{\alpha}n}{n} )+\frac{5\mathbb {E}X^{2}(n-1)\ln^{\alpha}n}{n^{2}}}\delta\ln^{\alpha}n \biggr) \\& \quad\leq\exp\bigl(-\delta\ln^{\alpha}n\bigr). \end{aligned}$$

Substituting the above inequality and (2.4) in (2.5), we obtain

$$\begin{aligned} P(B_{n}) \ll& n\exp\bigl(-\delta\ln^{\alpha}n-(1-\delta/2) \ln^{\alpha}n\bigr) \\ =&\exp\bigl(-\ln^{\alpha}n\bigr)\frac{n}{(\mathrm {e}^{\ln n})^{(\delta\ln^{\alpha-1}n)/2}} \\ \leq&\exp\bigl(-\ln^{\alpha}n\bigr). \end{aligned}$$
(2.6)

By (2.4),

$$\begin{aligned} P(C_{n}) =&P \bigl(\exists1\leq k_{1}< k_{2}\leq n \mbox{ such that } X''_{k_{1}}\neq0, X''_{k_{2}}\neq0 \bigr) \\ \leq& \sum_{1\leq k_{1}< k_{2}\leq n} P(X_{k_{1}}>b_{n}, X_{k_{2}}>b_{n})\leq n^{2} P^{2}\bigl( \vert X\vert >b_{n}\bigr) \\ \ll& n^{2} \bigl(\exp\bigl(-(1-\delta/2)\ln^{\alpha}n\bigr) \bigr)^{2} \\ =&n^{2}\exp\bigl(-2(1-\delta/2)\ln^{\alpha}n \bigr)=n^{2}\exp\bigl(-\ln^{\alpha}n-(1-\delta)\ln^{\alpha}n\bigr) \\ =& \exp\bigl(-\ln^{\alpha}n\bigr)\frac{n^{2}}{(\mathrm{e}^{\ln n})^{(1-\delta)\ln^{\alpha-1}n}} \\ \leq&\exp\bigl(-\ln^{\alpha}n\bigr). \end{aligned}$$
(2.7)

This, together with (2.1), (2.3), (2.6), (2.7), and the trivial bound \(P(D_{n})\leq nP(X> n)\leq nP(\vert X\vert > n)\), shows

$$ P \Bigl(\max_{1\leq k\leq n}S_{k}>\beta n \Bigr) \ll\exp \bigl(-\ln^{\alpha}n\bigr)+nP\bigl(\vert X\vert > n\bigr). $$
(2.8)

Because \(-X_{k}\) is a decreasing function of \(X_{k}\), by Lemma 1.2(i), \(\{-X, -X_{k}; k\geq1\}\) is also a sequence of NA random variables. Obviously, \(\{-X, -X_{k}; k\geq1\}\) also satisfies condition (1.1). Therefore, replacing \(X_{k}\) by \(-X_{k}\) in (2.8), we get

$$P \Bigl(\max_{1\leq k\leq n}(-S_{k})>\beta n \Bigr) \ll\exp \bigl(-\ln^{\alpha}n\bigr)+nP\bigl(\vert X\vert > n\bigr). $$

Thus,

$$\begin{aligned} P \Bigl(\max_{1\leq k\leq n}|S_{k}|>\beta n \Bigr) \leq& P \Bigl(\max_{1\leq k\leq n}S_{k}>\beta n \Bigr)+P \Bigl(\max _{1\leq k\leq n}(-S_{k})>\beta n \Bigr) \\ \ll&\exp\bigl(-\ln^{\alpha}n\bigr)+nP\bigl(\vert X\vert > n\bigr). \end{aligned}$$
(2.9)

From (2.9), (1.1), and Lemma 2.2,

$$\begin{aligned}& \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2}}P \Bigl(\max_{1\leq k\leq n}|S_{k}|> \beta n \Bigr) \\& \quad\ll\sum_{n=1}^{\infty}\frac{\ln^{\alpha-1}n}{n^{2}}+ \sum_{n=1}^{\infty}\exp\bigl(\ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n}P\bigl(\vert X\vert > n\bigr) \\& \quad< \infty. \end{aligned}$$

That is, (1.2) holds.

Conversely, if (1.2) holds, then, combining it with \(\max_{1\leq k\leq n}|X_{k}|\leq2\max_{1\leq k\leq n}|S_{k}|\), it follows that

$$ \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2}}P \Bigl(\max_{1\leq k\leq n}|X_{k}|>2 \beta n \Bigr)< \infty, $$
(2.10)

This implies that \(P(\max_{1\leq k\leq n}|X_{k}|>2\beta n)\rightarrow0\) as \(n\rightarrow\infty\); hence, for sufficiently large n,

$$ P \Bigl(\max_{1\leq k\leq n}|X_{k}|>2\beta n \Bigr)< \frac{1}{2}. $$
(2.11)

Obviously, NA implies pairwise negative quadrant dependence (PNQD) directly from the definitions. Thus, by Lemma 1.4 of Wu (2012 [9]),

$$\Bigl(1-P \Bigl(\max_{1\leq k\leq n}|X_{k}|>2\beta n \Bigr) \Bigr)^{2}\sum_{k=1}^{n} P\bigl( \vert X_{k}\vert >2\beta n\bigr)\leq cP \Bigl(\max _{1\leq k\leq n}|X_{k}|>2\beta n \Bigr), $$

from which, combining with (2.11), we have

$$nP\bigl(\vert X\vert >2\beta n\bigr)\leq c P \Bigl(\max_{1\leq k\leq n}|X_{k}|>2 \beta n \Bigr). $$

Consequently, by (2.10),

$$\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n}P \biggl(\frac{|X|}{2\beta }> n \biggr)< \infty, $$

and, hence, \(\mathbb{E}\exp(\ln^{\alpha}|X/(2\beta)| )<\infty\) by Lemma 2.2. Therefore, if \(0<\beta\leq1/2\), then \(\mathbb{E}\exp(\ln^{\alpha}|X| )\leq\mathbb{E}\exp(\ln^{\alpha}|X/(2\beta )| )<\infty\); if \(\beta>1/2\), then for any \(\lambda>0\),

$$\frac{(1-\lambda)\ln^{\alpha}x}{\ln^{\alpha}(x/(2\beta))}\rightarrow 1-\lambda< 1, \quad\mbox{as } x\rightarrow+\infty. $$

This implies that there exists a constant M such that for all \(x\geq M\), we have \((1-\lambda)\ln^{\alpha}x\leq\ln^{\alpha}(x/(2\beta))\). Hence,

$$\begin{aligned} \mathbb{E}\exp\bigl((1-\lambda)\ln^{\alpha} \vert X\vert \bigr) =& \mathbb{E}\exp\bigl((1-\lambda)\ln^{\alpha} \vert X\vert \bigr)I\bigl( \vert X\vert \leq M\bigr) \\ &{}+\mathbb{E}\exp\bigl((1-\lambda)\ln^{\alpha} \vert X\vert \bigr)I\bigl(\vert X\vert > M\bigr) \\ \leq& c+\mathbb{E}\exp\bigl(\ln^{\alpha}\bigl\vert X/(2\beta)\bigr\vert \bigr)I\bigl(\vert X\vert > M\bigr) \\ \ll&\mathbb{E}\exp\bigl( \ln^{\alpha}\bigl\vert X/(2\beta)\bigr\vert \bigr) \\ < &\infty. \end{aligned}$$

This completes the proof of Theorem 1.3. □

Proof of Theorem 1.4

Note that

$$\begin{aligned}& \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}\mathbb{E} \Bigl\{ \max _{1\leq k\leq n}|S_{k}|-\beta n \Bigr\} _{+}^{q} \\& \quad= \beta^{q}\sum_{n=1}^{\infty}\exp\bigl(\ln^{\alpha}n\bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}} \int_{0}^{n} q x^{q-1} P \Bigl(\max _{1\leq k\leq n}|S_{k}|-\beta n>\beta x \Bigr)\,\mathrm{d}x \\& \qquad{}+\beta^{q}\sum_{n=1}^{\infty}\exp\bigl(\ln^{\alpha}n\bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}} \int_{n}^{\infty} q x^{q-1}P \Bigl(\max _{1\leq k\leq n}|S_{k}|-\beta n>\beta x \Bigr)\,\mathrm{d}x \\& \quad\ll\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n\bigr)\frac{\ln^{\alpha-1}n}{n^{2}}P \Bigl(\max_{1\leq k\leq n}|S_{k}|> \beta n \Bigr) \\& \qquad{}+\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}} \int_{n}^{\infty} x^{q-1}P \Bigl(\max _{1\leq k\leq n}|S_{k}|>\beta x \Bigr)\,\mathrm{d}x. \end{aligned}$$

Hence, by (1.2), in order to establish (1.3), it suffices to prove that

$$ \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}} \int_{n}^{\infty} x^{q-1}P \Bigl(\max _{1\leq k\leq n}|S_{k}|>\beta x \Bigr)\,\mathrm{d}x< \infty. $$
(2.12)

Let \(\beta>1\) be arbitrary. For \(x\geq n\), set \(b_{x}=\beta x/(10\ln^{\alpha}x)\), and for \(1\leq k\leq n\) define

$$\begin{aligned}& Y'_{k}=X_{k}I\{X_{k}\leq b_{x}\}+b_{x}I\{X_{k}>b_{x}\},\qquad U'_{n}=\sum_{k=1}^{n}Y'_{k}, \\& Y''_{k}=(X_{k}-b_{x})I \{b_{x}< X_{k}\leq x\},\qquad Y'''_{k}=(X_{k}-b_{x})I \{X_{k}> x\}. \end{aligned}$$

By similar methods to the proof of (2.1), we have

$$ P \Bigl(\max_{1\leq k\leq n}S_{k}>x\beta\Bigr)\leq P(A_{x})+P(B_{x})+P(C_{x})+P(D_{x}), $$
(2.13)

where

$$\begin{aligned}& A_{x}= \Bigl\{ \max_{1\leq k\leq n}U'_{k}>x \beta\Bigr\} , \\& B_{x}= \Bigl\{ \max_{1\leq k\leq n}S_{k}>x\beta \mbox{ and } b_{x}< X_{k_{0}}\leq x \mbox{ for exactly one } k_{0}\leq n \mbox{ and } X_{j}\leq b_{x}\mbox{ for all } j\neq k_{0} \Bigr\} , \\& C_{x}=\bigl\{ Y''_{k}\neq0 \mbox{ for at least two } k\leq n\bigr\} ,\qquad D_{x}= \bigl\{ Y'''_{k}\neq0 \mbox{ for at least one } k\leq n\bigr\} . \end{aligned}$$

Using similar methods to those used in the proof of (2.3)-(2.7), for \(\delta\,\hat{=}\,1-\beta^{-1}>0\) and \(x\geq n\), we have \(\max_{1\leq k\leq n}|\mathbb{E}U'_{k}|\leq\beta\delta\ln^{\alpha}x\), and

$$\begin{aligned}& P(A_{x})\ll\exp\bigl(-\ln^{\alpha}x\bigr), \\& P\bigl(\vert X\vert >b_{x}\bigr)\ll\exp\bigl(-(1-\delta/2) \ln^{\alpha}x\bigr), \\& P(B_{x})\ll n\exp\bigl(-\delta\ln^{\alpha}x-(1-\delta/2) \ln^{\alpha}x\bigr) \\& \hphantom{P(B_{x})}=\exp\bigl(-\ln^{\alpha}x\bigr)\frac{n}{x^{(\delta\ln ^{\alpha-1}x)/2}}\leq\exp \bigl(-\ln^{\alpha}x\bigr), \\& P(C_{x})\leq n^{2} P^{2}\bigl(\vert X\vert >b_{x}\bigr)\ll\exp\bigl(-\ln^{\alpha}x\bigr)n^{2} \exp\bigl(-(1-\delta)\ln^{\alpha}x\bigr) \\& \hphantom{P(C_{x})}\leq\exp\bigl(-\ln^{\alpha}x \bigr), \\& P(D_{x})\leq n P(X>x)\leq n P\bigl(\vert X\vert >x\bigr), \end{aligned}$$

which, combined with (2.13), shows

$$P \Bigl(\max_{1\leq k\leq n}S_{k}>x\beta\Bigr)\ll\exp\bigl(- \ln^{\alpha}x\bigr)+n P\bigl(\vert X\vert >x\bigr). $$

Replacing \(X_{k}\) by \(-X_{k}\) in the above inequality, we have

$$P \Bigl(\max_{1\leq k\leq n}(-S_{k})>x\beta\Bigr)\ll\exp \bigl(-\ln^{\alpha}x\bigr)+n P\bigl(\vert X\vert >x\bigr). $$

Therefore,

$$\begin{aligned} \begin{aligned} P \Bigl(\max_{1\leq k\leq n}|S_{k}|>x\beta\Bigr)&\leq P \Bigl(\max_{1\leq k\leq n}S_{k}>x\beta\Bigr)+P \Bigl(\max _{1\leq k\leq n}(-S_{k})>x\beta\Bigr) \\ &\ll\exp\bigl(-\ln^{\alpha}x\bigr)+n P\bigl(\vert X\vert >x\bigr). \end{aligned} \end{aligned}$$

Hence,

$$\begin{aligned}& \int_{n}^{\infty} x^{q-1}P \Bigl(\max _{1\leq k\leq n}|S_{k}|>x\beta\Bigr)\,\mathrm{d}x \\ & \quad\ll \int_{n}^{\infty} x^{q-1}\exp\bigl(- \ln^{\alpha}x \bigr)\,\mathrm{d}x+ \int_{n}^{\infty} x^{q-1}n P\bigl(\vert X \vert >x\bigr)\,\mathrm{d}x \\ & \quad\, \hat{=}\,I_{1}+I_{2}. \end{aligned}$$
(2.14)

By the fact that \((a+b)^{\alpha}\geq a^{\alpha}+b^{\alpha}\) for any \(a, b>0\) and \(\alpha>1\),

$$\begin{aligned}& \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}I_{1} \\ & \quad=\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}} \int_{1}^{\infty}n^{q} t^{q-1}\exp \bigl(-(\ln n+\ln t)^{\alpha}\bigr)\,\mathrm{d}t \\ & \quad\leq\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}n^{q}\exp\bigl(- \ln^{\alpha}n\bigr) \int_{1}^{\infty} t^{q-1}\exp\bigl(- \ln^{\alpha}t \bigr)\,\mathrm{d}t \\ & \quad\ll\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}n^{q}\exp\bigl(- \ln^{\alpha}n\bigr) \\ & \quad= \sum_{n=1}^{\infty}\frac{\ln^{\alpha-1}n}{n^{2}}< \infty. \end{aligned}$$
(2.15)
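The elementary superadditivity inequality \((a+b)^{\alpha}\geq a^{\alpha}+b^{\alpha}\) used in the bound for \(I_{1}\) is easy to sanity-check numerically (this snippet is ours and purely illustrative; the function name is hypothetical):

```python
import math

def superadditive_gap(a, b, alpha):
    """(a+b)^alpha - a^alpha - b^alpha; nonnegative for a, b > 0, alpha > 1."""
    return (a + b) ** alpha - a ** alpha - b ** alpha

# This is exactly the step exp(-(ln n + ln t)^alpha)
#   <= exp(-ln^alpha n) * exp(-ln^alpha t) used to factor the integral I_1.
for a in (0.1, 1.0, 3.7):
    for b in (0.2, 2.0, 10.0):
        for alpha in (1.01, 1.5, 2.0, 3.0):
            assert superadditive_gap(a, b, alpha) >= 0.0
print("inequality verified on the grid")
```

The inequality fails for \(0<\alpha<1\) (where \(x^{\alpha}\) is subadditive), which is why the hypothesis \(\alpha>1\) matters at this step.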

By (1.1) and Lemma 2.2,

$$\begin{aligned}& \sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{2+q}}I_{2} \\ & \quad=\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{1+q}} \int_{n}^{\infty} x^{q-1}P\bigl(\vert X\vert >x\bigr)\,\mathrm{d}x \\ & \quad=\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{1+q}}\sum_{j=n}^{\infty}\int_{j}^{j+1} x^{q-1}P\bigl(\vert X \vert >x\bigr)\, \mathrm{d}x \\ & \quad\ll\sum_{n=1}^{\infty}\exp\bigl( \ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{1+q}}\sum_{j=n}^{\infty}P\bigl(\vert X\vert >j\bigr)j^{q-1} \\ & \quad=\sum_{j=1}^{\infty}P\bigl(\vert X \vert >j\bigr)j^{q-1}\sum_{n=1}^{j} \exp\bigl(\ln^{\alpha}n \bigr)\frac{\ln^{\alpha-1}n}{n^{1+q}} \\ & \quad\ll\sum_{j=1}^{\infty}\exp\bigl( \ln^{\alpha}j \bigr)\frac{\ln^{\alpha-1}j}{j}P\bigl(\vert X\vert >j\bigr) < \infty, \end{aligned}$$

from which, combining with (2.14) and (2.15), we see that (1.3) holds.

Conversely, (1.3) implies (1.2), and the conclusion then follows from the converse part of Theorem 1.3. This completes the proof of Theorem 1.4. □

References

  1. Gut, A, Stadtmüller, U: An intermediate Baum-Katz theorem. Stat. Probab. Lett. 81, 1486-1492 (2011)

  2. Qiu, DH, Chen, PY: Complete moment convergence for i.i.d. random variables. Stat. Probab. Lett. 91, 76-82 (2014)

  3. Joag-Dev, K, Proschan, F: Negative association of random variables with applications. Ann. Stat. 11(1), 286-295 (1983)

  4. Newman, CM: Asymptotic independence and limit theorems for positively and negatively dependent variables. In: Tong, YL (ed.) Inequalities in Statistics and Probability, pp. 127-140. Institute of Mathematical Statistics, Hayward (1984)

  5. Matula, PA: A note on the almost sure convergence of sums of negatively dependent random variables. Stat. Probab. Lett. 15, 209-213 (1992)

  6. Shao, QM: A comparison theorem on moment inequalities between negatively associated and independent random variables. J. Theor. Probab. 13(2), 343-356 (2000)

  7. Hsu, PL, Robbins, H: Complete convergence and the law of large numbers. Proc. Natl. Acad. Sci. USA 33, 25-31 (1947). doi:10.1073/pnas.33.2.25

  8. Chow, Y: On the rate of moment convergence of sample sums and extremes. Bull. Inst. Math. Acad. Sin. 16, 177-201 (1988)

  9. Wu, QY: Sufficient and necessary conditions of complete convergence for weighted sums of PNQD random variables. J. Appl. Math. 2012, Article ID 104390 (2012)

  10. Wu, QY: Further study complete convergence for weighted sums of PNQD random variables. J. Inequal. Appl. 2015, 289 (2015). doi:10.1186/s13660-015-0814-1

  11. Xu, H, Tang, L: On complete convergence for arrays of rowwise AANA random variables. Stoch. Int. J. Probab. Stoch. Process. 86(3), 371-381 (2014)

  12. Guo, ML, Xu, CY, Zhu, DJ: Complete convergence of weighted sums for arrays of rowwise m-negatively associated random variables. Commun. Math. Res. 30(1), 41-50 (2014)


Acknowledgements

The authors are very grateful to the referees and the editors for their valuable comments and some helpful suggestions, which improved the clarity and readability of the paper. This work was supported by the National Natural Science Foundation of China (11361019) and the Support Program of the Guangxi China Science Foundation (2015GXNSFAA139008).

Author information

Corresponding author

Correspondence to Yuanying Jiang.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

QW conceived of the study, drafted, completed, read, and approved the final manuscript. YJ conceived of the study, and drafted and approved the final manuscript.

Authors’ information

Qunying Wu: Professor, Doctor, working in the field of probability and statistics. Yuanying Jiang: Associate professor, Doctor, working in the field of probability and statistics.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Wu, Q., Jiang, Y. Complete convergence and complete moment convergence for negatively associated sequences of random variables. J Inequal Appl 2016, 157 (2016). https://doi.org/10.1186/s13660-016-1107-z


MSC

  • 60F15

Keywords

  • negatively associated random variables
  • complete convergence
  • complete moment convergence