1 Introduction

The concept of complete convergence was introduced by Hsu and Robbins [1]. A sequence $\{X_n, n\ge 1\}$ of random variables is said to converge completely to the constant $\theta$ if

$$\sum_{n=1}^{\infty} P\bigl(|X_n-\theta|>\epsilon\bigr) < \infty \quad \text{for all } \epsilon>0.$$

Hsu and Robbins [1] proved that the sequence of arithmetic means of i.i.d. random variables converges completely to the expected value if the variance of the summands is finite. Erdős [2] proved the converse.
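As a quick numerical illustration of the Hsu-Robbins theorem (not part of the original argument; the distribution, sample sizes, and tolerances below are arbitrary choices), one can estimate $P(|S_n/n|>\epsilon)$ by Monte Carlo and observe the rapid decay that makes these probabilities summable for finite-variance summands:

```python
import random

def tail_prob(n, eps, trials=20000, rng=random.Random(0)):
    """Monte Carlo estimate of P(|S_n / n| > eps) for i.i.d.
    Uniform(-1, 1) summands (mean 0, variance 1/3).
    The shared default rng makes the run deterministic."""
    count = 0
    for _ in range(trials):
        s = sum(rng.uniform(-1.0, 1.0) for _ in range(n))
        if abs(s / n) > eps:
            count += 1
    return count / trials

# Estimated tail probabilities for growing n; by Hsu-Robbins their
# sum over all n is finite, and here they already drop sharply.
probs = [tail_prob(n, eps=0.2) for n in (10, 40, 160)]
print(probs)
```

The estimates fall roughly geometrically over this range, which is consistent with, though of course no substitute for, the summability $\sum_n P(|S_n/n|>\epsilon)<\infty$.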

The result of Hsu, Robbins, and Erdős has been generalized and extended in several directions. Baum and Katz [3] proved that if $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$, then the moment condition $E|X_1|^{pt}<\infty$ ($1\le p<2$, $t\ge 1$) is equivalent to

$$\sum_{n=1}^{\infty} n^{t-2} P\Bigl(\Bigl|\sum_{i=1}^{n} X_i\Bigr| > \epsilon n^{1/p}\Bigr) < \infty \quad \text{for all } \epsilon>0. \tag{1.1}$$

Chow [4] generalized the result of Baum and Katz [3] by showing the following complete moment convergence. If $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$ and $E(|X_1|^{pt} + |X_1|\log(1+|X_1|)) < \infty$ for some $0<p<2$, $t\ge 1$, and $pt\ge 1$, then

$$\sum_{n=1}^{\infty} n^{t-2-1/p} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k} X_i\Bigr| - \epsilon n^{1/p}\Bigr)_{+} < \infty \quad \text{for all } \epsilon>0, \tag{1.2}$$

where $x_{+}=\max\{x,0\}$. Note that (1.2) implies (1.1). Li and Spătaru [5] gave a refinement of the result of Baum and Katz [3] as follows. Let $\{X_n, n\ge 1\}$ be a sequence of i.i.d. random variables with $EX_1=0$, and let $0<p<2$, $t\ge 1$, $q>0$, and $pt\ge 1$. Then

$$\begin{cases} E|X_1|^{q}<\infty & \text{if } q>pt,\\ E|X_1|^{pt}\log(1+|X_1|)<\infty & \text{if } q=pt,\\ E|X_1|^{pt}<\infty & \text{if } q<pt, \end{cases} \tag{1.3}$$

if and only if

$$\int_{\epsilon}^{\infty} \sum_{n=1}^{\infty} n^{t-2} P\Bigl(\Bigl|\sum_{i=1}^{n} X_i\Bigr| > x^{1/q} n^{1/p}\Bigr)\,dx < \infty \quad \text{for all } \epsilon>0.$$

Recently, Chen and Wang [6] proved that, for any $q>0$, any sequences $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ of positive real numbers, and any sequence $\{Z_n, n\ge 1\}$ of random variables, the conditions

$$\int_{\epsilon}^{\infty} \sum_{n=1}^{\infty} b_n P\bigl(|Z_n| > x^{1/q} a_n\bigr)\,dx < \infty \quad \text{for all } \epsilon>0$$

and

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n^{q}} E\bigl(|Z_n| - \epsilon a_n\bigr)_{+}^{q} < \infty \quad \text{for all } \epsilon>0$$

are equivalent. Therefore, if $\{X_n, n\ge 1\}$ is a sequence of i.i.d. random variables with $EX_1=0$ and $0<p<2$, $t\ge 1$, $q>0$, and $pt\ge 1$, then the moment condition (1.3) is equivalent to

$$\sum_{n=1}^{\infty} n^{t-2-q/p} E\Bigl(\Bigl|\sum_{i=1}^{n} X_i\Bigr| - \epsilon n^{1/p}\Bigr)_{+}^{q} < \infty \quad \text{for all } \epsilon>0. \tag{1.4}$$

When $q=1$, the complete $q$th moment convergence (1.4) reduces to complete moment convergence.
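The equivalence of Chen and Wang [6] rests on the tail-integral identity $E(|Z|-\epsilon a)_{+}^{q}=\int_0^\infty P(|Z|>\epsilon a + x^{1/q})\,dx$. The following sketch (a numerical illustration with an arbitrarily chosen $\mathrm{Exp}(1)$ distribution and parameters, not taken from the paper) checks both sides against the closed form $2e^{-\epsilon}$, which holds when $Z\sim\mathrm{Exp}(1)$, $q=2$, and $a=1$:

```python
import math
import random

rng = random.Random(42)
eps, q = 0.5, 2
samples = [rng.expovariate(1.0) for _ in range(200_000)]

# Left-hand side: Monte Carlo estimate of E (|Z| - eps)_+^q.
lhs = sum(max(z - eps, 0.0) ** q for z in samples) / len(samples)

# Right-hand side: integral of P(Z > eps + x^(1/q)) dx using the exact
# Exp(1) tail P(Z > t) = exp(-t); the substitution x = u^q turns it
# into the integral of q * u^(q-1) * exp(-eps - u) du (midpoint rule).
du, rhs = 1e-3, 0.0
u = du / 2
while u < 20.0:
    rhs += q * u ** (q - 1) * math.exp(-eps - u) * du
    u += du

exact = 2 * math.exp(-eps)  # closed form for q = 2, a = 1, Exp(1)
print(lhs, rhs, exact)
```

Both the Monte Carlo estimate and the numerical integral agree with the closed form to within sampling and discretization error.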

Complete $q$th moment convergence for dependent random variables has been established by many authors. Chen and Wang [7] showed that (1.3) and (1.4) are equivalent for $\varphi$-mixing random variables. Zhou and Lin [8] established complete $q$th moment convergence theorems for moving average processes of $\varphi$-mixing random variables. Wu et al. [9] obtained complete $q$th moment convergence results for arrays of rowwise $\rho^{*}$-mixing random variables.

The purpose of this paper is to provide sets of sufficient conditions for complete $q$th moment convergence of the form

$$\sum_{n=1}^{\infty} \frac{b_n}{a_n^{q}} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k} X_{ni}\Bigr| - \epsilon a_n\Bigr)_{+}^{q} < \infty \quad \text{for all } \epsilon>0, \tag{1.5}$$

where $q\ge 1$, $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ are sequences of positive real numbers, and $\{X_{ni}, 1\le i\le n, n\ge 1\}$ is an array of random variables satisfying Marcinkiewicz-Zygmund and Rosenthal type inequalities. When $q=1$, similar results were established by Sung [10]. From our results, we can easily obtain the results of Chen and Wang [7] and Wu et al. [9].

2 Main results

In this section, we give sets of sufficient conditions for complete $q$th moment convergence (1.5). The following theorem gives sufficient conditions under the assumption that the array $\{X_{ni}, 1\le i\le n, n\ge 1\}$ satisfies a Marcinkiewicz-Zygmund type inequality.

Theorem 2.1 Let $1\le q<2$ and let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables with $EX_{ni}=0$ and $E|X_{ni}|^{q}<\infty$ for $1\le i\le n$ and $n\ge 1$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:

(i) for some $s$ with $1\le q<s\le 2$, there exists a positive function $\alpha_s(x)$ such that

$$E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s} \le \alpha_s(n)\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{s} \quad\text{for all } x\ge a_n^{q}, \tag{2.1}$$

where $X_{ni}(x)=X_{ni}I(|X_{ni}|\le x^{1/q}) + x^{1/q}I(X_{ni}>x^{1/q}) - x^{1/q}I(X_{ni}<-x^{1/q})$,

(ii) $\sum_{n=1}^{\infty}\frac{b_n}{a_n^{s}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)<\infty$,

(iii) $\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}(1+\alpha_s(n))\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)<\infty$,

(iv) $\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)/a_n\to 0$ as $n\to\infty$.

Then (1.5) holds.

Proof It is obvious that

$$\begin{aligned}
\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_{+}^{q}
&= \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{0}^{\infty} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n + x^{1/q}\Bigr)\,dx\\
&\le \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\Bigl\{\int_{0}^{a_n^{q}} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\,dx + \int_{a_n^{q}}^{\infty} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr)\,dx\Bigr\}\\
&:= I_1 + I_2.
\end{aligned}$$
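The first equality in the display above uses the standard tail-integral formula for nonnegative random variables; spelled out (a standard fact, included only for readability):

```latex
% For a nonnegative random variable W, Fubini's theorem gives
%   E W = \int_0^\infty P(W > x)\,dx.
% Apply this with W = (M_n - \epsilon a_n)_+^q, where
% M_n = \max_{1\le k\le n} |\sum_{i=1}^k X_{ni}|; for x > 0,
%   (M_n - \epsilon a_n)_+^q > x \iff M_n > \epsilon a_n + x^{1/q},
% which yields
E\bigl(M_n - \epsilon a_n\bigr)_{+}^{q}
  = \int_0^\infty P\bigl(M_n > \epsilon a_n + x^{1/q}\bigr)\,dx .
```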

We first show that $I_1<\infty$. For $1\le i\le n$ and $n\ge 1$, define

$$X_{ni}' = X_{ni}I(|X_{ni}|\le a_n) + a_n I(X_{ni}>a_n) - a_n I(X_{ni}<-a_n), \qquad X_{ni}'' = X_{ni} - X_{ni}'.$$

Then we have by $EX_{ni}=0$, Markov's inequality, and (i) that

$$\begin{aligned}
P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)
&= P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'+X_{ni}''-EX_{ni}''\bigr)\Bigr|>\epsilon a_n\Bigr)\\
&\le P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'\bigr)\Bigr|>\epsilon a_n/2\Bigr) + P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}''-EX_{ni}''\bigr)\Bigr|>\epsilon a_n/2\Bigr)\\
&\le 2^{s}\epsilon^{-s}a_n^{-s} E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}'-EX_{ni}'\bigr)\Bigr|^{s} + 2\epsilon^{-1}a_n^{-1} E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}''-EX_{ni}''\bigr)\Bigr|\\
&\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}'|^{s} + 4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}''|\\
&\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + a_n^{s}P(|X_{ni}|>a_n)\bigr) + 4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)\\
&\le 2^{s}\epsilon^{-s}a_n^{-s}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + 2^{s}\epsilon^{-s}a_n^{-q}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n) + 4\epsilon^{-1}a_n^{-q}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).
\end{aligned}$$

It follows that

$$\begin{aligned}
I_1 &= \sum_{n=1}^{\infty} b_n P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\\
&\le 2^{s}\epsilon^{-s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{s}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\bigl(2^{s}\epsilon^{-s}\alpha_s(n)+4\epsilon^{-1}\bigr)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).
\end{aligned}$$

Hence $I_1<\infty$ by (ii) and (iii).

We next show that $I_2<\infty$. By the definition of $X_{ni}(x)$, we have that

$$P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr) \le \sum_{i=1}^{n}P\bigl(|X_{ni}|>x^{1/q}\bigr) + P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}(x)\Bigr|>x^{1/q}\Bigr).$$

We also have by $EX_{ni}=0$ and (iv) that

$$\begin{aligned}
\sup_{x\ge a_n^{q}}\max_{1\le k\le n} x^{-1/q}\Bigl|\sum_{i=1}^{k}EX_{ni}(x)\Bigr|
&= \sup_{x\ge a_n^{q}}\max_{1\le k\le n} x^{-1/q}\Bigl|\sum_{i=1}^{k}E\bigl(X_{ni}-X_{ni}(x)\bigr)\Bigr|\\
&\le \sup_{x\ge a_n^{q}} x^{-1/q}\sum_{i=1}^{n}E|X_{ni}|I\bigl(|X_{ni}|>x^{1/q}\bigr)\\
&\le a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n) \to 0.
\end{aligned}$$

Hence, to prove that $I_2<\infty$, it suffices to show that

$$\begin{aligned}
I_3 &:= \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx < \infty,\\
I_4 &:= \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{a_n^{q}}^{\infty} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|>x^{1/q}/2\Bigr)\,dx < \infty.
\end{aligned}$$

If $x>a_n^{q}$, then $P(|X_{ni}|>x^{1/q}) = P(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q})$, and so

$$\int_{a_n^{q}}^{\infty} P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx = \int_{a_n^{q}}^{\infty} P\bigl(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q}\bigr)\,dx \le \int_{0}^{\infty} P\bigl(|X_{ni}|I(|X_{ni}|>a_n)>x^{1/q}\bigr)\,dx = E|X_{ni}|^{q}I(|X_{ni}|>a_n),$$

which implies that

$$I_3 \le \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $I_3<\infty$ by (iii).

Finally, we show that $I_4<\infty$. We get by Markov's inequality and (i) that

$$\begin{aligned}
I_4 &\le 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{a_n^{q}}^{\infty} x^{-s/q} E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\,dx\\
&\le 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} x^{-s/q} E\bigl|X_{ni}(x)\bigr|^{s}\,dx\\
&= 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} x^{-s/q}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le x^{1/q}) + x^{s/q}P(|X_{ni}|>x^{1/q})\bigr)\,dx\\
&= 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)\int_{a_n^{q}}^{\infty} x^{-s/q}\,dx\\
&\quad + 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} x^{-s/q} E|X_{ni}|^{s}I\bigl(a_n<|X_{ni}|\le x^{1/q}\bigr)\,dx\\
&\quad + 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx\\
&:= I_5 + I_6 + I_7.
\end{aligned}$$

Using a simple integral and Fubini's theorem, we obtain that

$$I_5 + I_6 \le \frac{2^{s}q}{s-q}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{s}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + \frac{2^{s}q}{s-q}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Similarly to $I_3$,

$$I_7 \le 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\alpha_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $I_4<\infty$ by (ii) and (iii). □

The next theorem gives sufficient conditions for complete $q$th moment convergence (1.5) under the assumption that the array $\{X_{ni}, 1\le i\le n, n\ge 1\}$ satisfies a Rosenthal type inequality.

Theorem 2.2 Let $q\ge 1$ and let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables with $EX_{ni}=0$ and $E|X_{ni}|^{q}<\infty$ for $1\le i\le n$ and $n\ge 1$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:

(i) for some $s>\max\{2, 2q/r\}$ ($r$ is the same as in (v)), there exist positive functions $\beta_s(x)$ and $\gamma_s(x)$ such that

$$E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s} \le \beta_s(n)\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{s} + \gamma_s(n)\Bigl(\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{2}\Bigr)^{s/2} \quad\text{for all } x\ge a_n^{q}, \tag{2.2}$$

where $X_{ni}(x)=X_{ni}I(|X_{ni}|\le x^{1/q}) + x^{1/q}I(X_{ni}>x^{1/q}) - x^{1/q}I(X_{ni}<-x^{1/q})$,

(ii) $\sum_{n=1}^{\infty}\frac{b_n}{a_n^{s}}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n)<\infty$,

(iii) $\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}(1+\beta_s(n))\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)<\infty$,

(iv) $\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)/a_n\to 0$ as $n\to\infty$,

(v) $\sum_{n=1}^{\infty} b_n\gamma_s(n)\bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\bigr)^{s/2} < \infty$ for some $0<r\le 2$.

Then (1.5) holds.

Proof The proof is similar to that of Theorem 2.1. As in that proof,

$$\begin{aligned}
\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|-\epsilon a_n\Bigr)_{+}^{q}
&\le \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\Bigl\{\int_{0}^{a_n^{q}} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)\,dx + \int_{a_n^{q}}^{\infty} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>x^{1/q}\Bigr)\,dx\Bigr\}\\
&:= J_1 + J_2.
\end{aligned}$$

Similarly to $I_1$ in the proof of Theorem 2.1, we have by the $c_r$-inequality that

$$\begin{aligned}
P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_{ni}\Bigr|>\epsilon a_n\Bigr)
&\le 2^{s}\epsilon^{-s}a_n^{-s}\beta_s(n)\sum_{i=1}^{n}\bigl(E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + a_n^{s}P(|X_{ni}|>a_n)\bigr)\\
&\quad + 2^{s}\epsilon^{-s}a_n^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}\bigl(E|X_{ni}|^{2}I(|X_{ni}|\le a_n) + a_n^{2}P(|X_{ni}|>a_n)\bigr)\Bigr)^{s/2} + 4\epsilon^{-1}a_n^{-1}\sum_{i=1}^{n}E|X_{ni}|I(|X_{ni}|>a_n)\\
&\le 2^{s}\epsilon^{-s}a_n^{-s}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + \bigl(2^{s}\epsilon^{-s}\beta_s(n)+4\epsilon^{-1}\bigr)a_n^{-q}\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n)\\
&\quad + 2^{3s/2-1}\epsilon^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|\le a_n)\Bigr)^{s/2} + 2^{3s/2-1}\epsilon^{-s}\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}I(|X_{ni}|>a_n)\Bigr)^{s/2}.
\end{aligned}$$

Hence $J_1<\infty$ by (ii), (iii), and (v).

As in the proof of Theorem 2.1, to prove that $J_2<\infty$, it suffices to show that

$$\begin{aligned}
J_3 &:= \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\sum_{i=1}^{n}\int_{a_n^{q}}^{\infty} P\bigl(|X_{ni}|>x^{1/q}\bigr)\,dx < \infty,\\
J_4 &:= \sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{a_n^{q}}^{\infty} P\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|>x^{1/q}/2\Bigr)\,dx < \infty.
\end{aligned}$$

The proof of $J_3<\infty$ is the same as that of $I_3$ in the proof of Theorem 2.1.

For $J_4$, we have by Markov's inequality and (i) that

$$\begin{aligned}
J_4 &\le 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{a_n^{q}}^{\infty} x^{-s/q} E\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}\bigl(X_{ni}(x)-EX_{ni}(x)\bigr)\Bigr|^{s}\,dx\\
&\le 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\int_{a_n^{q}}^{\infty} x^{-s/q}\Bigl\{\beta_s(n)\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{s} + \gamma_s(n)\Bigl(\sum_{i=1}^{n}E\bigl|X_{ni}(x)\bigr|^{2}\Bigr)^{s/2}\Bigr\}\,dx := J_5 + J_6.
\end{aligned}$$

Similarly to $I_4$ in the proof of Theorem 2.1, we get that

$$J_5 \le \frac{2^{s}q}{s-q}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{s}}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{s}I(|X_{ni}|\le a_n) + 2^{s}\Bigl(\frac{q}{s-q}+1\Bigr)\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\beta_s(n)\sum_{i=1}^{n}E|X_{ni}|^{q}I(|X_{ni}|>a_n).$$

Hence $J_5<\infty$ by (ii) and (iii).

Finally, we show that $J_6<\infty$. By the $c_r$-inequality,

$$\begin{aligned}
J_6 &= 2^{s}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\gamma_s(n)\int_{a_n^{q}}^{\infty} x^{-s/q}\Bigl(\sum_{i=1}^{n}\bigl(E|X_{ni}|^{2}I(|X_{ni}|\le x^{1/q}) + x^{2/q}P(|X_{ni}|>x^{1/q})\bigr)\Bigr)^{s/2}\,dx\\
&\le 2^{3s/2-1}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\gamma_s(n)\int_{a_n^{q}}^{\infty} x^{-s/q}\Bigl(\sum_{i=1}^{n}E|X_{ni}|^{2}I(|X_{ni}|\le x^{1/q})\Bigr)^{s/2}\,dx + 2^{3s/2-1}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\gamma_s(n)\int_{a_n^{q}}^{\infty}\Bigl(\sum_{i=1}^{n}P(|X_{ni}|>x^{1/q})\Bigr)^{s/2}\,dx\\
&\le 2^{3s/2-1}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\gamma_s(n)\int_{a_n^{q}}^{\infty} x^{-s/q}\Bigl(\sum_{i=1}^{n}E|X_{ni}|^{r}x^{(2-r)/q}\Bigr)^{s/2}\,dx + 2^{3s/2-1}\sum_{n=1}^{\infty}\frac{b_n}{a_n^{q}}\gamma_s(n)\int_{a_n^{q}}^{\infty}\Bigl(\sum_{i=1}^{n}x^{-r/q}E|X_{ni}|^{r}\Bigr)^{s/2}\,dx\\
&= \frac{2^{3s/2}\cdot 2q}{rs-2q}\sum_{n=1}^{\infty} b_n\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\Bigr)^{s/2}.
\end{aligned}$$

Hence $J_6<\infty$ by (v). □

Remark 2.1 Marcinkiewicz-Zygmund and Rosenthal type inequalities hold for dependent random variables as well as for independent random variables.

(1) Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of rowwise negatively associated random variables. Then, for $1<s\le 2$, (2.1) holds with $\alpha_s(n)=2^{s}\cdot 2^{3-s}=8$. For $s>2$, (2.2) holds with $\beta_s(n)=2^{s}\cdot 2(15s/\log s)^{s}$ and $\gamma_s(n)=2(15s/\log s)^{s}$ (see Shao [11]). Note that $\alpha_s(n)$ and $\beta_s(n)$ carry the extra factor $2^{s}$ since $E|X_{ni}(x)-EX_{ni}(x)|^{s}\le 2^{s}E|X_{ni}(x)|^{s}$.

(2) Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of rowwise negatively orthant dependent random variables. By Corollary 2.2 of Asadian et al. [12] and Theorem 3 of Móricz [13], (2.1) holds with $\alpha_s(n)=C_1(\log n)^{s}$, and (2.2) holds with $\beta_s(n)=C_2(\log n)^{s}$ and $\gamma_s(n)=C_2(\log n)^{s}$, where $C_1$ and $C_2$ are constants depending only on $s$.

(3) Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By Shao's [14] result, (2.2) holds with a constant function $\beta_s(x)$ and a slowly varying function $\gamma_s(x)$. In particular, if $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, then (2.2) holds with some constant functions $\beta_s(x)$ and $\gamma_s(x)$.

(4) Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed $\rho$-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By Shao's [15] result, (2.2) holds with some slowly varying functions $\beta_s(x)$ and $\gamma_s(x)$. In particular, if $\sum_{n=1}^{\infty}\rho^{2/s}(2^{n})<\infty$, then (2.2) holds with some constant functions $\beta_s(x)$ and $\gamma_s(x)$.

(5) Let $\{X_n, n\ge 1\}$ be a sequence of $\rho^{*}$-mixing random variables. Set $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. By the result of Utev and Peligrad [16], (2.2) holds with some constant functions $\beta_s(x)$ and $\gamma_s(x)$.

3 Corollaries

In this section, we establish some complete $q$th moment convergence results by using the theorems obtained in the previous section.

Corollary 3.1 (Chen and Wang [7])

Let $\{X_n, n\ge 1\}$ be a sequence of identically distributed $\varphi$-mixing random variables with $EX_1=0$, and let $t\ge 1$, $0<p<2$, $q\ge 1$, and $pt\ge 1$. Assume that (1.3) holds. Furthermore, suppose that

$$\sum_{n=1}^{\infty} \varphi^{1/2}\bigl(2^{n}\bigr) < \infty$$

if $t=1$ and $\max\{q,pt\}<2$. Then

$$\sum_{n=1}^{\infty} n^{t-2-q/p} E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_i\Bigr|-\epsilon n^{1/p}\Bigr)_{+}^{q} < \infty \quad\text{for all } \epsilon>0.$$

Proof Let $a_n=n^{1/p}$ and $b_n=n^{t-2}$ for $n\ge 1$, and let $X_{ni}=X_i$ for $1\le i\le n$ and $n\ge 1$. Then, for $s\ge 2$, (2.2) holds with a constant function $\beta_s(x)$ and a slowly varying function $\gamma_s(x)$ (see Remark 2.1(3)). Under the additional condition that $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, (2.2) holds with some constant functions $\beta_s(x)$ and $\gamma_s(x)$. In particular, for $s=2$, (2.1) holds with a constant function $\alpha_s(x)$ under this additional condition.

By a standard method, we have that

$$\begin{aligned}
&\sum_{n=1}^{\infty} n^{t-1-s/p} E|X_1|^{s}I\bigl(|X_1|\le n^{1/p}\bigr) \le C E|X_1|^{pt} \quad\text{if } pt<s,\\
&\sum_{n=1}^{\infty} n^{t-1-q/p} E|X_1|^{q}I\bigl(|X_1|>n^{1/p}\bigr) \le \begin{cases} C E|X_1|^{q} & \text{if } q>pt,\\ C E|X_1|^{pt}\log(1+|X_1|) & \text{if } q=pt,\\ C E|X_1|^{pt} & \text{if } q<pt, \end{cases}\\
&n^{1-1/p} E|X_1|I\bigl(|X_1|>n^{1/p}\bigr) \le n^{1-t} E|X_1|^{pt}I\bigl(|X_1|>n^{1/p}\bigr) \quad\text{if } pt\ge 1,
\end{aligned}$$

where $C$ is a positive constant, not necessarily the same at each appearance. Hence, conditions (i)-(iv) of Theorem 2.2 hold if we take $s>\max\{pt,2,2q/r\}$. Under the additional conditions that $\max\{q,pt\}<2$ and $\sum_{n=1}^{\infty}\varphi^{1/2}(2^{n})<\infty$, all conditions of Theorem 2.1 hold if we take $s=2$. Therefore, the result follows from Theorems 2.1 and 2.2 once we show that condition (v) of Theorem 2.2 holds when $t>1$ or $\max\{q,pt\}\ge 2$. To do this, we take $r=2$ if $\max\{q,pt\}\ge 2$ and $r=\max\{q,pt\}$ if $\max\{q,pt\}<2$. If $t>1$ or $\max\{q,pt\}\ge 2$, then $r>p$, and so we can choose $s>2$ large enough that $t-1+(1-r/p)s/2<0$. Then

$$\sum_{n=1}^{\infty} b_n\gamma_s(n)\Bigl(\sum_{i=1}^{n}a_n^{-r}E|X_{ni}|^{r}\Bigr)^{s/2} = \bigl(E|X_1|^{r}\bigr)^{s/2}\sum_{n=1}^{\infty}\gamma_s(n)\, n^{t-2+(1-r/p)s/2} < \infty.$$

Hence condition (v) of Theorem 2.2 holds. □
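For completeness, the "standard method" behind the first bound in the proof above can be sketched as follows (a routine computation, written out here under the assumption $pt<s$):

```latex
\sum_{n=1}^{\infty} n^{t-1-s/p}\, E|X_1|^s I\bigl(|X_1|\le n^{1/p}\bigr)
 = \sum_{n=1}^{\infty} n^{t-1-s/p} \sum_{m=1}^{n}
     E|X_1|^s I\bigl((m-1)^{1/p} < |X_1| \le m^{1/p}\bigr)
 = \sum_{m=1}^{\infty} E|X_1|^s I\bigl((m-1)^{1/p} < |X_1| \le m^{1/p}\bigr)
     \sum_{n=m}^{\infty} n^{t-1-s/p}
 \le C \sum_{m=1}^{\infty} m^{t-s/p}\,
     E|X_1|^s I\bigl((m-1)^{1/p} < |X_1| \le m^{1/p}\bigr)
 \le C \sum_{m=1}^{\infty}
     E|X_1|^{pt} I\bigl((m-1)^{1/p} < |X_1| \le m^{1/p}\bigr)
 = C\, E|X_1|^{pt}.
% Here \sum_{n \ge m} n^{t-1-s/p} \le C m^{t-s/p} because
% t-1-s/p < -1 when pt < s; and on the m-th block
% |X_1|^s \le m^{(s-pt)/p} |X_1|^{pt}, while m^{t-s/p} m^{(s-pt)/p} = 1.
```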

Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying

$$\frac{\Psi_n(|x|)}{|x|^{q}}\uparrow \quad\text{and}\quad \frac{\Psi_n(|x|)}{|x|^{s}}\downarrow \quad\text{as } |x|\uparrow \tag{3.1}$$

for some $1\le q<s$.
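A concrete family satisfying (3.1) is $\Psi_n(x)=|x|^{v}$ with $q\le v\le s$: then $\Psi_n(|x|)/|x|^{q}=|x|^{v-q}$ is nondecreasing and $\Psi_n(|x|)/|x|^{s}=|x|^{v-s}$ is nonincreasing. A small numerical sanity check of these monotonicity properties (the exponents below are arbitrary illustrative choices, not from the paper):

```python
q, s, v = 1.0, 2.0, 1.5  # need q <= v <= s for (3.1) to hold

def psi(x):
    return abs(x) ** v

xs = [0.5 * k for k in range(1, 41)]  # grid 0.5, 1.0, ..., 20.0
ratio_q = [psi(x) / x ** q for x in xs]  # should be nondecreasing
ratio_s = [psi(x) / x ** s for x in xs]  # should be nonincreasing

ok_up = all(a <= b for a, b in zip(ratio_q, ratio_q[1:]))
ok_down = all(a >= b for a, b in zip(ratio_s, ratio_s[1:]))
print(ok_up, ok_down)  # True True
```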

Corollary 3.2 Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying (3.1) for some $1\le q<s\le 2$. Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables satisfying $EX_{ni}=0$ for $1\le i\le n$ and $n\ge 1$, and (2.1) with some constant function $\alpha_s(x)$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:

(i) $\sum_{n=1}^{\infty} b_n\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$,

(ii) $\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)\to 0$ as $n\to\infty$.

Then (1.5) holds.

Proof First note, since $\Psi_i(|x|)/|x|^{q}\uparrow$, that $\Psi_i(|x|)$ is an increasing function of $|x|$. Since $\Psi_i(|x|)/|x|^{s}\downarrow$,

$$\frac{|X_{ni}|^{s}I(|X_{ni}|\le a_n)}{a_n^{s}} \le \frac{\Psi_i\bigl(|X_{ni}|I(|X_{ni}|\le a_n)\bigr)}{\Psi_i(a_n)} \le \frac{\Psi_i(|X_{ni}|)}{\Psi_i(a_n)}.$$

Since $q\ge 1$ and $\Psi_i(|x|)/|x|^{q}\uparrow$,

$$\frac{|X_{ni}|I(|X_{ni}|>a_n)}{a_n} \le \frac{|X_{ni}|^{q}I(|X_{ni}|>a_n)}{a_n^{q}} \le \frac{\Psi_i\bigl(|X_{ni}|I(|X_{ni}|>a_n)\bigr)}{\Psi_i(a_n)} \le \frac{\Psi_i(|X_{ni}|)}{\Psi_i(a_n)}.$$

It follows that all conditions of Theorem 2.1 are satisfied, and so the result follows from Theorem 2.1. □

Corollary 3.3 Let $\{\Psi_n(x), n\ge 1\}$ be a sequence of positive even functions satisfying (3.1) for some $q\ge 1$ and $s>\max\{2,q\}$. Let $\{X_{ni}, 1\le i\le n, n\ge 1\}$ be an array of random variables satisfying $EX_{ni}=0$ for $1\le i\le n$ and $n\ge 1$, and (2.2) with some constant functions $\beta_s(x)$ and $\gamma_s(x)$. Let $\{a_n, n\ge 1\}$ and $\{b_n, n\ge 1\}$ be sequences of positive real numbers. Suppose that the following conditions hold:

(i) $\sum_{n=1}^{\infty} b_n\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$,

(ii) $\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)\to 0$ as $n\to\infty$,

(iii) $\sum_{n=1}^{\infty} b_n\bigl(\sum_{i=1}^{n}a_n^{-2}E|X_{ni}|^{2}\bigr)^{s/2}<\infty$.

Then (1.5) holds.

Proof The proof is similar to that of Corollary 3.2. By the proof of Corollary 3.2 and condition (iii), all conditions of Theorem 2.2 are satisfied, and so the result follows from Theorem 2.2. □

Remark 3.1 When $b_n=1$ for $n\ge 1$, condition (i) of Corollaries 3.2 and 3.3 reduces to $\sum_{n=1}^{\infty}\sum_{i=1}^{n}E\Psi_i(|X_{ni}|)/\Psi_i(a_n)<\infty$, and condition (ii) of Corollaries 3.2 and 3.3 then follows from this reduced condition. For a sequence of $\rho^{*}$-mixing random variables, (2.1) holds with some constant function $\alpha_s(x)$ if $s=2$, and (2.2) holds with some constant functions $\beta_s(x)$ and $\gamma_s(x)$ if $s>2$ (see Remark 2.1(5)). Wu et al. [9] proved Corollaries 3.2 and 3.3 when $b_n=1$ for $n\ge 1$ and $\{X_{ni}\}$ is an array of rowwise $\rho^{*}$-mixing random variables.