1 Introduction and preliminaries

The concept of statistical summability $(H,1)$, which is a generalization of statistical convergence due to Fast [1], has recently been introduced by Móricz [2]. In this paper we use the idea of logarithmic density to define the concept of logarithmic statistical convergence. We determine its relation to statistical convergence and to statistical summability $(H,1)$. We further define $[H,1]_q$-summability and establish some inclusion relations.

Definition 1.1 Let $\mathbb{N}$ be the set of all natural numbers and let $\chi_E$ denote the characteristic function of $E \subseteq \mathbb{N}$. Put

$$d_n(E) = \frac{1}{n}\sum_{k=1}^{n}\chi_E(k) \quad\text{and}\quad \delta_n(E) = \frac{1}{l_n}\sum_{k=1}^{n}\frac{\chi_E(k)}{k} \quad (n\in\mathbb{N}),$$

where $l_n = \sum_{k=1}^{n} 1/k$ $(n=1,2,3,\ldots)$. The numbers $\underline{d}(E)=\liminf_{n\to\infty} d_n(E)$ and $\overline{d}(E)=\limsup_{n\to\infty} d_n(E)$ are called the lower and upper asymptotic densities of $E$, respectively. Similarly, the numbers $\underline{\delta}(E)=\liminf_{n\to\infty}\delta_n(E)$ and $\overline{\delta}(E)=\limsup_{n\to\infty}\delta_n(E)$ are called the lower and upper logarithmic densities of $E$, respectively. If $\underline{d}(E)=\overline{d}(E)=d(E)$, then $d(E)$ is called the asymptotic density of $E$; similarly, if $\underline{\delta}(E)=\overline{\delta}(E)=\delta_{\ln}(E)$, then $\delta_{\ln}(E)$ is called the logarithmic density of $E$.

Note that if the weights $1/k$ are replaced by $1$, then $l_n = \sum_{k=1}^{n} 1 = n$ and hence $\delta_{\ln}(E)$ reduces to $d(E)$.
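For a concrete feel for Definition 1.1, the finite-stage densities $d_n(E)$ and $\delta_n(E)$ can be evaluated directly. The sketch below (the helper names are ours, not from the literature; exact rationals are used to avoid rounding) does this for the set of even numbers, for which both densities tend to $1/2$:

```python
from fractions import Fraction

def asymptotic_density(E, n):
    """d_n(E) = (1/n) * sum_{k<=n} chi_E(k), as an exact rational."""
    return Fraction(sum(1 for k in range(1, n + 1) if k in E), n)

def log_density(E, n):
    """delta_n(E) = (1/l_n) * sum_{k<=n} chi_E(k)/k, with l_n = sum_{k<=n} 1/k."""
    l_n = sum(Fraction(1, k) for k in range(1, n + 1))
    weighted = sum(Fraction(1, k) for k in range(1, n + 1) if k in E)
    return weighted / l_n

evens = set(range(2, 2001, 2))
print(asymptotic_density(evens, 2000))   # exactly 1/2
print(float(log_density(evens, 2000)))   # approaches 1/2, but slowly
```

Note how slowly $\delta_n$ settles: its weights $1/k$ give the small indices a large say, so the logarithmic density "forgets" the tail much more reluctantly than $d_n$ does.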

Now recall the concept of statistical convergence of real sequences (see Fast [1] and Fridy [3]).

Definition 1.2 A sequence $x=(x_k)$ is said to be statistically convergent to $L$ if for every $\epsilon>0$, $d(\{k : |x_k-L|\ge\epsilon\})=0$. That is,

$$\lim_{n\to\infty}\frac{1}{n}\bigl|\bigl\{k\le n : |x_k-L|\ge\epsilon\bigr\}\bigr| = 0.$$

Several extensions, variants and generalizations of this notion have been investigated by various authors [2, 4–16].
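As an illustration of Definition 1.2 (a sketch with our own helper names, not taken from the sources cited), the indicator sequence of the perfect squares is statistically convergent to $0$, and the density of its exceptional set visibly decays:

```python
import math

def exception_density(x, L, eps):
    """d_n of the set {k <= n : |x_k - L| >= eps} for a finite prefix x (k is 1-based)."""
    n = len(x)
    exceptions = sum(1 for k in range(1, n + 1) if abs(x[k - 1] - L) >= eps)
    return exceptions / n

# x_k = 1 when k is a perfect square, 0 otherwise; the squares have
# asymptotic density zero, so x is statistically convergent to 0.
for n in (10**3, 10**4, 10**5):
    x = [1.0 if math.isqrt(k) ** 2 == k else 0.0 for k in range(1, n + 1)]
    print(n, exception_density(x, 0.0, 0.5))  # equals floor(sqrt(n))/n -> 0
```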

2 Logarithmic statistical convergence

In this section we define logarithmic statistical convergence and $[H,1]_q$-summability and establish some inclusion relations.

Definition 2.1 A sequence $x=(x_k)$ is said to be logarithmic statistically convergent to $L$ if for every $\epsilon>0$ the set $\{k : |x_k-L|\ge\epsilon\}$ has logarithmic density zero. That is,

$$\lim_{n\to\infty}\frac{1}{l_n}\Bigl|\Bigl\{k\le n : \frac{1}{k}|x_k-L|\ge\epsilon\Bigr\}\Bigr| = 0.$$
(2.1)

In this case we write $st_{\ln}\text{-}\lim x = L$, and we denote the set of all logarithmic statistically convergent sequences by $st_{\ln}$.

Remark 2.1 One might regard logarithmic statistical convergence as the special case $p_k = 1/k$ of weighted statistical convergence [15]. But this is not exactly true: for $p_k = 1/k$ we have $P_n = \sum_{k=1}^{n} p_k = \sum_{k=1}^{n} 1/k = l_n \sim \log n$ $(n=1,2,3,\ldots)$, and consequently the definition of weighted statistical convergence gives $\lim_{n\to\infty}\frac{1}{l_n}\bigl|\bigl\{k\le P_n \sim \log n : \frac{1}{k}|x_k-L|\ge\epsilon\bigr\}\bigr| = 0$. One can thus see the difference between this and (2.1): in (2.1) the enclosed set ranges over all $k\le n$ and therefore has bigger cardinality.

Definition 2.2 Let $\tau_n := \frac{1}{l_n}\sum_{k=1}^{n}\frac{x_k}{k}$, where $l_n = \sum_{k=1}^{n} 1/k \sim \log n$ $(n=1,2,3,\ldots)$. We say that $x=(x_k)$ is $(H,1)$-summable to $L$ if the sequence $\tau=(\tau_n)$ converges to $L$, i.e., $(H,1)\text{-}\lim x = L$.

If the weights $1/k$ are replaced by $1$, then $l_n = n$, and $(H,1)$-summability reduces to $(C,1)$-summability.
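Definition 2.2 can be probed numerically. The sketch below (the helper name `harmonic_mean_tau` is ours) computes $\tau_n$ for the sequence $(1,0,1,0,\ldots)$; the harmonic means drift toward $1/2$, but only logarithmically fast, since $l_n \sim \log n$ grows so slowly:

```python
def harmonic_mean_tau(x):
    """tau_n = (1/l_n) * sum_{k<=n} x_k/k with l_n = sum_{k<=n} 1/k (Definition 2.2)."""
    weighted = sum(xk / k for k, xk in enumerate(x, start=1))
    l_n = sum(1.0 / k for k in range(1, len(x) + 1))
    return weighted / l_n

# x = (1, 0, 1, 0, ...): tau_n tends to 1/2, but the error decays
# only like 1/log n, so even n = 10**6 leaves a visible gap.
for n in (10**2, 10**4, 10**6):
    x = [1.0 if k % 2 == 1 else 0.0 for k in range(1, n + 1)]
    print(n, harmonic_mean_tau(x))
```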

Definition 2.3 A sequence $x=(x_k)$ is said to be $[H,1]_q$-summable $(0<q<\infty)$ to the limit $L$ if

$$\lim_{n\to\infty}\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^q = 0,$$

and we write $x_k \to L\,[H,1]_q$. In this case $L$ is called the $[H,1]_q$-limit of $x$.

Let $q=1$. If the weights $1/k$ are replaced by $1$, then $l_n = n$, and $[H,1]_1$-summability reduces to strong $(C,1)$-summability. Also, $[H,1]_q$-summability is a special case of $[\overline{N},p_n]_q$-summability (cf. [15]) with $p_k = 1/k$.
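As a hedged illustration of Definition 2.3 with $q=1$ (our own example, not from the text): for the indicator of the perfect squares, the weighted mean is at most $\frac{1}{l_n}\sum_{j\ge 1} 1/j^2 = \frac{\pi^2/6}{l_n}\to 0$, so this sequence is $[H,1]_1$-summable to $0$:

```python
import math

def h1q_mean(x, L, q=1.0):
    """(1/l_n) * sum_{k<=n} (1/k)|x_k - L|^q, the mean from Definition 2.3."""
    n = len(x)
    l_n = sum(1.0 / k for k in range(1, n + 1))
    return sum(abs(x[k - 1] - L) ** q / k for k in range(1, n + 1)) / l_n

# Indicator of the perfect squares: only terms k = j*j contribute, each 1/j^2,
# so the numerator stays below pi^2/6 while l_n ~ log n grows without bound.
for n in (10**3, 10**5):
    x = [1.0 if math.isqrt(k) ** 2 == k else 0.0 for k in range(1, n + 1)]
    print(n, h1q_mean(x, 0.0))
```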

Recently, Móricz [2] has defined the concept of statistical summability (H,1) as follows.

Definition 2.4 A sequence $x=(x_k)$ is said to be statistically summable $(H,1)$ to $L$ if the sequence $\tau=(\tau_n)$ is statistically convergent to $L$, i.e., $st\text{-}\lim\tau = L = H(st)\text{-}\lim x$. We denote by $H(st)$ the set of all sequences which are statistically summable $(H,1)$, and we call such sequences statistically $(H,1)$-summable.

Remark 2.2 If $x=(x_k)$ is bounded, then $st\text{-}\lim_{k} x_k = L$ implies $(C,1)\text{-}\lim_{k} x_k = L$ (see [17]). The converse is obviously not true; e.g., $x=(1,0,1,0,\ldots)$ is $(C,1)$-summable to $\frac{1}{2}$ but not statistically convergent. However, for bounded sequences, statistical convergence to some number is equivalent to strong Cesàro summability to the same number. For logarithmic statistical convergence the situation is different (see [8]).

Theorem 2.1 Statistical convergence implies logarithmic statistical convergence, but the converse need not be true.

Proof It is well known that $\underline{d}(E) \le \underline{\delta}(E) \le \overline{\delta}(E) \le \overline{d}(E)$ for each $E\subseteq\mathbb{N}$ (see [18], pp. 70-75 and 95-96). Hence if $d(E)$ exists, then $\delta_{\ln}(E)$ also exists and $d(E)=\delta_{\ln}(E)$. Thus statistical convergence implies logarithmic statistical convergence.

To see that the converse fails, take $E_k=\{k^{k^2}+1,\,k^{k^2}+2,\,\ldots,\,k^{k^2+1}\}$ $(k\in\mathbb{N})$ and $E=\bigcup_{k=2}^{\infty}E_k$. If $E(n)$ denotes the number of elements of $E$ not exceeding $n$, so that $d_n(E)=E(n)/n$ for $n\in\mathbb{N}$, then

$$\overline{d}(E) \ge \limsup_{k\to\infty}\frac{E(k^{k^2+1})}{k^{k^2+1}} \ge \limsup_{k\to\infty}\frac{k^{k^2+1}-k^{k^2}}{k^{k^2+1}} = \limsup_{k\to\infty}\Bigl(1-\frac{1}{k}\Bigr) = 1.$$

Hence $\overline{d}(E)=1$.

Since $\sum_{j\in E_k}\frac{1}{j} = \ln k + O\bigl(k^{-k^2}\bigr)$ $(k\in\mathbb{N},\ k\ge 2)$, we get

$$\overline{\delta}(E) \le \lim_{n\to\infty}\frac{\sum_{k=2}^{n}\bigl(\ln k + O(1)\bigr)}{\sum_{j=1}^{n^{n^2+1}} 1/j} \le \lim_{n\to\infty}\frac{n\ln n + O(n)}{(n^2+1)\ln n + O(1)} = 0.$$

Hence $\delta_{\ln}(E)=0$, and consequently $\underline{d}(E)\le\underline{\delta}(E)=0$; since $\overline{d}(E)=1$, the asymptotic density $d(E)$ does not exist. Define the sequence $x=(x_k)$ by

$$x_k = \begin{cases} 1 & \text{if } k\in E, \\ 0 & \text{if } k\in\mathbb{N}\setminus E. \end{cases}$$

Since $\delta_{\ln}(E)=0$, we have $st_{\ln}\text{-}\lim_{n} x_n = 0$. But $(C,1)\text{-}\lim_{n} x_n$ does not exist, because $\frac{1}{n}\sum_{m=1}^{n} x_m = \frac{E(n)}{n}$ $(n\in\mathbb{N})$ oscillates between values near $0$ and values near $1$; hence, $x$ being bounded, $st\text{-}\lim_{n} x_n$ does not exist either.

This completes the proof. □
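The quantities in this proof can be checked numerically for the first two blocks $E_2$ and $E_3$ (blocks with $k\ge 4$ are far too large to enumerate); the variable names below are ours. Right after the block $E_3$ ends, the counting ratio $E(n)/n$ is already near $2/3$, while the logarithmic ratio $\delta_n(E)$ stays small, in line with $\overline{d}(E)=1$ and $\delta_{\ln}(E)=0$:

```python
# First two blocks E_2 = {2^4+1, ..., 2^5} and E_3 = {3^9+1, ..., 3^10};
# blocks with k >= 4 are astronomically large, so we stop at k = 3.
blocks = [set(range(k ** (k * k) + 1, k ** (k * k + 1) + 1)) for k in (2, 3)]
E = set().union(*blocks)

n = 3 ** 10  # right endpoint of E_3, where the counting ratio E(n)/n peaks
count_ratio = sum(1 for j in E if j <= n) / n
l_n = sum(1.0 / k for k in range(1, n + 1))
log_ratio = sum(1.0 / j for j in E if j <= n) / l_n  # delta_n(E)

print(count_ratio)  # about 2/3: the ordinary density is large at block ends
print(log_ratio)    # much smaller: each block adds only about ln k to the numerator
```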

3 Main results

In the following theorem we establish the relation between logarithmic statistical convergence and Móricz’s statistical summability (H,1).

Theorem 3.1 If a sequence $x=(x_k)$ is bounded and logarithmic statistically convergent to $L$, then it is statistically summable $(H,1)$ to $L$, but not conversely.

Proof Let $x=(x_k)$ be bounded and logarithmic statistically convergent to $L$. Write $K_\epsilon := \{k\le n : \frac{1}{k}|x_k-L|\ge\epsilon\}$. Then

$$|\tau_n - L| = \biggl|\frac{1}{l_n}\sum_{k=1}^{n}\frac{x_k}{k} - L\biggr| = \biggl|\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}(x_k-L)\biggr| \le \frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L| \le \frac{1}{l_n}\sum_{k\in K_\epsilon}|x_k-L| \le \Bigl(\sup_{k}|x_k-L|\Bigr)\frac{|K_\epsilon|}{l_n} \to 0$$

as $n\to\infty$, which implies that $\tau_n\to L$ as $n\to\infty$. That is, $x$ is $(H,1)$-summable to $L$ and hence statistically summable $(H,1)$ to $L$.

For the converse, we consider the special case in which the weights $1/k$ are replaced by $1$, so that $l_n = n$ as above. Consider the sequence $x=(x_k)$ defined by

$$x_k = \begin{cases} 1 & \text{if } k \text{ is odd}, \\ 0 & \text{if } k \text{ is even}. \end{cases}$$

Of course, this sequence is not logarithmic statistically convergent. On the other hand, $x$ is $(H,1)$-summable to $\frac{1}{2}$ and hence statistically summable $(H,1)$ to $\frac{1}{2}$.

This completes the proof of the theorem. □

Remark 3.1 The above theorem is analogous to Theorem 2.1 of [15], but here it holds for any bounded sequence.

In the next result we establish the inclusion relation between logarithmic statistical convergence and $[H,1]_q$-summability.

Theorem 3.2 (a) If $0<q<\infty$ and a sequence $x=(x_k)$ is $[H,1]_q$-summable to the limit $L$, then it is logarithmic statistically convergent to $L$.

(b) If $(x_k)$ is bounded and logarithmic statistically convergent to $L$, then $x_k \to L\,[H,1]_q$.

Proof (a) If $0<q<\infty$ and $x_k \to L\,[H,1]_q$, then

$$0 \le \frac{\epsilon^q}{l_n}\,|K_\epsilon| \le \frac{1}{l_n}\sum_{\substack{k\le n\\ \frac{1}{k}|x_k-L|\ge\epsilon}}\frac{1}{k}|x_k-L|^q \le \frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^q \to 0$$

as $n\to\infty$. That is, $\lim_{n\to\infty}\frac{1}{l_n}|K_\epsilon|=0$ and so $\delta_{\ln}(K_\epsilon)=0$, where $K_\epsilon := \{k\le n : \frac{1}{k}|x_k-L|\ge\epsilon\}$. Hence $x=(x_k)$ is logarithmic statistically convergent to $L$.

(b) Suppose that $x=(x_k)$ is bounded and logarithmic statistically convergent to $L$. Then, for $\epsilon>0$, we have $\delta_{\ln}(K_\epsilon)=0$. Since $x\in\ell_\infty$, there exists $M>0$ such that $|x_k-L|\le M$ $(k=1,2,\ldots)$. We have

$$\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^q = \frac{1}{l_n}\sum_{\substack{k=1\\ k\notin K_\epsilon}}^{n}\frac{1}{k}|x_k-L|^q + \frac{1}{l_n}\sum_{\substack{k=1\\ k\in K_\epsilon}}^{n}\frac{1}{k}|x_k-L|^q = S_1(n)+S_2(n),$$

where

$$S_1(n)=\frac{1}{l_n}\sum_{\substack{k=1\\ k\notin K_\epsilon}}^{n}\frac{1}{k}|x_k-L|^q \quad\text{and}\quad S_2(n)=\frac{1}{l_n}\sum_{\substack{k=1\\ k\in K_\epsilon}}^{n}\frac{1}{k}|x_k-L|^q.$$

Now if $k\notin K_\epsilon$, then $S_1(n)<\epsilon^q$. For $k\in K_\epsilon$, we have

$$S_2(n)\le\Bigl(\sup_{k}|x_k-L|^q\Bigr)\frac{|K_\epsilon|}{l_n}\le M^q\,\frac{|K_\epsilon|}{l_n}\to 0$$

as $n\to\infty$, since $\delta_{\ln}(K_\epsilon)=0$. Hence $x_k \to L\,[H,1]_q$.

This completes the proof of the theorem. □

Remark 3.2 The above theorem is analogous to Theorem 2.2 of [15], but with fewer restrictions on the sequence $x=(x_k)$.

In the next result we characterize statistical summability $(H,1)$ in terms of $(H,1)$-summable subsequences.

Theorem 3.3 A sequence $x=(x_k)$ is statistically summable $(H,1)$ to $L$ if and only if there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta(K)=1$ and $(H,1)\text{-}\lim x_{k_n}=L$.

Proof Suppose that there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta(K)=1$ and $(H,1)\text{-}\lim x_{k_n}=L$. Then there is a positive integer $N$ such that for $n>N$,

$$|\tau_{k_n}-L|<\epsilon.$$
(3.1)

Put $K_\epsilon := \{n\in\mathbb{N} : |\tau_{k_n}-L|\ge\epsilon\}$ and $K'=\{k_{N+1},k_{N+2},\ldots\}$. Then $\delta(K')=1$ and $K_\epsilon\subseteq\mathbb{N}\setminus K'$, which implies that $\delta(K_\epsilon)=0$. Hence $x=(x_k)$ is statistically summable $(H,1)$ to $L$.

Conversely, let $x=(x_k)$ be statistically summable $(H,1)$ to $L$. For $r=1,2,3,\ldots$, put $K_r := \{j\in\mathbb{N} : |\tau_{k_j}-L|\ge 1/r\}$ and $M_r := \{j\in\mathbb{N} : |\tau_{k_j}-L|<1/r\}$. Then $\delta(K_r)=0$ and

$$M_1\supseteq M_2\supseteq\cdots\supseteq M_i\supseteq M_{i+1}\supseteq\cdots$$
(3.2)

and

$$\delta(M_r)=1,\quad r=1,2,3,\ldots.$$
(3.3)

Now we have to show that, for $j\in M_r$, $(x_{k_j})$ is $(H,1)$-summable to $L$. Suppose, on the contrary, that $(x_{k_j})$ is not $(H,1)$-summable to $L$. Then there is an $\epsilon>0$ such that $|\tau_{k_j}-L|\ge\epsilon$ for infinitely many terms. Let $M_\epsilon := \{j\in\mathbb{N} : |\tau_{k_j}-L|<\epsilon\}$ and choose $\epsilon>1/r$ $(r=1,2,3,\ldots)$. Then

$$\delta(M_\epsilon)=0,$$
(3.4)

and by (3.2), $M_r\subseteq M_\epsilon$. Hence $\delta(M_r)=0$, which contradicts (3.3). Therefore $(x_{k_j})$ is $(H,1)$-summable to $L$.

This completes the proof of the theorem. □

Similarly we can prove the following dual statement.

Theorem 3.4 A sequence $x=(x_k)$ is logarithmic statistically convergent to $L$ if and only if there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta_{\ln}(K)=1$ and $\lim_{n\to\infty} x_{k_n}=L$.