1 Introduction

Throughout this paper, \((X,\|\cdot \|)\) will denote a normed space. A sequence \((x_{n})\subset X\) is said to be strong Cesàro convergent to L if \(\lim_{n\to \infty}\frac{1}{n}\sum_{k=1}^{n}\|x_{k}-L\|=0\). Strong Cesàro convergence for real numbers was introduced by Hardy and Littlewood [15] and Fekete [14] in connection with the convergence of Fourier series (see [25] for historical notes and the recent monograph [6]).
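
For instance (a simple illustration, not taken from the cited references), the real sequence given by \(x_{k}=1\) if k is a perfect square and \(x_{k}=0\) otherwise is strong Cesàro convergent to 0, although it is not convergent, since

$$ \frac{1}{n}\sum_{k=1}^{n}\vert x_{k}\vert =\frac{\lfloor \sqrt{n}\rfloor}{n}\leq \frac{1}{\sqrt{n}}\to 0. $$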

A sequence \((x_{n})\subset X\) is said to be statistically convergent to L if for any \(\varepsilon >0\) the subset \(\{n : \|x_{n}-L\|>\varepsilon \}\) has zero density in \(\mathbb{N}\). The term statistical convergence was introduced independently by Fast [13] and Steinhaus [23] in 1951. Actually, a root of the notion of statistical convergence can be traced back to [26], where the author used the term almost convergence, which turned out to be equivalent to the concept of statistical convergence.
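
Returning to the illustrative sequence above (again, an example of ours), it is also statistically convergent to 0: for any \(\varepsilon >0\) the set \(\{k : \vert x_{k}\vert >\varepsilon \}\) is contained in the set of perfect squares, so

$$ \frac{\#\{k\leq n : \vert x_{k}\vert >\varepsilon \}}{n}\leq \frac{\lfloor \sqrt{n}\rfloor}{n}\to 0. $$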

Interest in statistical convergence arises from its important applications in Approximation Theory. For instance, Korovkin-type approximation results have been obtained in [4, 5, 12, 18, 20] for different types of statistical convergence. Furthermore, Korovkin-type results for double sequences are especially interesting (see the results of H. Aktuğlu [3] and the pioneering result of K. Demirci et al. [11]).

Strong Cesàro convergence and statistical convergence, two notions introduced at different times and in different contexts, are related by J. Connor’s result [10], which was sharpened by Khan and Orhan [16]. Among other results, Khan and Orhan [16] showed that a sequence is strongly Cesàro convergent if and only if it is statistically convergent and uniformly integrable. Recently, in [17], the authors obtained the Connor-Khan-Orhan result (see [10, 16]) for different kinds of statistical convergence defined by moduli.

In [19], Mursaleen and Edely obtained Connor’s result for double sequences. We stress here the difficulty of obtaining these results in several variables. First of all, in two dimensions, the convergent sequences are not necessarily bounded. Moreover, there are different methods to define the limit for double sequences.

In this paper, we aim to obtain the results in [17] for double sequences and for different types of statistical convergence defined by densities associated with a modulus function f.

Let us recall that \(f : \mathbb{R}^{+}\to \mathbb{R}^{+}\) is said to be a modulus function if it satisfies:

  1. \(f(x)=0 \) if and only if \(x=0\).

  2. \(f(x+y)\leq f(x)+f(y)\) for every \(x,y\in \mathbb{R}^{+}\).

  3. f is increasing.

  4. f is continuous from the right at 0.
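
As an elementary check (not part of the original argument), the function \(f(x)=\log (x+1)\), which appears in Example 2.4 below, satisfies the four conditions above; for instance, subadditivity follows from

$$ \log (1+x+y)\leq \log \bigl((1+x) (1+y)\bigr)=\log (1+x)+\log (1+y),\quad x,y\in \mathbb{R}^{+}. $$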

The results in [17] can be summarized as follows. For a given modulus function f, f-statistical convergence and f-strong Cesàro convergence are not always equivalent. It was shown in [17] that, for any modulus function f, an f-strong Cesàro convergent sequence is always f-statistically convergent. The converse is not true even for bounded sequences; the modulus functions f for which the converse does hold for bounded sequences were then characterized analytically. It was also proved that these are exactly the modulus functions for which every statistically convergent sequence is f-statistically convergent.

Since the Mursaleen-Edely result [19], a research program has been developed to explore different types of convergence for double sequences and their interconnection (see [7, 8]). For instance, there are efforts to understand different types of convergence (lacunary statistical, f-lacunary statistical of order α) for double sequences and to see their interconnections with the Cesàro-type convergence for double sequences (see [9, 21, 22, 24]).

The difficulty of this paper lies in the different ways of defining convergence for double sequences and in the different ways in which the concept of f-statistical convergence for double sequences could be defined. Moreover, the computations that appear in [17] become much more involved for double sequences and for statistical convergence defined by a modulus function f. At first glance, the concept of f-statistical convergence for double sequences (introduced in [1]) may seem artificial. However, as we will see, the definition is exactly the right one because it guarantees the regularity of the method.

In Sect. 2, we analyze the problem and point out the difficulties, and we show that when f is a compatible modulus function, there is an equivalent reformulation of f-statistical convergence for double sequences that is easier to use and simplifies the computations.

In Sect. 3, we introduce the concept of f-strong Cesàro convergence for double sequences. As we will see, this concept fits f-statistical convergence for double sequences like a glove. With this notion in hand, we establish that if a double sequence is f-strong Cesàro convergent, then it is f-statistically convergent to the same limit. We show that the converse is not true even for bounded sequences; however, we characterize the modulus functions f for which the converse does hold. We will also see in Sect. 3 that the equivalence between f-strong Cesàro convergence and strong Cesàro convergence for double sequences holds only for compatible modulus functions. Moreover, under the compatibility hypothesis on f, we show that a double sequence is f-statistically convergent if and only if it is statistically convergent.

2 Preliminary results

Throughout the paper, we denote by #A the cardinality of a finite set A. Every double limit we use will be considered in Pringsheim’s sense unless otherwise stated. The f-strong Cesàro convergence for double sequences is defined as follows:

Definition 2.1

Let f be a modulus function. A sequence \((x_{mn})\) is said to be f-strong Cesàro convergent to L if

$$ \lim_{m,n\to \infty} \frac{f (\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert )}{f(mn)}=0. $$

Let us observe that if f is bounded, then the constant sequence \(x_{mn}=L\) is the only sequence that is f-strong Cesàro convergent to L. Indeed, let us denote \(\|f\|_{\infty}=\sup_{n}|f(n)|\), and let us observe that if for some \((i,j) \in \mathbb{N} \times \mathbb{N}\) we have \(\|x_{ij}-L\|=c> 0\), then

$$ \frac{f(c)}{ \Vert f \Vert _{\infty}}\leq \frac{f (\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert )}{f(mn)}, $$

which gives the desired result. For this reason, from now on, we will assume that if f is a modulus function, then f is unbounded.
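
Let us also observe (a trivial but convenient remark) that for the identity \(f(x)=x\), which is an unbounded modulus function, Definition 2.1 reduces to the usual strong Cesàro convergence for double sequences, since

$$ \frac{f (\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert )}{f(mn)}= \frac{1}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert . $$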

In [2], by means of a new concept of density for subsets of \(\mathbb{N}\), the following non-matrix method of convergence was defined. A sequence \((x_{n})\) is said to be f-statistically convergent to L if for every \(\varepsilon >0\),

$$ \lim_{n\to \infty} \frac{f(\#\{ k\leq n : \Vert x_{k}-L \Vert >\varepsilon \})}{f(n)}=0. $$
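
For instance (an illustration of ours), taking the modulus function \(f(x)=\sqrt{x}\) and the indicator sequence of the perfect squares considered in the Introduction, for any \(\varepsilon >0\) we get

$$ \frac{f(\#\{ k\leq n : \vert x_{k}\vert >\varepsilon \})}{f(n)}\leq \frac{\sqrt{\lfloor \sqrt{n}\rfloor}}{\sqrt{n}}\leq n^{-1/4}\to 0, $$

so that sequence is f-statistically convergent to 0.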

At first glance, one might think that the natural extension of the above definition to double sequences can be stated with the following notion of density in two variables.

Given a modulus function f, a subset \(A\subseteq \mathbb{N}\times \mathbb{N}\) is said to have f-density \(d_{2,f}(A)\) if the following limit exists:

$$ d_{2,f}(A)= \lim_{m,n\to \infty} \frac{f(\#\{ (i,j)\in A, i\leq n \text{ and } j\leq m\})}{f(nm)}. $$

Given \((x_{ij})\) and L, we denote

$$ A_{\varepsilon}(m,n)=\bigl\{ (i,j), i\leq m\text{ and } j\leq n, \Vert x_{ij} - L \Vert >\varepsilon \bigr\} .$$

Based on the above notion of density, we could define f-statistical convergence as follows:

Let f be a modulus function. Then we say that a sequence \((x_{ij})\) “f-converges statistically” to L if for any \(\varepsilon >0\),

$$ \lim_{m,n\to \infty} \frac{f (\# A_{\varepsilon}(m,n) )}{f(nm)}=0. $$
(1)

However, as was pointed out in [1], with the above definition of f-density in hand, there are modulus functions f for which \(\mathbb{N}\times \mathbb{N}\) does not have f-density 1. For this reason, it was necessary to introduce a normalization factor in the f-statistical convergence of double sequences. Let \((x_{ij})\) be a double sequence and \(L\in X\); for any \(\varepsilon >0\), \(p\leq m\), and \(q\leq n\) (\(p,q,m,n\in \mathbb{N}\)), let us define the subsets \(A_{\varepsilon}(p,q,m,n)\):

$$ A_{\varepsilon}(p,q,m,n)= \bigl\{ (i,j) \in \mathbb{N} \times \mathbb{N} : p \leq i\leq m , q\leq j\leq n , \Vert x_{ij}-L \Vert > \varepsilon \bigr\} . $$

Definition 2.2

Let f be a modulus function. Then \((x_{ij})\) is f-statistically convergent to L if, for any \(\varepsilon >0\),

$$ \lim_{p,q\to \infty}\lim_{m,n\to \infty} \frac{f(\# A_{\varepsilon}(p,q,m,n))}{f(mn)}=0. $$
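
To illustrate why the normalization is needed (a computation of ours), take \(f(x)=\log (x+1)\) and the double sequence \(x_{1j}=1\) for all j, \(x_{ij}=0\) for \(i\geq 2\), which converges to 0 in Pringsheim’s sense. For any \(0<\varepsilon <1\), along the diagonal \(m=n\),

$$ \frac{f(\# A_{\varepsilon}(n,n))}{f(n^{2})}= \frac{\log (1+n)}{\log (1+n^{2})}\to \frac{1}{2}\neq 0, $$

so the limit in (1) cannot be 0, whereas \(\# A_{\varepsilon}(p,q,m,n)=0\) for every \(p\geq 2\), and therefore the iterated limit in Definition 2.2 equals 0.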

Let us point out that, in general, the above limits (in Pringsheim’s sense) may not exist. Next, we will see that for a compatible modulus function f, it is not necessary to introduce the above normalization factor, and Definition 2.2 is equivalent to the definition given by equation (1). Let us recall what a compatible modulus function is.

Definition 2.3

A modulus function f is said to be compatible if for any \(\varepsilon >0\), there exist \(\varepsilon '>0\) and \(n_{0}(\varepsilon )\in \mathbb{N}\) such that \(\frac{f(n\varepsilon ')}{f(n)}<\varepsilon \) for all \(n\geq n_{0}\).

Example 2.4

The following functions are modulus functions that are compatible: \(f(x)=x^{p}+x^{q}\) with \(0< p,q\leq 1\), \(f(x)=x^{p}+\log (x+1)\), and \(f(x)=x+\frac{x}{x+1}\). On the other hand, \(f(x)=\log (x+1)\) and \(f(x)=W(x)\) (where W is the Lambert W function restricted to \(\mathbb{R}^{+}\), that is, the inverse of \(xe^{x}\)) are modulus functions that are not compatible. Indeed, the function \(f(x)=x+\log (x+1)\) is compatible since

$$ \lim_{n\to \infty}\frac{f(n\varepsilon ')}{f(n)}=\lim_{n\to \infty} \frac{n\varepsilon '+\log (1+n\varepsilon ')}{n+\log (n+1)}= \varepsilon '. $$

On the other hand, if \(f(x)=\log (x+1)\), since

$$ \lim_{n\to \infty}\frac{\log (1+n\varepsilon ')}{\log (1+n)}=1, $$

we obtain that \(f(x)=\log (x+1)\) is not compatible.
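
Similarly (an elementary check of ours), for the power function \(f(x)=x^{p}\) with \(0< p\leq 1\), which is itself a modulus function, we have

$$ \frac{f(n\varepsilon ')}{f(n)}=\bigl(\varepsilon '\bigr)^{p}< \varepsilon \quad \text{whenever } \varepsilon '< \varepsilon ^{1/p}, $$

so f is compatible.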

Theorem 2.5

Let f be a compatible modulus function. Then \((x_{ij})\) is f-statistically convergent to L if and only if for any \(\varepsilon >0\)

$$ \lim_{m,n\to \infty} \frac{f (\# A_{\varepsilon}(m,n) )}{f(nm)}=0. $$

Proof

One implication follows directly, without the compatibility hypothesis on f. Let us observe that \(A_{\varepsilon}(p,q,m,n)\subseteq A_{\varepsilon}(m,n)\); hence, since f is increasing,

$$ f\bigl(\# A_{\varepsilon}(p,q,m,n)\bigr)\leq f\bigl(\# A_{\varepsilon}(m,n)\bigr), $$

and the implication follows by dividing the above inequality by \(f(mn)\) and taking limits as \(m,n\to \infty \).

Conversely, let \(\varepsilon >0\). Since \((x_{ij})\) is f-statistically convergent to L, there exist \((p,q)\in \mathbb{N}\times \mathbb{N}\) and \(N\in \mathbb{N}\) such that, if \(m\geq N>p\) and \(n\geq N>q\), we have that

$$ \frac{f(\# A_{\varepsilon}(p,q,m,n))}{f(mn)}< \frac{\varepsilon}{2}. $$
(2)

Since f is compatible, there exist \(\varepsilon '>0\) and \(n_{0}>N\) such that \(\frac{f(n\varepsilon ')}{f(n)}<\frac{\varepsilon}{2}\) for all \(n\geq n_{0}\). Enlarging N if necessary, we may also assume that

$$ \frac{p}{m}+\frac{q}{n}< \varepsilon ' $$

for all \(n, m\geq N\). Let us set \(B_{\varepsilon}(m,n)=A_{\varepsilon}(m,n)\setminus A_{\varepsilon}(p,q,m,n)\).

We claim that for all \(m,n\geq N\)

$$ \frac{f(\# B_{\varepsilon}(m,n))}{f(mn)}< \frac{\varepsilon}{2}. $$
(3)

Indeed, we observe that

$$ \# B_{\varepsilon}(m,n) \leq \# A_{\varepsilon}(p,n) + \# A_{\varepsilon}(m,q) \leq pn+qm. $$

Therefore, by dividing the above inequality by mn, we obtain

$$ \frac{\# B_{\varepsilon}(m,n)}{mn}\leq \frac{p}{m}+\frac{q}{n}< \varepsilon ' $$

for all \(m,n\geq N\). Thus, since f is increasing

$$ f\bigl(\# B_{\varepsilon}(m,n)\bigr)\leq f\bigl(\varepsilon ' mn\bigr), $$

which implies that

$$ \frac{f(\# B_{\varepsilon}(m,n))}{f(mn)}\leq \frac{f(\varepsilon ' mn)}{f(mn)}< \frac{\varepsilon}{2}, $$

the last inequality follows from the compatibility of f.

Finally, the result follows from (2) and (3). Indeed, since \(\# A_{\varepsilon}(m,n)=\# A_{\varepsilon}(p,q,m,n)+\# B_{\varepsilon}(m,n)\) and f is subadditive, if \(m,n\geq \max \{N,n_{0}\}\) we obtain:

$$ \frac{f(\# A_{\varepsilon}(m,n))}{f(mn)}\leq \frac{f(\# A_{\varepsilon}(p,q,m,n))}{f(mn)}+ \frac{f(\#B_{\varepsilon}(m,n))}{f(mn)}< \frac{\varepsilon}{2}+ \frac{\varepsilon}{2}=\varepsilon , $$

which yields the desired result. □

Equivalently, f is a compatible modulus function provided \(\lim_{\varepsilon\to 0}\limsup_{n}\frac{f(n\varepsilon)}{f(n)}=0\). We will say that a modulus function f is compatible of second order, or 2-compatible, provided \(\lim_{\varepsilon\to 0}\limsup_{n}\frac{f(n\varepsilon)}{f(n^{2})}=0\). Clearly, if f is compatible, then f is 2-compatible (since f is increasing, \(f(n^{2})\geq f(n)\)). The next result complements Theorem 2.5.
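
As an illustration (our computation), \(f(x)=\log (x+1)\) is not even 2-compatible: for any fixed \(\varepsilon >0\),

$$ \limsup_{n}\frac{f(n\varepsilon )}{f(n^{2})}=\lim_{n\to \infty} \frac{\log (1+n\varepsilon )}{\log (1+n^{2})}=\frac{1}{2}, $$

which does not tend to 0 as \(\varepsilon \to 0\).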

Proposition 2.6

Assume that for any f-statistically convergent double sequence \((x_{ij})\) and any \(\varepsilon >0\) we have

$$ \lim_{m,n}\frac{f(\#A_{\varepsilon}(0,0,m,n))}{f(mn)}=0 $$

Then f must be 2-compatible.

Proof

Indeed, assume that f is not 2-compatible. Let \((\varepsilon _{k})\) be a decreasing sequence converging to 0. Since f is not 2-compatible, there exists \(c>0\) such that, for each k, there exists \(m_{k}\) such that \(f(m_{k}\varepsilon _{k})>cf(m_{k}^{2})\). Moreover, we can select \(m_{k}\) inductively satisfying

$$ 1-\varepsilon _{k+1}-\frac{1}{m_{k+1}}> \frac{(1-\varepsilon _{k})m_{k}}{m_{k+1}}. $$
(4)

Now we use a standard argument for constructing subsets with prescribed densities. Set \(n_{k}=\lfloor m_{k}\varepsilon _{k}\rfloor +1\). Extracting a subsequence if necessary, we can assume that \(n_{1}< n_{2}<\cdots \) and \(m_{1}< m_{2}<\cdots \). Now set \(A_{k}=[m_{k+1}-(n_{k+1}-n_{k}), m_{k+1}]\cap \mathbb{N}\). Condition (4) guarantees that \(A_{k}\subset [m_{k},m_{k+1}]\).

Let us denote \(A=\bigcup_{k}A_{k}\), and \(x_{n}=\chi _{A}(n)\).

An easy check shows that the double sequence defined by \(x_{1,n}=x_{n}\) and \(x_{i,n}=0\) for \(i\geq 2\) is f-statistically convergent to zero, but \(\frac{f(\#A_{\varepsilon}(0,0,m_{k},m_{k}))}{f(m_{k}^{2})}\geq c\), which yields the desired result. □

3 Main results

Let us start this section by recalling the concept of f-strong Cesàro convergence for double sequences.

Definition 3.1

Let f be a modulus function. Then a double sequence \((x_{ij})\) is said to be f-strong Cesàro convergent to L if

$$ \lim_{m,n\to \infty} \frac{f (\sum_{i=1}^{n}\sum_{j=1}^{m} \Vert x_{ij}-L \Vert )}{f(mn)}=0. $$
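
For instance (an illustration of ours, not taken from the references), take the modulus function \(f(x)=\sqrt{x}\) and the double sequence \(x_{1j}=1\) for all j, \(x_{ij}=0\) for \(i\geq 2\). Then

$$ \frac{f (\sum_{i=1}^{n}\sum_{j=1}^{m} \Vert x_{ij}-0 \Vert )}{f(mn)}= \frac{\sqrt{m}}{\sqrt{mn}}=\frac{1}{\sqrt{n}}\to 0, $$

so this sequence is f-strong Cesàro convergent to 0.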

We aim to establish Mursaleen’s result [19] for f-statistical convergence and f-strong Cesàro convergence.

Theorem 3.2

Let f be a modulus function.

  1. (a)

    If \((x_{ij})\) is f-strong Cesàro convergent to L, then \((x_{ij})\) is strong Cesàro convergent to L.

  2. (b)

    Additionally, if f is compatible, then for every strong Cesàro convergent sequence \((x_{ij})\), we have that \((x_{ij})\) is f-strong Cesàro convergent.

Proof

Let us establish (a). Let \(p\in \mathbb{N}\). Since \((x_{ij})\) is f-strong Cesàro convergent to L, there exists \(N_{0}>0\) such that

$$ \frac{f (\sum_{i=1}^{n}\sum_{j=1}^{m} \Vert x_{ij}-L \Vert )}{f(mn)}< \frac{1}{p} $$

for all \(m,n\geq N_{0}\). That is,

$$ f \Biggl(\sum_{i=1}^{n}\sum _{j=1}^{m} \Vert x_{ij}-L \Vert \Biggr)< \frac{1}{p}f(mn)\leq f \biggl(\frac{mn}{p} \biggr). $$

To obtain the last inequality, it is enough to observe that the condition \(f(x+y)\leq f(x)+f(y)\) implies \(f(mn)=f(\frac{mn}{p}+\cdots +\frac{mn}{p})\leq p f(\frac{mn}{p})\).

Since f is increasing, we get

$$ \frac{\sum_{i=1}^{n}\sum_{j=1}^{m} \Vert x_{ij}-L \Vert }{mn}< \frac{1}{p} $$

for all \(m,n\geq N_{0}\). That is, the sequence \((x_{ij})\) is strong Cesàro convergent.

To prove (b), let us fix \(\varepsilon >0\) arbitrarily small. Since f is compatible, there exist \(\varepsilon '>0\) and \(n_{0}(\varepsilon ) \in \mathbb{N}\) such that

$$ \frac{f(n\varepsilon ')}{f(n)}< \varepsilon $$

for all \(n\geq n_{0}\). Since \((x_{ij})\) is strong Cesàro convergent to L, there exists \(N_{0} \in \mathbb{N}\) such that

$$ \sum_{i=1}^{n}\sum _{j=1}^{m} \Vert x_{ij}-L \Vert < \varepsilon ' mn $$

for all \(m,n\geq N_{0}\). Now, since f is increasing, dividing by \(f(mn)\), we obtain

$$ \frac{f (\sum_{i=1}^{n}\sum_{j=1}^{m} \Vert x_{ij}-L \Vert )}{f(mn)}< \frac{f(\varepsilon ' mn)}{f(mn)}< \varepsilon $$

for all \(m,n\geq \max \{n_{0}, N_{0}\}\). That is, \((x_{ij})\) is f-strong Cesàro convergent as desired. □

Remark 3.3

Let us observe that the compatibility condition on f is necessary in (b) of Theorem 3.2. Indeed, let us consider the modulus function \(f(x)=\log (1+x)\) and the sequence

$$ x_{ij}= \textstyle\begin{cases} 1 & i=j, \\ 0 & i\neq j \end{cases} $$

Then an easy check shows that \((x_{ij})\) is strong Cesàro convergent to 0; however, \((x_{ij})\) is not f-strong Cesàro convergent to 0.
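
For completeness (a routine computation), note that \(\sum_{i=1}^{n}\sum_{j=1}^{n}\Vert x_{ij}\Vert =n\), so along the diagonal \(m=n\)

$$ \frac{1}{n^{2}}\sum_{i=1}^{n}\sum_{j=1}^{n} \Vert x_{ij} \Vert = \frac{1}{n}\to 0, \qquad \frac{f (\sum_{i=1}^{n}\sum_{j=1}^{n} \Vert x_{ij} \Vert )}{f(n^{2})}= \frac{\log (1+n)}{\log (1+n^{2})}\to \frac{1}{2}, $$

which already shows that the f-strong Cesàro quotient cannot tend to 0.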

Given a modulus function f, it was proved in [1] that if a double sequence \((x_{ij})\) is f-statistically convergent to L, then \((x_{ij})\) is statistically convergent to L. Let us remark that the converse of this result holds for compatible modulus functions.

Theorem 3.4

Let f be a compatible modulus function. If \((x_{ij})\) is statistically convergent to L, then \((x_{ij})\) is f-statistically convergent to L.

Proof

Let us fix \(\varepsilon >0\). Since f is compatible, there exist \(\varepsilon '>0\) and \(n_{0} \in \mathbb{N}\) such that if \(n\geq n_{0}\) then

$$ \frac{f(n\varepsilon ')}{f(n)}< \varepsilon . $$

Let \(c>0\). Since \((x_{ij})\) is statistically convergent to L, there exists \(N_{0} \in \mathbb{N}\) such that if \(m,n\geq N_{0}\), we have

$$ \frac{\#\{(i,j) : i\leq n, j\leq m, \Vert x_{ij}-L \Vert >c\}}{mn}< \varepsilon '. $$

Since f is increasing, we obtain

$$ \frac{f (\#\{(i,j) : i\leq n, j\leq m, \Vert x_{ij}-L \Vert >c\} )}{f(mn)}\leq \frac{f(\varepsilon 'mn)}{f(mn)}< \varepsilon $$

for all \(m,n\geq \max \{n_{0},N_{0}\}\). Then, applying Theorem 2.5, we obtain the desired result. □

Now, we are going to see that the hypothesis on compatibility of the modulus function stated in Theorem 3.2(b) and Theorem 3.4 is not a coincidence.

Let us denote by \(\lfloor x\rfloor \) the integer part of \(x\in\mathbb{R}\).

Theorem 3.5

Let f be a modulus function, then:

  1. (a)

    If all statistically convergent sequences are f-statistically convergent, then f is compatible.

  2. (b)

    If all strong Cesàro convergent sequences are f-strong Cesàro convergent, then f is compatible.

Proof

If f is not compatible, then there exists \(c>0\) such that \(\limsup_{n}\frac{f(n\varepsilon )}{f(n)}>c\) for every \(\varepsilon >0\). Let \((\varepsilon _{k})\) be a decreasing sequence converging to 0. For each k, we can construct inductively an increasing sequence \((m_{k})\) satisfying \(f(m_{k}\varepsilon _{k})>cf(m_{k})\) and

$$ \varepsilon _{k+1}-\frac{m_{k}\varepsilon _{k}-1}{m_{k+1}}< \biggl(1- \sqrt{ \frac{m_{k}}{m_{k+1}}}-\frac{2}{\sqrt{m_{k+1}}} \biggr)^{2}. $$
(5)

Let us define \(n_{k}=\lfloor m_{k}\varepsilon _{k}\rfloor +1\) and set \(\ell _{k}=\lfloor \sqrt{m_{k}}\rfloor \). An easy check using (5) yields \((\ell _{k+1}-\ell _{k})^{2}>n_{k+1}-n_{k}\).

Let us fix \(A_{k+1} \subset [\ell _{k+1}-\lfloor \sqrt{n_{k+1}-n_{k}}\rfloor -1, \ell _{k+1}] \times [\ell _{k+1}-\lfloor \sqrt{n_{k+1}-n_{k}}\rfloor -1, \ell _{k+1}]\), a subset of \(\mathbb{N}\times \mathbb{N}\) such that \(\#(A_{k+1})=n_{k+1}-n_{k}\). Let us denote \(A=\bigcup_{k}A_{k}\) and set \(x_{i,j}=\chi _{A}(i,j)\). Let us see that \((x_{i,j})\) is statistically convergent to 0, but not f-statistically convergent. Indeed, for any \(m, n\in \mathbb{N}\), there exist p and q such that \(\ell _{p}< m\leq \ell _{p+1}\) and \(\ell _{q}< n\leq \ell _{q+1} \). Set \(k=\min \{p,q\}\). We can suppose without loss of generality that \(\min \{m,n\}\geq \ell _{k+1}-\lfloor \sqrt{n_{k+1}-n_{k}}\rfloor -1\). Hence, since \((\sqrt{m_{k+1}}-1)^{2}\leq \ell _{k+1}^{2}\leq m_{k+1}\), for any \(\varepsilon >0\):

$$\begin{aligned} \frac{\#\{(i,j) : i\leq m, j\leq n \text{ and } \vert x_{i,j} \vert >\varepsilon \}}{mn} \leq & \frac{n_{k}}{\ell _{k}^{2}}+ \frac{n_{k+1}-n_{k}}{[\ell _{k+1}-\lfloor \sqrt{n_{k+1}-n_{k}}\rfloor -1 ]^{2}} \\ \leq & \frac{n_{k}}{m_{k}}+ \frac{n_{k+1}-n_{k}}{(\sqrt{m_{k+1}}-\sqrt{n_{k+1}-n_{k}}-2)^{2}} \\ \leq & \frac{n_{k}}{m_{k}}+ \frac{\frac{n_{k+1}-n_{k}}{m_{k+1}}}{(1-\sqrt{\frac{n_{k+1}-n_{k}}{m_{k+1}}}-\frac{2}{\sqrt{m_{k+1}}})^{2}} \end{aligned}$$

which goes to zero as \(k\to \infty \), as desired. On the other hand, if we set

$$ A_{\varepsilon}(p,q,m,n)= \bigl\{ (i,j) \in \mathbb{N} \times \mathbb{N} : p\leq i\leq m , q\leq j\leq n , \vert x_{ij} \vert > \varepsilon \bigr\} , $$

we shall show that there exists ε such that the limit

$$ \lim_{p,q}\lim_{m,n} \frac{f(\#A_{\varepsilon}(p,q,m,n))}{f(mn)} $$

is not zero. Indeed,

$$ \frac{f(\#\{(i,j) :i\leq \ell _{k}, j\leq \ell _{k}, \vert x_{i,j} \vert >1/2\})}{f(\ell _{k}^{2}) } \geq \frac{f(n_{k})}{f(m_{k})}\geq \frac{f(m_{k}\varepsilon _{k})}{f(m_{k})}\geq c, $$

which gives that

$$ \lim_{m,n}\frac{f(\#A_{1/2}(0,0,m,n))}{f(mn)}\geq c. $$

On the other hand, for each p, q, there exist \(p'\), \(q'\) such that \(\ell _{p'}\leq p\leq \ell _{p'+1}\) and \(\ell _{q'}\leq q\leq \ell _{q'+1}\). Set \(s=\max \{p',q'\}\). Since \(A_{1/2}(0,0,m,n)\subset A_{1/2}(p,q,m,n)\cup A_{1/2}(s,s,m,n)\), we get that for any \(\delta >0\)

$$ \lim_{m,n}\frac{f(\#A_{1/2}(p,q,m,n))}{f(mn)} \geq \lim_{m,n} \frac{f(\#A_{1/2}(0,0,m,n))}{f(mn)}-\frac{f(n_{s})}{f(mn)}\geq c- \delta $$

which yields part (a). Part (b) follows using the same ideas. □

Next, we recall the result of Mursaleen and Edely [19], which extends a result of J. Connor [10] to double sequences.

Theorem 3.6

(Mursaleen [19])

If a double sequence \((x_{ij})\) is strong Cesàro convergent to L, then it is statistically convergent to L. Additionally, if the sequence is bounded, then the converse is also true.

This result connects two quite different concepts. Sometimes it is easier to verify that a double sequence is strong Cesàro convergent than to verify that it is statistically convergent. On the other hand, if the sequence is bounded and we do not know whether it has a limit, but we want to show that it is strong Cesàro convergent, it is better to check that the sequence is statistically Cauchy because, as was also proved by Mursaleen [19], statistically Cauchy sequences are statistically convergent. This relationship is very helpful. The impetus of this work is to determine what the situation is in the context of f-statistical convergence.

Theorem 3.7

Let \((x_{ij})\) be a double sequence and let f be a modulus function.

  1. (a)

    If \((x_{ij})\) is f-strong Cesàro convergent to L, then \((x_{ij})\) is f-statistically convergent to L.

  2. (b)

    Additionally, if the sequence is bounded, then the converse of the last statement is true if and only if f is a compatible modulus function.

Proof

To establish (a), we first show that it is enough to prove (6) for all \(r\in \mathbb{N}\):

$$ \lim_{p,q\to \infty}\lim_{m,n\to \infty} \frac{f(\# A_{\frac{1}{r}}(p,q,m,n))}{f(mn)}=0. $$
(6)

Indeed, let \(\varepsilon >0\) be small enough, then there exists \(r\in \mathbb{N}\) such that \(\frac{1}{r+1}\leq \varepsilon <\frac{1}{r}\). Hence, we obtain that for any \(p,q,m,n\in \mathbb{N}\)

$$ A_{\frac{1}{r}}(p,q,m,n) \subseteq A_{\varepsilon}(p,q,m,n)\subseteq A_{ \frac{1}{r+1}}(p,q,m,n), $$

and this implies that

$$ \# A_{\frac{1}{r}}(p,q,m,n) \leq \# A_{\varepsilon}(p,q,m,n) \leq \# A_{ \frac{1}{r+1}}(p,q,m,n). $$

Since f is increasing, dividing by \(f(mn)\), we get

$$ \frac{f(\# A_{\frac{1}{r}}(p,q,m,n))}{f(mn)} \leq \frac{f(\# A_{\varepsilon}(p,q,m,n))}{f(mn)} \leq \frac{f(\#A_{\frac{1}{r+1}}(p,q,m,n))}{f(mn)}, $$

and the result follows by taking limits. Thus, let us fix \(r\in \mathbb{N}\) and show that (6) is satisfied. Indeed, let \(p,q,m,n\in \mathbb{N}\) with \(p\leq m\) and \(q\leq n\); then

$$\begin{aligned} f \Biggl(\sum_{i=1}^{m} \sum_{j=1}^{n} \Vert x_{ij}-L \Vert \Biggr) & \geq f \Biggl(\sum _{i=1}^{m}\sum_{ \substack{j=1 \\ \Vert x_{ij}-L \Vert \geq \frac{1}{r}}}^{n} \Vert x_{ij}-L \Vert \Biggr) \end{aligned}$$
(7)
$$\begin{aligned} & \geq f \Biggl(\sum_{i=1}^{m}\sum _{ \substack{j=1 \\ \Vert x_{ij}-L \Vert \geq \frac{1}{r}}}^{n} \frac{1}{r} \Biggr) \geq \frac{1}{r} f \Biggl(\sum_{i=1}^{m} \sum_{ \substack{j=1 \\ (i,j)\in A_{\frac{1}{r}}(m,n)}}^{n} 1 \Biggr) \end{aligned}$$
(8)
$$\begin{aligned} &= \frac{1}{r} f \bigl(\# A_{\frac{1}{r}}(m,n) \bigr) \end{aligned}$$
(9)
$$\begin{aligned} &\geq \frac{1}{r} f \bigl(\# A_{\frac{1}{r}}(p,q,m,n) \bigr). \end{aligned}$$
(10)

Since \((x_{ij})\) is f-strong Cesàro convergent to L, we have that

$$ \lim_{m,n\to \infty} \frac{f (\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert )}{f(mn)}=0. $$

Therefore, dividing inequalities (7)-(10) by \(f(mn)\), we get that

$$ \lim_{m,n\to \infty} \frac{f (\# A_{\frac{1}{r}}(p,q,m,n) )}{f(mn)}=0 $$

for all \(p,q\in \mathbb{N}\), which gives that the sequence \((x_{ij})\) is f-statistically convergent to L as we desired.

To establish (b), let f be a compatible modulus function, and let us suppose that \((x_{ij})\) is a bounded sequence that is f-statistically convergent to L. Then, we will show that \((x_{ij})\) is f-strong Cesàro convergent. Let \(\varepsilon >0\) be small enough. By Theorem 2.5, since \((x_{ij})\) is f-statistically convergent to L, and f is compatible, we have that for any \(\varepsilon '>0\),

$$ \lim_{m,n\to \infty}\frac{f(\# A_{\varepsilon '}(m,n))}{f(mn)}=0. $$
(11)

Since f is compatible, there exist \(\varepsilon '>0\) and \(n_{0}(\varepsilon ) \in \mathbb{N}\) such that

$$ \frac{f(n\varepsilon ')}{f(n)}< \frac{\varepsilon}{2}, $$
(12)

for all \(n\geq n_{0}\). Let us consider \(M\in \mathbb{N}\) large enough such that \(\frac{1}{M}<\varepsilon '\) and \(\|x_{ij}\|_{\infty}+\|L\|\leq M\). By equation (11), applied with \(\frac{1}{M}\) in place of \(\varepsilon '\), there exists \(N_{0} \in \mathbb{N}\) such that

$$ \frac{f(\# A_{1/M}(m,n))}{f(mn)} < \frac{\varepsilon}{2M}, $$

for all \(m,n\geq N_{0}\). And by equation (12), since \(\frac{1}{M}<\varepsilon '\) and f is increasing, we have that for all \(n\geq n_{0}\),

$$ \frac{f (n\frac{1}{M} )}{f(n)}< \frac{\varepsilon}{2}. $$

Therefore,

$$ \frac{f (\sum_{i=1}^{m}\sum_{j=1}^{n} \Vert x_{ij}-L \Vert )}{f(mn)}\leq \frac{f (\sum_{ \substack{(i,j)\in A_{1/M}(m,n)}} \Vert x_{ij}-L \Vert )}{f(mn)}+ \frac{f (\sum_{ \substack{(i,j)\notin A_{1/M}(m,n)}} \Vert x_{ij}-L \Vert )}{f(mn)}. $$
(13)

Since f is increasing, we obtain the following

$$\begin{aligned} \frac{f (\sum_{ \substack{(i,j)\in A_{1/M}(m,n)}} \Vert x_{ij}-L \Vert )}{f(mn)} \leq &\frac {f (\sum_{ \substack{(i,j)\in A_{1/M}(m,n)}} M )}{f(mn)} = \frac {f ( \#A_{1/M}(m,n) M )}{f(mn)} \leq M\frac {\varepsilon}{2M} = \frac {\varepsilon}{2}, \end{aligned}$$

for all \(m,n\geq \max \{N_{0},n_{0}\}\). On the other hand, let us compute the second term of the inequality in (13)

$$ \frac{f (\sum_{ \substack{(i,j)\notin A_{1/M}(m,n)}} \Vert x_{ij}-L \Vert )}{f(mn)} \leq \frac{f (mn\frac{1}{M} )}{f(mn)} < \frac{\varepsilon}{2} $$

for all \(m,n\geq \max \{N_{0},n_{0}\}\). Thus, we obtain that \((x_{ij})\) is f-strong Cesàro convergent.

If f is not compatible, then there exist \(c>0\) and two sequences \((\varepsilon _{k})\), \((m_{k})\), with \(\varepsilon _{k}\) decreasing to 0 and \(m_{k}\) increasing, satisfying \(f(m_{k}\varepsilon _{k})\geq cf(m_{k})\). We set \(\ell _{k}=\lfloor \sqrt{m_{k}}\rfloor \); moreover, we can select \(m_{k}\) inductively such that the sequence

$$ r_{k+1}= \frac{m_{k+1}\varepsilon _{k+1}-m_{k}\varepsilon _{k}}{(\ell _{k+1}-\ell _{k})^{2}} $$

is decreasing and converges to zero. Again, it is direct to show that the double sequence \(x_{i,j}=\sum_{k}r_{k+1}\chi _{(\ell _{k},\ell _{k+1}]\times (\ell _{k},\ell _{k+1}]}(i,j)\) is f-statistically convergent to zero, but not f-strong Cesàro convergent. □