1 Introduction

The theory of probabilistic metric spaces (PM-spaces), a generalization of ordinary metric spaces, was introduced by Menger in [12]. In such a space, the distance between a pair of points is described by a distribution function rather than by a single deterministic number.

The concept of the generalized metric space (briefly G-metric space) was introduced by Mustafa and Sims in 2006 [16]. Then, in 2014, Zhou et al. [26] generalized the notion of PM-space to the G-metric spaces and defined the probabilistic generalized metric space which is denoted by PGM-space.

In [3], Choi et al. proposed a generalization of G-metric space named g-metric space with degree l, in which the distance function with degree \(l=1\) (resp. \(l=2\)) is equivalent to an ordinary metric (resp. a G-metric).

The idea of statistical convergence was first introduced by Steinhaus [25] for real sequences and developed by Fast [7], and it was later reintroduced by Schoenberg [22]. Many authors, such as [4, 6, 8, 9, 17, 21], have discussed and developed this concept. The theory of statistical convergence has many applications in various fields, such as approximation theory [5], finitely additive set functions [4], trigonometric series [27], and locally convex spaces [11].

In 2008, Sencimen and Pehlivan [24] introduced the concepts of statistically convergent sequence and statistically Cauchy sequence in the probabilistic metric space endowed with strong topology.

The purpose of this paper is to generalize the probabilistic G-metric space to a probabilistic g-metric space with degree l; the generalized space is still referred to as a PGM-space. The l-dimensional asymptotic density of a subset A of \(\mathbb{N}^{l}\), defined previously by the author in [1], is used to introduce statistically convergent and statistically Cauchy sequences with respect to the strong topology, and some basic facts about them are studied. Note that for \(l=1\) and \(l=2\) this definition coincides exactly with statistical convergence in a PM-space and in a PGM-space (based on the G-metric), respectively. Thus the definitions and results obtained here are more comprehensive.

2 Preliminaries

In this section, some basic definitions and results related to PM-space, PGM-space, and statistical convergence are presented and discussed. First, recall the definition of triangular norm (t-norm) as follows.

Definition 2.1

([23])

A mapping \(T:[0,1]\times [0,1] \to [0,1]\) is called a continuous t-norm if T satisfies the following conditions:

  1. (i)

    T is commutative and associative, i.e., \(T(a,b)=T(b,a)\) and \(T(a,T(b,c))=T(T(a,b),c)\) for all \(a,b,c\in [0,1]\);

  2. (ii)

    T is continuous;

  3. (iii)

    \(T(a,1)=a\) for all \(a\in [0,1]\);

  4. (iv)

    \(T(a,b)\leq T(c,d)\) whenever \(a\leq c\) and \(b\leq d\) for all \(a,b,c,d\in [0,1]\).
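
For concreteness, the minimum t-norm \(T_{M}(a,b)=\min \{a,b\}\) (used later in Example 3.9) and the product t-norm \(T_{P}(a,b)=ab\) are the standard examples of continuous t-norms. The following Python sketch is not part of the original text; it merely spot-checks conditions (i)–(iv) of Definition 2.1 on a finite grid, and the helper names are ours.

```python
import itertools

# Two standard continuous t-norms (illustrative helpers, not from the paper):
t_min = lambda a, b: min(a, b)        # T_M(a, b) = min{a, b}
t_prod = lambda a, b: a * b           # T_P(a, b) = a * b

def check_t_norm(T, grid):
    """Spot-check conditions (i)-(iv) of Definition 2.1 on a finite grid."""
    for a, b, c, d in itertools.product(grid, repeat=4):
        assert abs(T(a, b) - T(b, a)) < 1e-12                 # (i) commutativity
        assert abs(T(a, T(b, c)) - T(T(a, b), c)) < 1e-12     # (i) associativity
        assert abs(T(a, 1.0) - a) < 1e-12                     # (iii) boundary condition
        if a <= c and b <= d:
            assert T(a, b) <= T(c, d) + 1e-12                 # (iv) monotonicity

grid = [i / 4 for i in range(5)]       # {0, 0.25, 0.5, 0.75, 1}
check_t_norm(t_min, grid)
check_t_norm(t_prod, grid)
print("min and product pass the t-norm spot-check")
```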

A distribution function F is a map from the extended reals \(\mathbb{R}_{\infty }:=\mathbb{R}\cup \{-\infty , \infty \}\) into \([0, 1]\) that is nondecreasing, left-continuous at every real number, and satisfies \(F(-\infty )=0\) and \(F(\infty )=1\). The set of all distribution functions is denoted by Δ, and \(\Delta ^{+}:=\{F\in \Delta : F(0)=0\}\).

Definition 2.2

([23])

A Menger probabilistic metric space (PM-space) is a triple \((X,F,T)\), where X is a nonempty set, T is a continuous t-norm, and F is a mapping from \(X\times X\) into \(\Delta ^{+}\) satisfying the following conditions:

(\(F_{(x,y)}\) denotes the value of F at the pair \((x,y)\))

  1. (i)

    \(F_{(x,y)}(t)=1\) for all \(t>0\) if and only if \(x=y\);

  2. (ii)

    \(F_{(x,y)}(t)=F_{(y,x)}(t)\);

  3. (iii)

    \(F_{(x,y)}(t+s)\geq T (F_{(x,z)}(t), F_{(z,y)}(s) )\) for all \(x,y,z\in X\) and \(t,s\geq 0\).
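
Every ordinary metric space \((X,d)\) gives rise to a Menger PM-space by taking \(T=\min \) and letting \(F_{(x,y)}\) be the unit step at \(d(x,y)\), i.e., \(F_{(x,y)}(t)=1\) for \(t>d(x,y)\) and 0 otherwise. The Python sketch below is ours and purely illustrative; it numerically spot-checks condition (iii) of Definition 2.2 for the metric \(d(x,y)=|x-y|\) on \(\mathbb{R}\).

```python
import random

# Viewing an ordinary metric as a Menger PM-space (with T = min):
# F_{(x,y)}(t) = 1 if t > d(x, y), and 0 otherwise (a unit step at d(x, y)).
d = lambda x, y: abs(x - y)                      # ordinary metric on the reals
F = lambda x, y, t: 1.0 if t > d(x, y) else 0.0

random.seed(0)
for _ in range(10_000):                          # numeric spot-check of condition (iii)
    x, y, z = (random.uniform(-5, 5) for _ in range(3))
    t, s = random.uniform(0, 3), random.uniform(0, 3)
    assert F(x, y, t + s) >= min(F(x, z, t), F(z, y, s))
print("step-function PM-space passes the triangle spot-check")
```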

Definition 2.3

([16])

Let X be a nonempty set and let \(G:X\times X \times X \to \mathbb{R}^{+}\) be a function satisfying the following conditions:

  1. 1)

    \(G(x,y,z)=0\) if \(x=y=z\);

  2. 2)

    \(0< G(x,x,y)\) for all \(x,y \in X\) with \(x\neq y\);

  3. 3)

    \(G(x,x,y)\leq G(x,y,z)\) for all \(x,y,z\in X\) with \(z\neq y\);

  4. 4)

    \(G(x,y,z)=G(x,z,y)=G(y,z,x)=\cdots\) (symmetry in all three variables);

  5. 5)

    \(G(x,y,z)\leq G(x,a,a)+G(a,y,z)\) for all \(x,y,z,a \in X\).

Then the pair \((X, G)\) is called a G-metric space.
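
A standard example, which also appears in Example 3.9 below, is the "perimeter" G-metric \(G(x,y,z)=|x-y|+|x-z|+|y-z|\) on \(\mathbb{R}\). The Python sketch below is our illustration only; it spot-checks conditions 1)–5) of Definition 2.3 on random triples.

```python
import itertools
import random

# The "perimeter" G-metric on the reals, also used later in Example 3.9:
G = lambda x, y, z: abs(x - y) + abs(x - z) + abs(y - z)

random.seed(1)
for _ in range(10_000):
    x, y, z, a = (random.uniform(-5, 5) for _ in range(4))
    assert G(x, x, x) == 0                                     # condition 1)
    if x != y:
        assert G(x, x, y) > 0                                  # condition 2)
    if z != y:
        assert G(x, x, y) <= G(x, y, z) + 1e-12                # condition 3)
    values = {round(G(*p), 9) for p in itertools.permutations((x, y, z))}
    assert len(values) == 1                                    # condition 4), symmetry
    assert G(x, y, z) <= G(x, a, a) + G(a, y, z) + 1e-12       # condition 5)
print("perimeter G-metric passes the spot-check")
```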

The following definition extends the notion of a PM-space to the G-metric setting.

Definition 2.4

([26])

A Menger probabilistic G-metric space (PGM-space) is a triple \((X, G^{*}, T)\), where X is a nonempty set, T is a continuous t-norm, and \(G^{*}\) is a mapping from \(X\times X \times X\) into \(\Delta ^{+}\) satisfying the following conditions:

  1. (i)

    \(G^{*}_{(x,y,z)}(t)=1\) for all \(t>0\) if and only if \(x=y=z\);

  2. (ii)

    \(G^{*}_{(x,x,y)}(t)\geq G^{*}_{(x,y,z)}(t)\) for all \(x,y,z\in X\) with \(z\neq y\) and \(t>0\);

  3. (iii)

    \(G^{*}_{(x,y,z)}(t)=G^{*}_{(x,z,y)}(t)=G^{*}_{(y,x,z)}(t)=\cdots\) (symmetry in all three variables);

  4. (iv)

    \(G^{*}_{(x,y,z)}(t+s)\geq T(G^{*}_{(x,a,a)}(t), G^{*}_{(a,y,z)}(s))\) for all \(x,y,z,a\in X\) and \(s,t\geq 0\).

Definition 2.5

([26])

Let \((X, G^{*}, T)\) be a PGM-space and \(x_{0}\in X\). For \(\epsilon >0\) and \(0<\delta <1\), the \((\epsilon , \delta )\)-neighborhood of \(x_{0}\) is defined as follows:

$$ N_{x_{0}}(\epsilon , \delta )= \bigl\{ y\in X: G^{*}_{(x_{0}, y,y)}( \epsilon )>1-\delta , G^{*}_{(y, x_{0},x_{0})}(\epsilon )>1-\delta \bigr\} . $$

Definition 2.6

([26])

  1. (i)

    A sequence \(\{x_{n}\}\) in a PGM-space \((X, G^{*}, T)\) is said to be convergent to a point \(x\in X\) if, for every \(\epsilon >0\) and \(0<\delta <1\), there exists a positive integer \(M_{\epsilon ,\delta }\) such that \(x_{n}\in N_{x}(\epsilon , \delta )\) whenever \(n>M_{\epsilon ,\delta }\).

  2. (ii)

    A sequence \(\{x_{n}\}\) in a PGM-space \((X, G^{*}, T)\) is called a Cauchy sequence if, for every \(\epsilon >0\) and \(0<\delta <1\), there exists a positive integer \(M_{\epsilon ,\delta }\) such that \(G^{*}_{(x_{m}, x_{n},x_{l})}(\epsilon )>1-\delta \) whenever \(m, n, l>M_{\epsilon ,\delta }\).

  3. (iii)

    A PGM-space \((X, G^{*}, T)\) is said to be complete if every Cauchy sequence in X converges to a point in X.

In the following, some basic concepts of statistical convergence are discussed.

Definition 2.7

([7])

Let \(A\subset \mathbb{N}\) and \(A(n)=\{k\in A ; k\leq n\}\). Then the asymptotic density of A is defined as follows:

$$ \delta (A)=\lim_{n\to \infty }\frac{ \vert A(n) \vert }{n}. $$

If \(\delta (A)=1\) for a subset A of \(\mathbb{N}\), then A is said to be statistically dense. It is clear that \(\delta (\mathbb{N}\setminus A)=1-\delta (A)\) whenever \(\delta (A)\) exists.
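
For instance, the set of perfect squares has asymptotic density 0, since \(|A(n)|=\lfloor \sqrt{n}\rfloor \); hence the set of non-squares is statistically dense. This is exactly the index set exploited later in Example 3.9. The short Python sketch below (ours) approximates the limit of Definition 2.7 numerically.

```python
import math

def density(indicator, n):
    """Finite-n approximation |{k <= n : k in A}| / n of Definition 2.7."""
    return sum(1 for k in range(1, n + 1) if indicator(k)) / n

is_square = lambda k: math.isqrt(k) ** 2 == k
for n in (10**3, 10**4, 10**5, 10**6):
    # The values tend to 0, i.e. delta(squares) = 0 and the non-squares
    # form a statistically dense subset of the positive integers.
    print(n, density(is_square, n))
```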

Definition 2.8

([7])

A sequence \(\{x_{n}\}\) in \(\mathbb{R}\) is said to be statistically convergent to a point x in \(\mathbb{R}\) if, for each \(\epsilon >0\),

$$ \lim_{n\to \infty }\frac{1}{n} \bigl\vert \bigl\{ k\leq n: \vert x_{k}-x \vert \geq \epsilon \bigr\} \bigr\vert =0. $$
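
As a simple illustration of Definition 2.8, the real sequence with \(x_{k}=k\) when k is a perfect square and \(x_{k}=1/k\) otherwise is unbounded (hence divergent), yet it statistically converges to 0, because the exceptional indices are the squares together with finitely many small indices. The Python sketch below (ours) approximates the defining limit for a fixed \(\epsilon \).

```python
import math

# x_k = k on the perfect squares and x_k = 1/k elsewhere:
# divergent, but statistically convergent to 0.
x = lambda k: float(k) if math.isqrt(k) ** 2 == k else 1.0 / k

def st_defect(eps, n):
    """(1/n) |{k <= n : |x_k - 0| >= eps}|, the quantity in Definition 2.8."""
    return sum(1 for k in range(1, n + 1) if abs(x(k)) >= eps) / n

for n in (10**3, 10**5, 10**6):
    print(n, st_defect(0.01, n))       # tends to 0 as n grows
```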

For more information about statistical convergence, the reader is referred to [2, 4, 7–10, 13–15, 18–20].

3 Main results

In this section, the main definitions and results are introduced and discussed. First, consider the following definition, which generalizes a G-metric space to the l-dimensional case, where \(l\in \mathbb{N}\).

Definition 3.1

([3])

Let X be a nonempty set. A function \(g:X^{l+1}\longrightarrow \mathbb{R}_{+}\) is called a g-metric with degree l on X if it satisfies the following conditions:

  1. g1)

    \(g(x_{0}, x_{1},\ldots, x_{l})=0 \) if and only if \(x_{0}=x_{1}=\cdots=x_{l}\),

  2. g2)

    \(g(x_{0}, x_{1},\ldots, x_{l})=g(x_{\sigma (0)}, x_{\sigma (1)},\ldots, x_{\sigma (l)})\) for every permutation σ of \(\{0, 1,\ldots, l\}\),

  3. g3)

    \(g(x_{0}, x_{1},\ldots, x_{l})\leq g(y_{0}, y_{1},\ldots, y_{l})\) for all \((x_{0}, x_{1},\ldots, x_{l}), (y_{0}, y_{1},\ldots, y_{l})\in X^{l+1}\) with \(\{x_{i}: i=0, 1,\ldots, l\}\subseteq \{y_{i}: i=0, 1,\ldots, l\}\),

  4. g4)

    For all \(x_{0}, x_{1},\ldots, x_{s}, y_{0}, y_{1},\ldots, y_{t}, w\in X\) with \(s+t+1=l\),

    $$ g(x_{0}, x_{1},\ldots, x_{s}, y_{0}, y_{1},\ldots, y_{t})\leq g(x_{0}, x_{1},\ldots, x_{s}, w, w,\ldots, w)+g(y_{0}, y_{1},\ldots, y_{t}, w, w,\ldots, w). $$

The pair \((X,g)\) is called a g-metric space. It is noteworthy that, if \(l=1\) (resp. \(l=2\)), then it is equivalent to an ordinary metric space (resp. G-metric space).
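
As a concrete illustration (our own example, not taken from [3]), the "diameter" function \(g(x_{0}, x_{1},\ldots, x_{l})=\max_{i}x_{i}-\min_{i}x_{i}\) on \(\mathbb{R}\) satisfies g1)–g4) for every degree l. The Python sketch below spot-checks the axioms for degree \(l=3\) on random integer tuples; the variable names are ours.

```python
import itertools
import random

L = 3                                      # degree l used in this demonstration
g = lambda *xs: max(xs) - min(xs)          # candidate g-metric: the "diameter"

random.seed(2)
for _ in range(5_000):
    pts = [random.randint(-5, 5) for _ in range(L + 1)]
    w = random.randint(-5, 5)
    assert (g(*pts) == 0) == (len(set(pts)) == 1)                     # g1)
    assert len({g(*p) for p in itertools.permutations(pts)}) == 1     # g2)
    xs = [random.choice(pts) for _ in range(L + 1)]   # {x_i} is a subset of {pts}
    assert g(*xs) <= g(*pts)                                          # g3)
    x0, x1, y0, y1 = pts                              # s = t = 1, so s + t + 1 = L
    assert g(x0, x1, y0, y1) <= g(x0, x1, w, w) + g(y0, y1, w, w)     # g4)
print("diameter function passes the degree-3 g-metric spot-check")
```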

Definition 3.2

([3])

Let \((X,g)\) be a g-metric space, \(x\in X\) be a point, and \(\{x_{k}\}\subseteq X\) be a sequence.

  1. 1)

    \(\{x_{k}\}\) g-converges to x if for all \(\epsilon >0\) there exists \(N\in \mathbb{N}\) such that

    $$ i_{1},\ldots, i_{l}\geq N\quad \Longrightarrow \quad g(x, x_{i_{1}},\ldots, x_{i_{l}})< \epsilon . $$
  2. 2)

    \(\{x_{k}\}\) is said to be g-Cauchy if for all \(\epsilon >0\) there exists \(N\in \mathbb{N}\) such that

    $$ i_{0}, i_{1},\ldots, i_{l}\geq N\quad \Longrightarrow\quad g(x_{i_{0}}, x_{i_{1}},\ldots, x_{i_{l}})< \epsilon . $$
  3. 3)

    \((X, g)\) is complete if every g-Cauchy sequence in \((X, g)\) is g-convergent in \((X, g)\).

Now, by combining Definition 2.4 with the notion of a g-metric, we introduce the following generalization.

Definition 3.3

A Menger probabilistic g-metric space (still denoted a PGM-space) is a triple \((X, F, T)\), where X is a nonempty set, T is a continuous t-norm, and F is a mapping from \(X^{l+1}\) into \(\Delta ^{+}\) satisfying the following conditions:

  1. (i)

    \(F_{(x_{0}, x_{1},\ldots, x_{l})}(t)=1\) for all \(t>0\) if and only if \(x_{0}= x_{1}= \cdots= x_{l}\);

  2. (ii)

    \(F_{(x_{0}, x_{1},\ldots, x_{l})}(t)\geq F_{(y_{0}, y_{1},\ldots, y_{l})}(t)\) for all \((x_{0}, x_{1},\ldots, x_{l}), (y_{0}, y_{1},\ldots, y_{l})\in X^{l+1}\) with \(\{x_{i}: i=0, 1,\ldots, l\}\subseteq \{y_{i}: i=0, 1,\ldots, l\}\);

  3. (iii)

    \(F_{(x_{0}, x_{1},\ldots, x_{l})}(t)=F_{(x_{\sigma (0)}, x_{\sigma (1)},\ldots, x_{\sigma (l)})}(t)\) for every permutation σ of \(\{0, 1,\ldots, l\}\);

  4. (iv)

    For all \(x_{0}, x_{1},\ldots, x_{m}, y_{0}, y_{1},\ldots, y_{n}, w\in X\) with \(m+n+1=l\),

    $$ F_{(x_{0}, x_{1},\ldots, x_{m}, y_{0}, y_{1},\ldots, y_{n})}(t+s)\geq T\bigl(F_{(x_{0}, x_{1},\ldots, x_{m}, w, w,\ldots, w)}(t), F_{(y_{0}, y_{1},\ldots, y_{n}, w, w,\ldots, w)}(s)\bigr). $$
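
By analogy with Example 3.9 below, every g-metric g of degree l induces a natural candidate for such a mapping: set \(F_{(x_{0},\ldots,x_{l})}(t)=1-e^{-t/g(x_{0},\ldots,x_{l})}\) for \(t>0\) when the points are not all equal, \(F_{(x_{0},\ldots,x_{l})}(t)=1\) for \(t>0\) on the diagonal, and \(F=0\) for \(t\leq 0\), together with \(T=\min \). This construction is our illustration rather than part of the original text; the Python sketch below spot-checks condition (iv) numerically, reusing the diameter g-metric of the previous sketch with \(l=3\) and \(m=n=1\).

```python
import math
import random

L = 3
g = lambda *xs: max(xs) - min(xs)          # the diameter g-metric of degree L

def F(args, t):
    """Candidate probabilistic g-metric induced by g, mirroring Example 3.9."""
    if t <= 0:
        return 0.0
    if len(set(args)) == 1:                # all points coincide
        return 1.0
    return 1.0 - math.exp(-t / g(*args))

T = min                                    # the minimum t-norm

random.seed(3)
for _ in range(5_000):                     # spot-check of condition (iv), m = n = 1
    x0, x1, y0, y1, w = (random.randint(-5, 5) for _ in range(5))
    t, s = random.uniform(0.01, 3.0), random.uniform(0.01, 3.0)
    lhs = F((x0, x1, y0, y1), t + s)
    rhs = T(F((x0, x1, w, w), t), F((y0, y1, w, w), s))
    assert lhs >= rhs - 1e-12
print("induced probabilistic g-metric passes the (iv) spot-check")
```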

In the following, using the generalization of asymptotic density given in [1], statistically convergent and statistically Cauchy sequences in a PGM-space are introduced.

Definition 3.4

Let \((X, F,T)\) be a PGM-space. For any \(\epsilon >0\), \(0<\delta <1\) and \(x\in X\), the strong \((\epsilon , \delta )\)-vicinity of x is defined by the subset \(M_{x}(\epsilon , \delta )\) of \(X^{l}\) as follows:

$$ M_{x}(\epsilon , \delta )=\bigl\{ (x_{1}, x_{2}, \ldots, x_{l})\in X^{l}; F_{(x, x_{1}, x_{2},\ldots, x_{l})}(\epsilon )>1-\delta \bigr\} . $$

Next, we generalize the concept of asymptotic density of a set to the l-dimensional case.

Definition 3.5

Let \(K\subset \mathbb{N}^{l}\). The l-dimensional asymptotic density of K is defined by

$$ \delta _{l}(K)=\lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in K; i_{1}, i_{2},\ldots, i_{l} \leq n\bigr\} \bigr\vert . $$
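
One way to read the count in Definition 3.5 concretely, and the reading we adopt in the sketch below, is over tuples with strictly increasing coordinates; under this reading the factor \(l!/n^{l}\) is consistent with the binomial counts used in the proofs that follow. With that caveat, the Python sketch below (ours) approximates \(\delta _{2}(K)\) for the set K of pairs whose coordinates are both non-squares, and the values approach 1.

```python
import math

def delta_2(in_K, n):
    """Finite-n approximation of the 2-dimensional density of Definition 3.5,
    counting pairs with strictly increasing coordinates (our reading)."""
    count = sum(1 for i in range(1, n + 1)
                  for j in range(i + 1, n + 1) if in_K(i, j))
    return math.factorial(2) * count / n**2

non_square = lambda k: math.isqrt(k) ** 2 != k
K = lambda i, j: non_square(i) and non_square(j)
for n in (200, 1000, 2000):
    print(n, delta_2(K, n))                # tends to 1
```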

Definition 3.6

Let \((X, F,T)\) be a PGM-space.

  1. (i)

    A sequence \(\{x_{n}\}\) in X is statistically convergent to a point x in X w.r.t. strong topology if, for any \(\epsilon >0\) and \(0<\delta <1\),

    $$ \delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )\leq 1- \delta \bigr\} \bigr)=0, $$

    and is denoted by \(x_{n} \overset{st}{\longrightarrow } x\) or \(st-\lim_{n\to \infty }x_{n}=x\).

  2. (ii)

    \(\{x_{n}\}\) is said to be statistically Cauchy w.r.t. strong topology if, for all \(\epsilon >0\) and \(0<\delta <1\), there exists \(i_{\epsilon }\in \mathbb{N}\) such that

    $$ \delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}(\epsilon )\leq 1- \delta \bigr\} \bigr)=0. $$

We can restate part \((i)\) of the above definition as follows:

(\(i^{\prime }\)):

\(x_{n} \overset{st}{\longrightarrow } x\) if and only if, for any \(\epsilon >0\) and \(0<\delta <1\),

$$ \delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: (x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}})\notin M_{x}(\epsilon , \delta )\bigr\} \bigr)=0. $$

The following theorem shows that if a sequence is statistically convergent to a point in X, then that point is unique.

Theorem 3.7

Let \(\{x_{n}\}\) be a sequence in a PGM-space \((X, F, T)\) such that \(x_{n} \overset{st}{\longrightarrow } x\) and \(x_{n} \overset{st}{\longrightarrow } y\). Then \(x=y\).

Proof

Let \(\epsilon >0\) and \(0<\delta <1\). By the continuity of T, there exists \(0<\delta _{0}<1\) such that

$$ T (1-\delta _{0}, 1-\delta _{0} )>1-\delta . $$

Set

$$\begin{aligned}& A(\epsilon ,\delta ):=\biggl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}\biggl( \frac{\epsilon }{2}\biggr)\leq 1-\delta _{0} \biggr\} , \\& B(\epsilon ,\delta ):=\biggl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, y)}\biggl( \frac{\epsilon }{2}\biggr)\leq 1-\delta _{0} \biggr\} , \end{aligned}$$

and

$$ C(\epsilon ,\delta ):=A(\epsilon ,\delta )\cup B(\epsilon ,\delta ). $$

Since \(x_{n} \overset{st}{\longrightarrow } x\) and \(x_{n} \overset{st}{\longrightarrow } y\), we have \(\delta _{l}(A(\epsilon ,\delta ))=\delta _{l}(B(\epsilon ,\delta ))=0\); hence \(\delta _{l}(C(\epsilon ,\delta ))=0\) and therefore \(\delta _{l}(C^{c}(\epsilon ,\delta ))=1\). In particular, \(C^{c}(\epsilon ,\delta )\) is nonempty, so we may choose \((i_{1}, i_{2},\ldots, i_{l})\in C^{c}(\epsilon ,\delta )\). Then, by parts (iv), (iii), and (ii) of Definition 3.3 (with \(w=x_{i_{1}}\)) and part (iv) of Definition 2.1, we have

$$\begin{aligned} F_{(x, y, y,\ldots, y)}(\epsilon )&\geq T \biggl(F_{(x_{i_{1}}, x_{i_{1}},\ldots, x_{i_{1}}, x)}\biggl( \frac{\epsilon }{2}\biggr), F_{(x_{i_{1}}, y, y,\ldots, y)}\biggl( \frac{\epsilon }{2}\biggr) \biggr) \\ &\geq T \biggl(F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}\biggl( \frac{\epsilon }{2}\biggr), F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, y)}\biggl( \frac{\epsilon }{2}\biggr) \biggr) \\ &\geq T (1-\delta _{0}, 1-\delta _{0} ) \\ &>1-\delta . \end{aligned}$$

Since \(0<\delta <1\) is arbitrary, we conclude that \(F_{(x, y, y,\ldots, y)}(\epsilon )=1\); as \(\epsilon >0\) is also arbitrary, part (i) of Definition 3.3 yields \(x=y\). □

Theorem 3.8

Every convergent sequence in a PGM-space is statistically convergent.

Proof

Let \(\{x_{n}\}\) be a sequence in the PGM-space \((X,F, T)\) that converges to a point \(x\in X\). For \(\epsilon >0\) and \(0<\delta <1\), there exists \(n_{0}\in \mathbb{N}\) such that, for all \(i_{1}, i_{2},\ldots, i_{l}\geq n_{0}\),

$$ F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )>1-\delta . $$

Set

$$ A(n):=\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: i_{1}, i_{2},\ldots, i_{l}\leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )>1- \delta \bigr\} , $$

then

$$ \bigl\vert A(n) \bigr\vert \geq \binom{n-n_{0}}{l} $$

and

$$ \lim_{n\to \infty }\frac{l! \vert A(n) \vert }{n^{l}}\geq \lim_{n\to \infty } \frac{l!}{n^{l}} \binom{n-n_{0}}{l}=1, $$

since \(l!\binom{n-n_{0}}{l}=(n-n_{0}) (n-n_{0}-1)\cdots (n-n_{0}-l+1)\sim n^{l}\) as \(n\to \infty \). Hence

$$ st-\lim_{n\to \infty } x_{n}=x. $$

 □

Example 3.9 shows that the converse of the above theorem is not valid.

Example 3.9

Let \(X=\mathbb{R}\) and \(G:\mathbb{R}\times \mathbb{R}\times \mathbb{R}\longrightarrow \mathbb{R}^{+}\) be a G-metric on \(\mathbb{R}\) defined by

$$ G(x,y,z)= \vert x-y \vert + \vert x-z \vert + \vert y-z \vert . $$

Take \(T(a,b)=\min \{a,b\}\) and define a mapping \(F:\mathbb{R}\times \mathbb{R}\times \mathbb{R}\longrightarrow \Delta ^{+}\) as follows:

$$ F_{(x,y,z)}(t)= \textstyle\begin{cases} H(t), & x=y=z, \\ \mathcal{D}{(\frac{t}{G(x,y,z)})}, & \text{otherwise,} \end{cases} $$

where \(H(t)\) and \(\mathcal{D}(t)\) are distribution functions as follows:

$$ H(t)= \textstyle\begin{cases} 0, & t\leq 0, \\ 1, & t>0, \end{cases}\displaystyle \quad\quad \mathcal{D}(t)= \textstyle\begin{cases} 0, & t\leq 0, \\ 1-e^{-t}, & t>0. \end{cases} $$

Now, consider the following sequence in \(\mathbb{R}\):

$$ x_{n}= \textstyle\begin{cases} n , & n \text{ is a perfect square,} \\ 1 , & \text{otherwise.} \end{cases} $$

It is clear that \(\{x_{n}\}\) statistically converges to 1, but it is not convergent in the ordinary sense.
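
To make this claim concrete, the Python sketch below (ours) estimates, for fixed \(\epsilon \) and δ and with \(l=2\), the density of index pairs violating \(F_{(x_{i}, x_{j}, 1)}(\epsilon )>1-\delta \), using the same increasing-pairs approximation as in the sketch after Definition 3.5. The estimate decreases toward 0, in line with statistical convergence to 1, while the subsequence along the squares witnesses the failure of ordinary convergence.

```python
import math

G = lambda x, y, z: abs(x - y) + abs(x - z) + abs(y - z)

def F(x, y, z, t):
    """The probabilistic G-metric of Example 3.9 (with H the unit step)."""
    if t <= 0:
        return 0.0
    if x == y == z:
        return 1.0
    return 1.0 - math.exp(-t / G(x, y, z))

x = lambda n: n if math.isqrt(n) ** 2 == n else 1     # the sequence of Example 3.9

def bad_density(eps, delta, n):
    """Approximate density (increasing pairs, l = 2) of the set in Definition 3.6(i),
    i.e. of the pairs (i, j) with F_{(x_i, x_j, 1)}(eps) <= 1 - delta."""
    bad = sum(1 for i in range(1, n + 1)
                for j in range(i + 1, n + 1) if F(x(i), x(j), 1, eps) <= 1 - delta)
    return 2 * bad / n**2

for n in (100, 400, 1600):
    print(n, bad_density(0.5, 0.5, n))    # decreases toward 0
```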

Definition 3.10

A set \(A=\{n_{k}: k\in \mathbb{N}\}\) is said to be statistically dense in \(\mathbb{N}\) if the set

$$ A(n)=\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in A^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n\bigr\} $$

has l-dimensional asymptotic density 1, i.e.,

$$ \delta _{l}(A)=\lim_{n\to \infty }\frac{l! \vert A(n) \vert }{n^{l}}=1. $$

Theorem 3.11

Let \(\{x_{n}\}\) be a sequence in the PGM-space \((X, F, T)\). Then the following are equivalent:

  1. (i)

    \(\{x_{n}\}\) statistically converges to a point \(x \in X\).

  2. (ii)

    There is a sequence \(\{y_{n}\}\) in X such that \(x_{n}=y_{n}\) for almost all n, and \(\{y_{n}\}\) converges to x.

  3. (iii)

    There is a statistically dense subsequence \(\{x_{n_{k}}\}\) of \(\{x_{n}\}\) such that \(\{x_{n_{k}}\}\) is convergent.

  4. (iv)

    There is a statistically dense subsequence \(\{x_{n_{k}}\}\) of \(\{x_{n}\}\) such that \(\{x_{n_{k}}\}\) is statistically convergent.

Proof

\((i\Longrightarrow \mathit{ii})\) Let \(\{x_{n}\}\) be a sequence that statistically converges to x; then, for every \(\epsilon >0\) and \(0<\delta <1\),

$$\begin{aligned} &\delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1- \delta \bigr\} \bigr) \\ &\quad =\lim_{n\to \infty }\frac{l!}{n^{l}}\bigl| \bigl\{ (i_{1}, i_{2},\ldots, i_{l}) \in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1-\delta \bigr\} \bigr|=1. \end{aligned}$$

We can choose a strictly increasing sequence \(\{n_{k}\}\) of positive integers such that, for each \(k\in \mathbb{N}\) and every \(n>n_{k}\),

$$ \frac{l!}{n^{l}} \biggl\vert \biggl\{ (i_{1}, i_{2}, \ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}( \epsilon )> 1- \frac{1}{2^{k}}\biggr\} \biggr\vert >1-\frac{1}{2^{k}}. $$

Define the sequence \(\{y_{n}\}\) as follows:

$$ y_{m}= \textstyle\begin{cases} x_{m}, & 1\leq m\leq n_{1}, \\ x_{m}, & n_{k}< m\leq n_{k+1}, i_{1}, i_{2},\ldots, i_{l-1}\leq n_{k+1}, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l-1}}, x_{m}, x)}( \epsilon )> 1-\frac{1}{2^{k}}, \\ x, & \text{otherwise}. \end{cases} $$

Choose \(k\in \mathbb{N}\) such that \(\frac{1}{2^{k}}<\delta \). It is clear that \(\{y_{m}\}\) converges to x. Fix \(n\in \mathbb{N}\) with \(n_{k}< n\leq n_{k+1}\); then we have

$$\begin{aligned} \begin{aligned} &\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n ; x_{i_{j}}\neq y_{i_{j}} \bigr\} \\ &\quad \subseteq \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n \bigr\} \\ &\quad\quad{} - \biggl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n_{k} , F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}( \epsilon )> 1- \frac{1}{2^{k}} \biggr\} . \end{aligned} \end{aligned}$$

Hence,

$$\begin{aligned} &\lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n ; x_{i_{j}}\neq y_{i_{j}} \bigr\} \bigr\vert \\ &\quad \leq 1-\lim_{n\to \infty }\frac{l!}{n^{l}} \biggl\vert \biggl\{ (i_{1}, i_{2},\ldots, i_{l}) \in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n_{k} , F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1- \frac{1}{2^{k}} \biggr\} \biggr\vert \\ &\quad < \frac{1}{2^{k}}< \delta , \end{aligned}$$

so

$$\begin{aligned} &\delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n ; x_{i_{j}}\neq y_{i_{j}} \bigr\} \bigr) \\ &\quad =\lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l}) \in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n ; x_{i_{j}} \neq y_{i_{j}} \bigr\} \bigr\vert =0. \end{aligned}$$

\((\mathit{ii} \Longrightarrow \mathit{iii})\) Let \(\{y_{n}\}\) be a sequence converging to x with \(x_{n}=y_{n}\) for almost all n, and set \(A=\{n\in \mathbb{N}: y_{n}=x_{n}\}\). Then \(\delta _{l}(A)=1\), so \(\{x_{n}\}_{n\in A}\) is a statistically dense subsequence of \(\{x_{n}\}\) that is convergent.

\((\mathit{iii} \Longrightarrow \mathit{iv})\) It is obvious from Theorem 3.8.

\((\mathit{iv} \Longrightarrow i)\) Let \(\{x_{n_{k}}\}\) be a statistically dense subsequence of \(\{x_{n}\}\) that is statistically convergent to a point \(x\in X\). Set \(A=\{n_{k}: k\in \mathbb{N}\}\), so \(\delta _{l}(A)=1\). For \(\epsilon >0\) and \(0<\delta <1\),

$$\begin{aligned} &\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1- \delta \bigr\} \\ &\quad \supseteq \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in A^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}( \epsilon )> 1-\delta \bigr\} . \end{aligned}$$

Hence,

$$\begin{aligned} &\lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1-\delta \bigr\} \bigr\vert \\ &\quad \geq \lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l}) \in A^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon )> 1-\delta \bigr\} \bigr\vert =1. \end{aligned}$$

So,

$$ \delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}( \epsilon )> 1-\delta \bigr\} \bigr)=1. $$

Therefore \(\{x_{n}\}\) statistically converges to x. □

The following corollary is a direct consequence of the above theorem.

Corollary 3.12

Every statistically convergent sequence in a PGM-space has a convergent subsequence.

Theorem 3.13

Every statistically convergent sequence in a PGM-space is statistically Cauchy.

Proof

Suppose that \(\{x_{n}\}\) is a sequence that statistically converges to a point x. Let \(\epsilon >0\) and \(0<\delta <1\). Since T is continuous, there are \(0<\delta _{1}<1\) and \(0<\delta _{2}<1\) such that \(T(1-\delta _{1}, 1-\delta _{2})>1-\delta \). On the other hand, by the statistical convergence of \(\{x_{n}\}\) together with part (ii) of Definition 3.3, there exists \(i_{\epsilon }\in \mathbb{N}\) such that

$$ F_{(x_{i_{\epsilon }}, x, x,\ldots, x)}\biggl(\frac{\epsilon }{2}\biggr)>1-\delta _{1}. $$

By part (iv) of Definition 3.3 (applied with \(w=x\)),

$$ F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}(\epsilon )\geq T \biggl(F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}\biggl( \frac{\epsilon }{2}\biggr), F_{(x_{i_{\epsilon }}, x, x,\ldots, x)}\biggl(\frac{\epsilon }{2}\biggr) \biggr), $$

so, in view of \(F_{(x_{i_{\epsilon }}, x, x,\ldots, x)}(\frac{\epsilon }{2})>1-\delta _{1}\), the monotonicity of T, and \(T(1-\delta _{1}, 1-\delta _{2})>1-\delta \),

$$\begin{aligned} & \biggl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}\biggl(\frac{\epsilon }{2}\biggr)> 1- \delta _{2} \biggr\} \\ &\quad \subseteq \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2}, \ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}( \epsilon )> 1-\delta \bigr\} . \end{aligned}$$

Hence

$$\begin{aligned} &\lim_{n\to \infty }\frac{l!}{n^{l}} \biggl\vert \biggl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}\biggl(\frac{\epsilon }{2}\biggr)> 1- \delta _{2} \biggr\} \biggr\vert \\ &\quad \leq \lim_{n\to \infty }\frac{l!}{n^{l}} \bigl\vert \bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}; i_{1}, i_{2},\ldots, i_{l} \leq n, F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}(\epsilon )> 1-\delta \bigr\} \bigr\vert . \end{aligned}$$

Since \(\{x_{n}\}\) statistically converges to x, the left-hand side of the previous inequality equals 1, and therefore so does the right-hand side. Consequently,

$$ \delta _{l}\bigl(\bigl\{ (i_{1}, i_{2},\ldots, i_{l})\in \mathbb{N}^{l}: F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}(\epsilon )\leq 1- \delta \bigr\} \bigr)=0, $$

which shows that \(\{x_{n}\}\) is statistically Cauchy. □

Definition 3.14

Let \((X, F,T)\) be a PGM-space. If every statistically Cauchy sequence is statistically convergent, then \((X, F,T)\) is said to be statistically complete.

Corollary 3.15

Every statistically complete PGM-space is complete.

Proof

Let \((X,F,T)\) be a statistically complete PGM-space and suppose that \(\{x_{n}\}\) is a Cauchy sequence in \((X,F,T)\); arguing as in Theorem 3.8, it is then statistically Cauchy. Since X is statistically complete, \(\{x_{n}\}\) is statistically convergent. By Corollary 3.12, there is a subsequence \(\{x_{n_{k}}\}\) of \(\{x_{n}\}\) that converges to a point \(x\in X\). By the continuity of T, for \(0<\delta <1\), there exist \(0<\delta _{1}, \delta _{2}, \delta _{3}, \delta _{4}<1\) such that

$$ \textstyle\begin{cases} T(1-\delta _{1}, 1-\delta _{2})>1-\delta , \\ T(1-\delta _{3}, 1-\delta _{4})>1-\delta _{1}. \end{cases} $$

Let \(\delta _{5}:=\min \{\delta _{2}, \delta _{3}\}\); then we have

$$ T \bigl(T(1-\delta _{5}, 1-\delta _{4}), 1-\delta _{5} \bigr)>1- \delta . $$

For \(\epsilon >0\), since \(\{x_{n}\}\) is Cauchy, there exist \(N_{1}\in \mathbb{N}\) and \(x_{i_{\epsilon }}\in \{x_{n}\}\) such that, for all \(i_{1}, i_{2},\ldots, i_{l}\geq N_{1}\),

$$ F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}\biggl( \frac{\epsilon }{4}\biggr)> 1-\delta _{5}, $$

and since \(x_{n_{k}}\longrightarrow x\), there exists \(N_{2}\geq N_{1}\) such that, for \(i_{n_{1}}, i_{n_{2}},\ldots, i_{n_{l}}\geq N_{2}\),

$$ F_{(x_{i_{n_{1}}}, x_{i_{n_{2}}},\ldots, x_{i_{n_{l}}}, x)}\biggl( \frac{\epsilon }{4}\biggr)> 1-\delta _{4}. $$

For \(i_{1}, i_{2},\ldots, i_{l}, i_{n_{1}}, i_{n_{2}},\ldots, i_{n_{l}}\geq N_{2}\), we have

$$\begin{aligned} &F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x)}(\epsilon ) \\ &\quad \geq T \biggl(F_{(x_{i_{\epsilon }}, x , x,\ldots, x)}\biggl( \frac{\epsilon }{2}\biggr), F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}\biggl(\frac{\epsilon }{2}\biggr) \biggr) \\ &\quad \geq T \biggl(T \biggl(F_{(x_{i_{\epsilon }}, x_{i_{n_{1}}}, x_{i_{n_{1}}},\ldots, x_{i_{n_{1}}})}\biggl(\frac{\epsilon }{4}\biggr), F_{(x_{i_{n_{1}}}, x, x,\ldots, x)}\biggl(\frac{\epsilon }{4}\biggr) \biggr), F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}\biggl( \frac{\epsilon }{2}\biggr) \biggr) \\ &\quad \geq T \biggl(T \biggl(F_{(x_{i_{\epsilon }}, x_{i_{n_{1}}}, x_{i_{n_{2}}},\ldots, x_{i_{n_{l}}})}\biggl(\frac{\epsilon }{4}\biggr), F_{(x_{i_{n_{1}}}, x_{i_{n_{2}}},\ldots, x_{i_{n_{l}}}, x)}\biggl(\frac{\epsilon }{4}\biggr) \biggr), F_{(x_{i_{1}}, x_{i_{2}},\ldots, x_{i_{l}}, x_{i_{\epsilon }})}\biggl( \frac{\epsilon }{4}\biggr) \biggr) \\ &\quad \geq T \bigl(T(1-\delta _{5}, 1-\delta _{4}), 1-\delta _{5} \bigr) \\ &\quad >1-\delta . \end{aligned}$$

The first two inequalities follow from part (iv) of Definition 3.3 (together with parts (ii) and (iii)), and the third from part (ii) of Definition 3.3 and the nondecreasing property of F. Hence \(\{x_{n}\}\) converges to x, and therefore \((X,F,T)\) is complete. □