## 1 Introduction

### Definition 1.1

(Uniformly convex space [13])

A normed linear space E is called uniformly convex if for any $$\epsilon\in(0,2]$$ there exists a $$\delta=\delta(\epsilon)> 0$$ such that if $$x,y \in E$$ with $$\|x\|=1$$, $$\|y\|=1$$ and $$\|x-y\| \geq\epsilon$$ then $$\|\frac{1}{2} (x+y)\| \leq1-\delta$$.

### Definition 1.2

(Modulus of convexity [13])

Let E be a normed linear space with $$\dim E \ge2$$. The modulus of convexity of E is the function $$\delta_{E}:(0,2]\rightarrow [0,1]$$ defined by

$$\delta_{E}(\epsilon):=\inf \biggl\{ 1-\biggl\Vert \frac{x+y}{2} \biggr\Vert :\|x\| \le1, \|y\| \le1; \|x-y\| \ge\epsilon \biggr\} .$$
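As a purely numerical illustration (ours, not part of the development above), the infimum defining $$\delta_{E}$$ can be estimated on a grid for $$E = \mathbb{R}^{2}$$ with the Euclidean norm, where the closed form $$\delta(\epsilon) = 1 - \sqrt{1 - \epsilon^{2}/4}$$ is known; the function names below are our own, and we sample the unit sphere, which suffices in the Euclidean case.

```python
import math

def delta_numeric(eps, n=100000):
    """Grid estimate of the modulus of convexity of R^2 with the
    Euclidean norm; by rotational symmetry we may fix x = (1, 0)."""
    best = 1.0  # running infimum of 1 - ||(x + y)/2||
    for k in range(n):
        t = 2.0 * math.pi * k / n
        y = (math.cos(t), math.sin(t))
        if math.dist((1.0, 0.0), y) >= eps:
            best = min(best, 1.0 - math.hypot((1.0 + y[0]) / 2.0, y[1] / 2.0))
    return best

def delta_exact(eps):
    """Closed form for any Hilbert space: 1 - sqrt(1 - eps^2 / 4)."""
    return 1.0 - math.sqrt(1.0 - eps * eps / 4.0)
```

The grid estimate agrees with the closed form to within the grid resolution, and it is nondecreasing in ε, as a modulus of convexity must be.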

### Definition 1.3

(Uniformly smooth space [13])

A normed linear space E is said to be uniformly smooth if whenever given $$\epsilon>0$$ there exists $$\delta> 0$$ such that if $$\|x\|=1$$ and $$\|y\| \leq\delta$$ then

$$\|x+y\|+\|x-y\| < 2+\epsilon\|y\|.$$

### Definition 1.4

(Modulus of smoothness [13])

Let E be a normed linear space with $$\dim E \geq2$$. The modulus of smoothness of E is the function $$\rho_{E}:[0,\infty)\rightarrow [0,\infty)$$ defined by

\begin{aligned} \rho_{E}(\tau) :=& \sup \biggl\{ \frac{\|x+y\| +\|x-y\|}{2}-1 : \|x\| =1;\|y\| =\tau \biggr\} \\ =& \sup \biggl\{ \frac{\|x+\tau y\| +\|x-\tau y\|}{2}-1 : \|x\| =1=\|y\| \biggr\} . \end{aligned}
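The second supremum formula lends itself to the same kind of numerical sketch (ours, with hypothetical names): for $$E = \mathbb{R}^{2}$$ with the Euclidean norm the closed form is $$\rho(\tau) = \sqrt{1+\tau^{2}} - 1$$, and by symmetry we may fix $$x = (1,0)$$.

```python
import math

def rho_numeric(tau, n=100000):
    """Grid estimate of the modulus of smoothness of R^2 with the
    Euclidean norm, via the second supremum formula with x = (1, 0)."""
    best = 0.0
    for k in range(n):
        t = 2.0 * math.pi * k / n
        y = (math.cos(t), math.sin(t))
        p = math.hypot(1.0 + tau * y[0], tau * y[1])  # ||x + tau y||
        m = math.hypot(1.0 - tau * y[0], tau * y[1])  # ||x - tau y||
        best = max(best, (p + m) / 2.0 - 1.0)
    return best

def rho_exact(tau):
    """Closed form rho(tau) = sqrt(1 + tau^2) - 1 in any Hilbert space."""
    return math.sqrt(1.0 + tau * tau) - 1.0
```

Note that $$\rho(\tau)/\tau \to 0$$ as $$\tau \to 0^{+}$$ here, which is exactly the characterization of uniform smoothness used later in the paper.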

Let K be a nonempty convex subset of a real normed linear space E. For strict contractions mapping K into itself, with a fixed point in K, the celebrated Picard iteration method has been successfully employed to approximate such fixed points. If, however, the domain of a mapping is a proper subset of E (and this is the case in several applications), and it maps K into E, this iteration method may not be well defined. In this situation, for Hilbert spaces and uniformly convex uniformly smooth Banach spaces, the problem has been overcome by introducing the metric projection into the recursion formulas (see, for example, [46]).

### Definition 1.5

(Metric projection [1, 4, 5, 7])

Let E be a real uniformly convex and uniformly smooth Banach space, K be a nonempty proper subset of E. The operator $$P_{K}:E\rightarrow K$$ is called a metric projection operator if it assigns to each $$x\in E$$ its nearest point $$\bar{x} \in K$$, which is the solution of the minimization problem

$$P_{K}x=\bar{x};\quad \bar{x}\mbox{: }\Vert x-\bar{x}\Vert = \inf_{\xi\in K}\|x-\xi\|.$$
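For a concrete instance of $$P_{K}$$ (our illustration, with hypothetical helper names), take $$E = \mathbb{R}^{2}$$ with the Euclidean norm and K the closed unit disc; the projection of an exterior point is then radial.

```python
import math

def project_to_disc(x):
    """Metric projection of x onto K = {xi in R^2 : ||xi|| <= 1}
    (Euclidean norm): points outside the disc project radially."""
    r = math.hypot(x[0], x[1])
    return x if r <= 1.0 else (x[0] / r, x[1] / r)

x = (3.0, 4.0)                 # ||x|| = 5, so x lies outside K
xbar = project_to_disc(x)      # nearest point of K to x
d = math.dist(x, xbar)         # realizes inf over K, here 5 - 1 = 4
```

Points of K already satisfy $$P_{K}x = x$$, and no other point of K is closer to x than $$\bar{x}$$, in line with the minimization problem above.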

It is our purpose in this paper to extend the notion of uniform convexity and uniform smoothness of Banach spaces to countably normed spaces. Moreover, by extending some theorems to the case of uniformly convex uniformly smooth countably normed spaces, we prove the existence and uniqueness of nearest points in these spaces. Our theorems generalize some results of [3, 8, 9].

## 2 Preliminaries

### Definition 2.1

(Countably normed space [10, 11])

Two norms $$\|\cdot\|_{1}$$ and $$\|\cdot\|_{2}$$ in a linear space E are said to be compatible if, whenever a sequence $$\{x_{n}\}$$ in E is Cauchy with respect to both norms and converges to a limit $$x \in E$$ with respect to one of them, it also converges to the same limit x with respect to the other norm. A linear space E equipped with a countable system of compatible norms $$\|\cdot\|_{n}$$ is said to be countably normed. One can prove that every countably normed linear space becomes a topological linear space when equipped with the topology generated by the neighborhood base consisting of all sets of the form

$$U_{r, \epsilon} = \bigl\{ x : x \in E ; \Vert x\Vert _{1} < \epsilon, \ldots, \|x\|_{r} < \epsilon\bigr\}$$

for some number $$\epsilon > 0$$ and positive integer r.

### Remark 2.2

([11])

By considering the new norms $$|\!|\!|x |\!|\!|_{n} = \max_{i=1}^{n}\|x\|_{i}$$ we may assume that the sequence of norms $$\{ \|\cdot\|_{n} ; n = 1, 2, \ldots\}$$ is increasing, i.e.,

$$\|x\|_{1} \leq\|x\|_{2} \leq\cdots\leq\|x\|_{n} \leq\cdots,\quad \forall x\in E.$$

If E is a countably normed space, the completion of E in the norm $$\|\cdot\|_{n}$$ is denoted by $$E_{n}$$. Then, by definition, $$E_{n}$$ is a Banach space. Also in the light of Remark 2.2, we can assume that

$$E\subset\cdots\subset E_{n+1} \subset E_{n} \subset\cdots \subset E_{1}.$$

### Proposition 2.3

([10])

Let E be a countably normed space. Then E is complete if and only if $$E = \bigcap_{n = 1}^{\infty} E_{n}$$.

Each Banach space $$E_{n}$$ has a dual, which is itself a Banach space, denoted by $$E_{n}^{*}$$.

### Proposition 2.4

([10])

The dual of a countably normed space E is given by $$E^{*} = \bigcup_{n = 1}^{\infty} E_{n}^{*}$$ and we have the following inclusions:

$$E_{1}^{*}\subset\cdots\subset E_{n}^{*} \subset E_{n+1}^{*} \subset\cdots \subset E^{*}.$$

Moreover, for $$f\in E_{n}^{*}$$ we have $$\|f\|_{n} \geq\|f\|_{n+1}$$.

### Remark 2.5

A countably normed space is metrizable and its metric d can be defined by $$d(x, y) = \sum_{i=1}^{\infty} \frac{1}{2^{i}} \frac{\| x - y\|_{i}}{1 + \|x - y\|_{i}}$$.
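A short sketch (ours, assuming a finite list of norm functions and truncating the series) shows how this metric is computed; the omitted tail is bounded by $$\sum_{i > i_{0}} 2^{-i}$$, since $$t/(1+t) < 1$$.

```python
def countable_metric(x, y, norms):
    """d(x, y) = sum_i 2^{-i} ||x - y||_i / (1 + ||x - y||_i), truncated
    to the given finite list of norm functions (norms[0] is ||.||_1)."""
    diff = [a - b for a, b in zip(x, y)]
    total = 0.0
    for i, norm_i in enumerate(norms, start=1):
        v = norm_i(diff)
        total += (v / (1.0 + v)) / 2 ** i
    return total

# An increasing family of compatible norms on R^3: ||x||_i = max_{j <= i} |x_j|
norms = [lambda v, i=i: max(abs(c) for c in v[:i]) for i in (1, 2, 3)]
```

The metric is symmetric and vanishes exactly on the diagonal, as required, and each term is at most $$2^{-i}$$ regardless of how large $$\|x - y\|_{i}$$ is.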

### Example 2.6

An example of a countably normed space is the space of functions analytic in the open unit disc $$|z| < 1$$, with the topology of uniform convergence on compact subsets of the disc and with the collection of norms $$\|x(z)\|_{n} = \max_{|z|\leq1 - \frac{1}{n}} |x(z)|$$.

### Example 2.7

For $$1 < p < \infty$$, the space $$\ell^{p+0} := \bigcap_{q > p} \ell^{q}$$ is a countably normed space. In fact, one can easily see that $$\ell^{p+0} = \bigcap_{n} \ell^{p_{n}}$$ for any choice of a monotonic decreasing sequence $$\{p_{n}\}$$ converging to p. Using Proposition 2.3 and the fact that $$\ell^{p_{n}}$$ is Banach for every n, it is clear now that the countably normed space $$\ell^{p+0}$$ is complete.
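A quick numerical check (our code) of the norm monotonicity underlying this example: for a fixed sequence, $$\|x\|_{q}$$ is nonincreasing in q, so for $$p_{n} \downarrow p$$ the norms $$\|\cdot\|_{p_{n}}$$ increase with n, consistent with the increasing arrangement of Remark 2.2.

```python
def lp_norm(x, p):
    """ell^p norm of a finite sequence (which lies in every ell^q)."""
    return sum(abs(c) ** p for c in x) ** (1.0 / p)

x = [1.0, 0.5, 0.25, 0.125]
# p_n = 2 + 1/n decreases to p = 2, so the norms should increase with n
norms_pn = [lp_norm(x, 2.0 + 1.0 / n) for n in range(1, 8)]
```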

## 3 Main results

In this section, we give new definitions and prove our main theorems.

### Definition 3.1

A countably normed space E is said to be uniformly convex if $$(E_{i}, \|\cdot\|_{i})$$ is uniformly convex for all i, i.e., if for each i and every $$\epsilon\in(0,2]$$ there exists $$\delta_{i}(\epsilon) >0$$ such that if $$x, y \in E_{i}$$ with $$\|x\|_{i} = 1 = \|y\|_{i}$$ and $$\|x - y\|_{i} \ge\epsilon$$, then $$1 - \|\frac{x + y}{2}\|_{i} \ge \delta_{i}(\epsilon)$$.

If $$\inf_{i}\delta_{i}(\epsilon) > 0$$ for each ε, then one may call the space E equi-uniformly convex.

### Definition 3.2

A countably normed space E is said to be uniformly smooth if $$(E_{i}, \|\cdot\|_{i})$$ is uniformly smooth for all i, i.e., if for each i whenever given $$\epsilon>0$$ there exists $$\delta_{i} > 0$$ such that if $$\|x\|_{i} = 1$$ and $$\|y\|_{i} \leq\delta_{i}$$ then

$$\|x+y\|_{i} +\|x-y\|_{i} < 2 + \epsilon\|y\|_{i}.$$

### Proposition 3.3

A countably normed linear space E is uniformly convex if and only if for each i we have $$\delta_{E_{i}} (\epsilon) > 0$$ for all $$\epsilon\in(0,2]$$.

### Proof

Assume that $$(E_{i}, \|\cdot\|_{i})$$ is uniformly convex for all i. Then, for each i, given $$\epsilon> 0$$ there exists $$\delta_{i} > 0$$ such that $$\delta_{i} \le1 - \| \frac{x + y}{2} \|_{i}$$ for every x and y in $$E_{i}$$ such that $$\|x\|_{i} = 1 = \|y\|_{i}$$ and $$\|x - y\|_{i} \ge\epsilon$$. Therefore $$\delta_{E_{i}} (\epsilon) \ge\delta_{i} > 0$$ for all i.

Conversely, assume that for each i, $$\delta_{E_{i}} (\epsilon) > 0$$ for all $$\epsilon \in(0,2]$$. Fix $$\epsilon\in(0,2]$$ and take x, y in $$E_{i}$$ with $$\| x\|_{i} = 1 = \|y\|_{i}$$ and $$\|x - y\|_{i} \ge\epsilon$$, then $$0 < \delta_{E_{i}} (\epsilon) \le1 - \| \frac{x + y}{2} \|_{i}$$ and therefore $$\| \frac{x + y}{2} \|_{i} \le1 - \delta_{i}$$ with $$\delta_{i} = \delta_{E_{i}} (\epsilon)$$ which does not depend on x or y. Then $$(E_{i}, \|\cdot\|_{i})$$ is uniformly convex for all i and hence the countably normed space E is uniformly convex. □

In Proposition 3.3 we characterized uniform convexity of countably normed spaces; now, in Theorem 3.4, we give the analogous characterization of uniform smoothness.

### Theorem 3.4

A countably normed space E is uniformly smooth if and only if

$$\lim_{t\to0^{+}} \frac{\rho_{E_{i}}(t)}{t} = 0,\quad \forall i.$$

### Proof

Assume that $$(E_{i}, \|\cdot\|_{i})$$ is uniformly smooth for each i. Given $$\epsilon> 0$$, there exists $$\delta_{i} > 0$$ such that $$\frac{\|x + y\|_{i} + \|x - y\|_{i}}{2} - 1 < \frac{\epsilon}{2} \|y\| _{i}$$ for every x, y in $$E_{i}$$ with $$\|x\|_{i} = 1$$ and $$\|y\|_{i} \le \delta _{i}$$. This implies that for each i we have $$\rho_{E_{i}} (t) \le \frac {\epsilon}{2} t$$ for every $$0 < t \le \delta_{i}$$, and hence $$\lim_{t\to0^{+}} \frac{\rho_{E_{i}}(t)}{t} = 0$$ for all i.

Conversely, for each i, given $$\epsilon> 0$$ suppose that there exists $$\delta_{i} > 0$$ such that $$\frac{\rho_{E_{i}} (t)}{t} < \frac {\epsilon}{2}$$ for every $$0 < t < \delta_{i}$$. Let x, y be in $$E_{i}$$ such that $$\|x\|_{i} = 1$$ and $$0 < \|y\|_{i} < \delta_{i}$$. Then, with $$t = \|y\|_{i}$$, we have $$\frac{\|x+y\|_{i} +\|x-y\|_{i}}{2} - 1 \le \rho_{E_{i}}(t) < \frac{\epsilon}{2} t$$, i.e., $$\|x+y\|_{i} +\|x-y\|_{i} < 2 + \epsilon\|y\|_{i}$$. Then $$(E_{i}, \|\cdot\|_{i})$$ is uniformly smooth for all i and hence the countably normed space E is uniformly smooth. □

Now we prove one of the Lindenstrauss duality formulas, which links the modulus of smoothness of $$E_{i}$$ to the modulus of convexity of $$E_{i}^{*}$$.

### Proposition 3.5

Let E be a countably normed space and, for each i, let $$E_{i}$$ be the completion of E in the norm $$\|\cdot\|_{i}$$. Then for each i and every $$\tau> 0$$ we have

$$\rho_{E_{i}}(\tau) = \sup \biggl\{ \frac{\tau\epsilon}{2} - \delta _{E_{i}^{*}}(\epsilon) : 0 < \epsilon\leq2 \biggr\} .$$

### Proof

Let $$\tau> 0$$ and let $$x^{*}, y^{*} \in E_{i}^{*}$$ with $$\|x^{*}\| _{i} = \|y^{*}\|_{i} = 1$$. For any $$\eta> 0$$, from the definition of $$\|\cdot\| _{i}$$ in $$E_{i}^{*}$$ there exist $$x_{0}, y_{0} \in E_{i}$$ with $$\|x_{0}\|_{i} = \|y_{0}\| _{i} = 1$$ such that

$$\bigl\Vert x^{*} + y^{*}\bigr\Vert _{i} - \eta\leq\bigl\langle x_{0}, x^{*} + y^{*} \bigr\rangle _{i},\qquad \bigl\Vert x^{*} - y^{*}\bigr\Vert _{i} - \eta\leq\bigl\langle y_{0}, x^{*} - y^{*} \bigr\rangle _{i}.$$

Using these two inequalities together with the fact that in Banach spaces we have $$\|x\|_{i} = \sup\{ |\langle x, x^{*} \rangle_{i}| : \|x^{*}\| _{i} = 1 \}$$, we have

\begin{aligned}& \bigl\Vert x^{*} + y^{*}\bigr\Vert _{i} + \tau\bigl\Vert x^{*} - y^{*} \bigr\Vert _{i} - 2 \\& \quad \leq \bigl\langle x_{0}, x^{*} + y^{*} \bigr\rangle _{i} + \tau\bigl\langle y_{0}, x^{*} - y^{*} \bigr\rangle _{i} - 2 + \eta (1 + \tau) \\& \quad = \bigl\langle x_{0} + \tau y_{0}, x^{*} \bigr\rangle _{i} + \bigl\langle x_{0} - \tau y_{0}, y^{*} \bigr\rangle _{i} - 2 + \eta (1 + \tau) \\& \quad \leq \Vert x_{0} + \tau y_{0}\Vert _{i} + \Vert x_{0} - \tau y_{0}\Vert _{i} - 2 + \eta (1 + \tau) \\& \quad \leq \sup \bigl\{ \Vert x + \tau y\Vert _{i} + \Vert x - \tau y\Vert _{i} - 2 : \Vert x\Vert _{i} = \Vert y \Vert _{i} = 1 \bigr\} \\& \qquad {}+ \eta (1 + \tau) \\& \quad = 2 \rho_{E_{i}}(\tau) + \eta (1 + \tau). \end{aligned}

If $$0 < \epsilon\leq\|x^{*} - y^{*}\|_{i} \leq2$$, then we have

$$\frac{\tau \epsilon}{2} - \rho_{E_{i}}(\tau) - \frac{\eta}{2} (1 + \tau) \leq1 - \biggl\Vert \frac{x^{*} + y^{*}}{2} \biggr\Vert _{i},$$

which implies that

$$\frac{\tau \epsilon}{2} - \rho_{E_{i}}(\tau) - \frac{\eta}{2} (1 + \tau) \leq\delta_{E_{i}^{*}}(\epsilon).$$

Since η is arbitrary we conclude that

\begin{aligned}& \frac{\tau \epsilon}{2} - \rho_{E_{i}}(\tau) \leq\delta _{E_{i}^{*}}( \epsilon),\quad \forall \epsilon\in(0, 2] \\& \quad \implies \sup \biggl\{ \frac{\tau \epsilon}{2} - \delta_{E_{i}^{*}}( \epsilon ) : \epsilon\in(0, 2] \biggr\} \leq\rho_{E_{i}}(\tau). \end{aligned}

On the other hand, let x, y be in $$E_{i}$$ with $$\|x\|_{i} = \|y\|_{i} = 1$$ and let $$\tau> 0$$. By the Hahn-Banach theorem there exist $$x_{0}^{*}, y_{0}^{*} \in E_{i}^{*}$$ with $$\|x_{0}^{*}\|_{i} = \|y_{0}^{*}\|_{i} = 1$$ and such that

$$\bigl\langle x + \tau y, x_{0}^{*} \bigr\rangle _{i} = \|x + \tau y\|_{i},\qquad \bigl\langle x - \tau y, y_{0}^{*} \bigr\rangle _{i} = \|x - \tau y\|_{i}.$$

Then

\begin{aligned} \Vert x + \tau y\Vert _{i} + \Vert x - \tau y\Vert _{i} - 2 =& \bigl\langle x + \tau y, x_{0}^{*}\bigr\rangle _{i} + \bigl\langle x - \tau y, y_{0}^{*}\bigr\rangle _{i} - 2 \\ =& \bigl\langle x , x_{0}^{*} + y_{0}^{*} \bigr\rangle _{i} + \tau \bigl\langle y , x_{0}^{*} - y_{0}^{*}\bigr\rangle _{i} - 2 \\ \leq& \bigl\Vert x_{0}^{*} + y_{0}^{*}\bigr\Vert _{i} + \tau \bigl\vert \bigl\langle y , x_{0}^{*} - y_{0}^{*}\bigr\rangle _{i} \bigr\vert - 2. \end{aligned}

Hence, if we define $$\epsilon_{0} := |\langle y , x_{0}^{*} - y_{0}^{*}\rangle _{i} |$$, then $$\epsilon_{0} \leq\|x_{0}^{*} - y_{0}^{*}\|_{i} \leq2$$; we may assume $$\epsilon_{0} > 0$$, since otherwise the left-hand side below is nonpositive and the estimate is trivial. Then

\begin{aligned} \frac{\|x + \tau y\|_{i} + \|x - \tau y\|_{i}}{2} - 1 \leq& \frac{\| x_{0}^{*} + y_{0}^{*}\|_{i} + \tau |\langle y , x_{0}^{*} - y_{0}^{*}\rangle_{i} |}{2} - 1 \\ =& \frac{\tau \epsilon_{0}}{2} - \biggl( 1 - \frac{\|x_{0}^{*} + y_{0}^{*}\| _{i}}{2} \biggr) \\ \leq& \frac{\tau \epsilon_{0}}{2} - \delta_{E_{i}^{*}}(\epsilon_{0}) \\ \leq& \sup \biggl\{ \frac{\tau \epsilon}{2} - \delta_{E_{i}^{*}}(\epsilon ) : 0 < \epsilon\leq2 \biggr\} . \end{aligned}

Therefore,

$$\rho_{E_{i}}(\tau) \leq\sup \biggl\{ \frac{\tau \epsilon}{2} - \delta _{E_{i}^{*}}(\epsilon) : 0 < \epsilon\leq2 \biggr\} .$$

□
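Since a Hilbert space satisfies $$\rho(\tau) = \sqrt{1+\tau^{2}} - 1$$ and $$\delta(\epsilon) = 1 - \sqrt{1 - \epsilon^{2}/4}$$, and coincides with its own dual, the duality formula of Proposition 3.5 can be verified numerically in that special case. The following grid sketch is our illustration, with names of our choosing.

```python
import math

def rho_hilbert(tau):
    """rho(tau) = sqrt(1 + tau^2) - 1 in a Hilbert space."""
    return math.sqrt(1.0 + tau * tau) - 1.0

def delta_hilbert(eps):
    """delta(eps) = 1 - sqrt(1 - eps^2/4) in a Hilbert space."""
    return 1.0 - math.sqrt(1.0 - eps * eps / 4.0)

def duality_rhs(tau, n=100000):
    """sup over eps in (0, 2] of tau*eps/2 - delta(eps), taken on a grid."""
    return max(tau * eps / 2.0 - delta_hilbert(eps)
               for eps in (2.0 * k / n for k in range(1, n + 1)))
```

On the grid the supremum matches $$\rho(\tau)$$ to high accuracy, and the grid value never exceeds it, as it is a supremum over a subset of (0, 2].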

The following two theorems establish the duality between uniform convexity and uniform smoothness for countably normed spaces.

### Theorem 3.6

Let E be a countably normed space, then

$$E \textit{ is uniformly smooth } \Longleftrightarrow\ E_{i}^{*} \textit{ is uniformly convex for all } i.$$

### Proof

We will prove both directions by contradiction.

‘⟹’:

Assume that $$(E_{i_{0}}^{*}, \|\cdot\|_{i_{0}})$$ is not uniformly convex for some $$i_{0}$$. Therefore, $$\delta _{E_{i_{0}}^{*}}(\epsilon_{0}) = 0$$ for some $$\epsilon_{0} \in(0, 2]$$. Using Proposition 3.5, we get for any $$\tau > 0$$,

$$0 < \frac{\epsilon_{0}}{2} \leq\frac{\rho_{E_{i_{0}}}(\tau)}{\tau},\quad \mbox{hence } \lim _{\tau\to0^{+}} \frac{\rho_{E_{i_{0}}}(\tau)}{\tau} \neq0,$$

which shows that E is not uniformly smooth.

‘⟸’:

Assume that E is not uniformly smooth, then

$$\exists i_{0} \mbox{:} \quad \lim_{t\to0^{+}} \frac{\rho_{E_{i_{0}}}(t)}{t} \neq0,$$

this means that there exists $$\epsilon> 0$$ such that for every $$\delta > 0$$ we can find $$t_{\delta}$$ with $$0 < t_{\delta} < \delta$$ and $$\rho_{E_{i_{0}}}(t_{\delta}) \geq t_{\delta}\epsilon$$. Consequently, one can choose a sequence $$(\tau_{n})$$ such that $$0 < \tau_{n} < 1$$, $$\tau_{n} \rightarrow 0$$, and $$\rho _{E_{i_{0}}}(\tau_{n}) \geq\epsilon \tau_{n} >\frac{\epsilon}{2} \tau_{n}$$. Using Proposition 3.5, for every n there exists $$\epsilon_{n} \in(0, 2]$$ such that

$$\frac{\epsilon}{2} \tau_{n} \leq\frac{\tau_{n} \epsilon_{n}}{2} - \delta _{E_{i_{0}}^{*}}(\epsilon_{n}),$$

which implies

$$0 < \delta_{E_{i_{0}}^{*}}(\epsilon_{n}) \leq\frac{\tau_{n}}{2} ( \epsilon_{n} - \epsilon),$$

in particular $$\epsilon< \epsilon_{n}$$ and $$\delta_{E_{i_{0}}^{*}}(\epsilon _{n})\rightarrow 0$$. Recalling the fact that $$\delta_{E_{i_{0}}^{*}}$$ is a nondecreasing function, we get $$\delta_{E_{i_{0}}^{*}}(\epsilon) \leq\delta _{E_{i_{0}}^{*}}(\epsilon_{n})\rightarrow 0$$, so $$\delta_{E_{i_{0}}^{*}}(\epsilon) = 0$$. Therefore $$E_{i_{0}}^{*}$$ is not uniformly convex.

□

The proof of the following theorem is easy.

### Theorem 3.7

Let E be a countably normed space, then

$$E \textit{ is uniformly convex } \Longleftrightarrow\ E_{i}^{*} \textit{ is uniformly smooth for all } i.$$

The following example is a direct application of the previous theorems.

### Example 3.8

(Uniformly convex and uniformly smooth countably normed space)

It is well known that the space $$\ell^{2}=\{(x_{n}):x_{n} \in\mathbb{R}, \forall n\in\mathbb{N},\sum_{n=1}^{\infty} |x_{n}|^{2} <\infty\}$$ with the norm $$\|(x_{n})\|_{2}=\sqrt{\sum_{n=1}^{\infty} |x_{n}|^{2}}$$ is a uniformly convex normed space. It is also uniformly smooth, because $$(\ell ^{2})^{*} =\ell^{2}$$.

On $$\ell^{2}$$, we define a countable number of seminorms by $$p_{i, i+1}((x_{n}))=\sqrt{x_{i}^{2}+x_{i+1}^{2}}$$, where $$i=1,3,5,\ldots$$ . We also define a countable number of compatible norms on $$\ell^{2}$$ by $$\| (x_{n})\|_{i}=p_{i, i+1}((x_{n}))+\|(x_{n})\|_{2}$$. Then $$(\ell^{2}, \{\|\cdot\|_{i}, i=1,3,5,\ldots\})$$ is a countably normed space.

Now, it is clear that $$(\ell^{2}, \{\|\cdot\|_{i}, i=1,3,5,\ldots\})$$ is a uniformly smooth (convex) countably normed space, as its completion $$\ell^{2}_{i}=(\ell^{2}, \|\cdot\|_{i})$$ is a uniformly smooth (convex) Banach space for all i.

Moreover, $$\rho_{\ell^{2}_{i}} (t)\leq\rho_{\ell^{2}_{p_{i, i+1}}} (t) + \rho_{\ell^{2}} (t)$$. Now $$\lim_{t\to0^{+}} \frac{\rho_{\ell^{2}}(t)}{t} = 0$$ because $$(\ell^{2}, \|\cdot\|_{2})$$ is a uniformly smooth normed space. Also $$\lim_{t\to0^{+}} \frac{\rho_{\ell^{2}_{p_{i, i+1}}}(t)}{t} = 0$$ because the seminormed space $$(\ell^{2}, p_{i, i+1})$$ behaves like $$\mathbb{R}^{2}$$ with the Euclidean norm, which is uniformly smooth. Therefore,

$$\lim_{t\to0^{+}} \frac{\rho_{\ell^{2}_{i}}(t)}{t} = 0,\quad \forall i.$$
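As a sanity check (our code, mirroring the construction above on finitely supported sequences), each $$\|\cdot\|_{i}$$ is equivalent to $$\|\cdot\|_{2}$$: since $$p_{i,i+1}(x) \le \|x\|_{2}$$, we have $$\|x\|_{2} \le \|x\|_{i} \le 2\|x\|_{2}$$, which is consistent with the compatibility of these norms.

```python
import math

def l2_norm(x):
    """The ell^2 norm of a finite sequence."""
    return math.sqrt(sum(c * c for c in x))

def norm_i(x, i):
    """||x||_i = p_{i,i+1}(x) + ||x||_2, where p_{i,i+1}(x) = sqrt(x_i^2 + x_{i+1}^2);
    x is a finite list, indexed from 1 as in the text."""
    return math.hypot(x[i - 1], x[i]) + l2_norm(x)

x = [1.0, -2.0, 0.5, 3.0]
vals = [norm_i(x, i) for i in (1, 3)]   # the odd indices used in the example
```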

### Proposition 3.9

Let E be a countably normed space, $$\{x_{n}\}$$ be a sequence in E. Then $$\{x_{n}\}$$ is a Cauchy sequence in $$(E, d)$$ if and only if it is a Cauchy sequence in $$(E, \|\cdot\|_{i})$$ for all i. We have

$$d(x_{n}, x_{m}) = \sum_{i=1}^{\infty} \frac{1}{2^{i}} \frac{\|x_{n} - x_{m}\| _{i}}{1 + \|x_{n} - x_{m}\|_{i}}\rightarrow 0 \Longleftrightarrow \|x_{n} - x_{m}\|_{i} \rightarrow 0,\quad \forall i.$$

### Proof

The first direction is trivial.

Conversely, assume that $$\|x_{n} - x_{m}\|_{i} \rightarrow 0$$, ∀i. Then, for each i, $$\forall \epsilon> 0$$, $$\exists n_{i}$$: $$n, m > n_{i}$$ implies $$\|x_{n} - x_{m}\|_{i} < \frac{\epsilon}{2}$$.

$$\forall \epsilon> 0$$, $$\exists i_{0}$$ such that $$\sum_{i = i_{0} + 1}^{\infty} \frac{1}{2^{i}} < \frac{\epsilon}{2}$$. Let $$n_{0} = \max_{i = 1}^{i_{0}} n_{i}$$, then $$\forall \epsilon>0$$, $$\exists n_{0}$$ such that $$\|x_{n} - x_{m}\|_{i} < \frac{\epsilon}{2}$$, $$\forall i \leq i_{0}$$, $$\forall n,m \geq n_{0}$$. Therefore,

$$n,m \ge n_{0} \implies d(x_{n}, x_{m}) < \sum_{i=1}^{i_{0}} \frac{1}{2^{i}} \frac {\|x_{n} - x_{m}\|_{i}}{1 + \|x_{n} - x_{m}\|_{i}} + \frac{\epsilon}{2} < \frac {\epsilon}{2} + \frac{\epsilon}{2}.$$

□

In the following theorem, we establish one of the most important geometric properties of uniformly convex countably normed spaces: the metric projection point $$\bar{x}$$, the solution of the minimization problem, is well defined; that is, it exists and is unique with respect to all the norms.

### Theorem 3.10

Let E be a real uniformly convex complete countably normed space, K be a nonempty convex proper subset of E such that K is closed in each $$E_{i}$$. Then the metric projection is well defined on K, i.e.,

$$\forall x \in E\setminus K, \exists! \bar{x}\in K \mbox{: } \|x-\bar{x} \|_{i} = \inf_{\xi\in K}\|x-\xi \|_{i},\quad \forall i.$$

### Claim

Let E be a real uniformly convex Banach space and let $$\{ x_{n}\}$$ be a sequence in E such that (a) $$\lim_{n\to\infty} \|x_{n}\| = 1$$ and (b) $$\lim_{n, m\to\infty} \|x_{n} + x_{m}\| = 2$$. Then $$\{x_{n}\}$$ is a Cauchy sequence in $$(E, \|\cdot\|)$$.

### Proof of the claim

Suppose contrarily that the sequence $$\{x_{n}\}$$ is not a Cauchy sequence in $$(E, \|\cdot\|)$$, then

$$\exists \epsilon_{0} \mbox{:}\quad \forall N \in \mathbb{N}, \exists m, n \in\mathbb{N} \mbox{: } m, n > N \mbox{ while } \|x_{n} - x_{m}\| \ge \epsilon_{0}.$$
(1)

$$(E, \|\cdot\|)$$ being uniformly convex implies that

$$\forall \epsilon> 0, \exists \delta> 0 \mbox{: } \forall x, y \in E \mbox{: } \| x\| \le1, \|y\| \le1, \|x - y\| \ge\epsilon \implies \biggl\Vert \frac{x + y}{2}\biggr\Vert \le1 - \delta.$$
(2)

Let $$1 < M < 2$$ be fixed; then $$\|\frac{x_{n}}{M}\| \rightarrow \frac{1}{M} < 1$$ as $$n\to\infty$$. Then

$$\exists n_{0} \in \mathbb{N} \mbox{: } n,m \ge n_{0} \implies \biggl\Vert \frac{x_{n}}{M}\biggr\Vert \le1 ; \biggl\Vert \frac{x_{m}}{M}\biggr\Vert \le1,$$

now (1) implies $$\|\frac{x_{n} - x_{m}}{M}\| \ge\frac{\epsilon_{0}}{M} > \frac{\epsilon_{0}}{2}$$ for the m, n provided by (1), for every $$N\ge n_{0}$$. Hence, (2) applied with $$\epsilon = \frac{\epsilon_{0}}{2}$$ implies

$$\biggl\Vert \frac{x_{n} + x_{m}}{2M}\biggr\Vert \le1 - \delta \implies \|x_{n} + x_{m}\| \le (2 - 2\delta)M ,\quad \forall N \ge n_{0}.$$
(3)

As $$N \to\infty$$ we have $$n,m \to\infty$$, so by (b) $$\|x_{n} + x_{m}\| \rightarrow2$$; together with (3) this gives $$2 \le(2 - 2\delta)M$$. Since δ does not depend on M, letting $$M \to1^{+}$$ we get $$2 \le2 - 2 \delta$$, which is impossible because $$\delta> 0$$. This contradiction finishes the proof of the claim. □

### Proof of Theorem 3.10

Since $$x\notin K$$ and K is closed for each i, we have $$d_{i} := \inf_{\xi\in K}\|x- \xi\|_{i} > 0$$.

By the definition of the infimum, there exists a sequence $$\xi_{n}^{i} \in K$$ with $$\lim_{n} \|x-\xi_{n}^{i}\|_{i} = d_{i}$$, which implies $$\lim_{n} \|\frac{x-\xi_{n}^{i}}{d_{i}}\|_{i} = 1$$.

$$\mbox{For each } i, u_{n}^{i} := \frac{x-\xi_{n}^{i}}{d_{i}} \in E_{i} \implies \bigl\Vert u_{n}^{i}\bigr\Vert _{i} \rightarrow 1 \quad \mbox{as } n\to\infty.$$
(4)

Since K is convex, then $$\frac{\xi_{n}^{i} + \xi_{m}^{i}}{2} \in K$$, $$\forall \xi_{n}^{i}, \xi_{m}^{i} \in K$$, $$\forall n, m \in\mathbb{N}$$. Then

\begin{aligned}& d_{i} \le\biggl\Vert x - \frac{\xi_{n}^{i} + \xi_{m}^{i}}{2}\biggr\Vert _{i} = \biggl\Vert \frac{x - \xi_{n}^{i}}{2} + \frac{x - \xi_{m}^{i}}{2} \biggr\Vert _{i} \le\biggl\Vert \frac{x - \xi_{n}^{i}}{2}\biggr\Vert _{i} + \biggl\Vert \frac{x - \xi_{m}^{i}}{2}\biggr\Vert _{i} \\& \quad \implies 2 \le\biggl\Vert \frac{x - \xi_{n}^{i}}{d_{i}} + \frac{x - \xi_{m}^{i}}{d_{i}} \biggr\Vert _{i} \le\biggl\Vert \frac{x - \xi_{n}^{i}}{d_{i}}\biggr\Vert _{i} + \biggl\Vert \frac{x - \xi_{m}^{i}}{d_{i}}\biggr\Vert _{i} \\& \quad \implies 2 \le\bigl\Vert u_{n}^{i} + u_{m}^{i}\bigr\Vert _{i} \le\bigl\Vert u_{n}^{i}\bigr\Vert _{i} + \bigl\Vert u_{m}^{i}\bigr\Vert _{i}, \\& \quad \bigl\Vert u_{n}^{i} + u_{m}^{i}\bigr\Vert _{i}\rightarrow 2 \mbox{ as } n, m \to\infty. \end{aligned}
(5)

From (4) and (5), using the claim, for each i, $$\{u_{n}^{i}\}$$ is a Cauchy sequence in $$E_{i}$$, i.e., $$\|u_{n}^{i} - u_{m}^{i}\| _{i}\rightarrow 0$$, which gives $$\|\frac{\xi_{n}^{i} - \xi_{m}^{i}}{d_{i}}\|_{i}\rightarrow 0$$. Therefore, $$\{\xi_{n}^{i}\}$$ is a Cauchy sequence in $$K \subset E \subset E_{i}$$. Since $$E_{i}$$ is complete, for each i there exists $$\xi^{i} \in E_{i}$$ with $$\xi_{n}^{i}\to\xi^{i}$$ in $$\|\cdot\|_{i}$$ as $$n\to\infty$$. Compatibility of the norms implies that the limit $$\xi^{i} \in E_{i}$$ coincides with the limit $$\xi^{j} \in E_{j}$$ for all $$i\neq j$$; that is, for each i, $$\xi_{n}^{i}\to\bar{x} \in E_{i}$$ in $$\|\cdot\|_{i}$$ as $$n\to\infty$$. Since E is complete, Proposition 2.3 gives $$E = \bigcap_{i} E_{i}$$, and therefore $$\bar{x} \in E$$.

Since K is closed in each $$(E_{i}, \|\cdot\|_{i})$$ and $$\xi_{n}^{i} \in K$$ for all n, we have $$\bar{x} \in K$$; hence $$x - \xi_{n}^{i}\rightarrow x - \bar{x}$$ in each $$\|\cdot\|_{i}$$. By the continuity of the norm, $$\|x - \xi_{n}^{i} \|_{i} \rightarrow \|x - \bar{x}\|_{i}$$. Since also $$\|x - \xi_{n}^{i} \|_{i} \rightarrow d_{i}$$, the uniqueness of limits gives $$\|x - \bar{x}\|_{i} = d_{i}$$ with $$\bar{x} \in K$$.

Now we prove the uniqueness: Assume that $$x^{*}\in K$$: $$\|x - x^{*}\|_{i} = d_{i}$$ and $$x^{*} \neq\bar{x}$$. Since $$\frac{\bar{x} + x^{*}}{2} \in K$$, because of the convexity of K,

$$d_{i} \le \biggl\Vert x - \frac{\bar{x} + x^{*}}{2}\biggr\Vert _{i} \le\biggl\Vert \frac{x - \bar{x}}{2}\biggr\Vert _{i} + \biggl\Vert \frac{x - x^{*}}{2}\biggr\Vert _{i} = \frac{d_{i}}{2} + \frac{d_{i}}{2} = d_{i},$$

i.e., $$\|x - \frac{\bar{x} + x^{*}}{2}\|_{i} = d_{i}$$, where $$\bar{x} \neq\frac{\bar{x} + x^{*}}{2} \neq x^{*}$$. This contradicts the fact that in a uniformly convex space, for any two distinct points $$y_{0}, y^{*}$$ of the ball $$B(x_{0}, \delta):= \{x : \|x_{0} - x\| \le \delta\}$$ with $$\|x_{0} - y_{0}\| = \|x_{0} - y^{*}\| = \delta$$, we must have $$\|x_{0} - \frac{y_{0} + y^{*}}{2}\| < \delta$$. □
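Theorem 3.10 really uses uniform convexity: in a norm that is not uniformly convex, nearest points in a closed convex set need not be unique. A small counterexample of ours in $$(\mathbb{R}^{2}, \|\cdot\|_{\infty})$$:

```python
def sup_dist(x, y):
    """Distance in the sup norm on R^2, which is not uniformly convex."""
    return max(abs(a - b) for a, b in zip(x, y))

# K = {(t, 0) : -1 <= t <= 1} is closed and convex; take x = (0, 1).
x = (0.0, 1.0)
segment = [(t / 10.0, 0.0) for t in range(-10, 11)]
dists = [sup_dist(x, xi) for xi in segment]
# every sampled point of K attains the minimal distance 1: no unique x-bar
```

Every point of the segment is at sup-distance exactly 1 from x, so the minimization problem has infinitely many solutions; uniform convexity of each $$\|\cdot\|_{i}$$ is what rules this out in Theorem 3.10.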