1 Introduction and the main results

Recently, the existence of solutions of ordinary differential equations with periodic-integral boundary value conditions has been studied in several articles [16]. In [7] the existence and uniqueness of solutions of second order periodic-integrable boundary value problems are discussed by using the lemma on bilinear forms and Schauder’s fixed point theorem. In [8] Cong et al. obtained the existence and uniqueness of periodic solutions for \((2n+1)\)th order differential equations. In [9] the existence of solutions was established for the following second order differential equation:

$$\bigl(p(t)x'\bigr)'+f(t,x)=0. $$

Based on the above work, the purpose of this paper is to study the following periodic-integrable boundary value problem for second order differential equations (denoted as PIBVP for short):

$$\begin{aligned} \begin{aligned} &x''=f\bigl(t,x,x' \bigr), \\ &x(0)=x({2\pi}),\qquad { \int_{0}^{2\pi}} x(s)\,ds=0, \end{aligned} \end{aligned}$$
(1)

where \(f:[0,2\pi]\times R^{2}\longrightarrow R\) is continuous. We seek solutions of PIBVP (1). To this end, we introduce the following four assumptions.

Assumption \(A_{1}\)

There exist two continuous functions \(a(t)\) and \(b(t)\), and a nonnegative constant \(M_{1}\), such that

$$ 0\leq a(t)\leq \frac{f(t,x,y)}{x}\leq b(t), $$
(2)

for any \((t,x,y)\) with \(\vert x \vert \geq M_{1}\) and \((t,y)\in[0,2\pi]\times R\).

Assumption \(A_{2}\)

There exist two nonnegative constants \(M_{2}\) and \(M_{3}\) such that

$$ \biggl\vert \frac{f(t,x,y)}{y} \biggr\vert \leq M_{2}, $$
(3)

for any \((t,x)\in[0,2\pi]\times R\) whenever \(\vert y \vert \geq M_{3}\).

Assumption \(A_{3}\)

There exist two continuous functions \(\alpha(t)\) and \(\beta(t)\) such that

$$ 0 \leq\alpha(t)\leq{f_{x}(t,x,y)}\leq\beta(t), $$
(4)

for any \((t,x,y)\in[0,2\pi]\times R^{2}\).

Assumption \(A_{4}\)

There exists a positive integer M, such that, for all \(t \in[0,2\pi]\) and \((x,y)\in R^{2} \),

$$ \bigl\vert {f_{y}(t,x,y)} \bigr\vert \leq M. $$
(5)

We can now state our two main results.

Theorem 1

If Assumptions \(A_{1}\) and \(A_{2}\) hold, then the PIBVP (1) has at least one solution.

Theorem 2

If Assumptions \(A_{3}\) and \(A_{4}\) hold, then the PIBVP (1) has a unique solution.

In Section 2, we introduce two lemmas which will be used in later sections. In Section 3, the linear problem is discussed by means of the theory of ordinary differential equations, and the uniqueness of solutions of the linear equation is proved. In Sections 4 and 5, we apply the conclusions of Sections 2 and 3 together with Schauder’s fixed point theorem to prove Theorems 1 and 2. In Section 6, as applications of the main results, we present two examples.

2 Preliminaries

Let us first state some lemmas which will be used in the proof of the main results.

Lemma 1

Let \(x(t)\) be a continuously differentiable function on \([0,2\pi]\), and

$$\begin{aligned} x(0)= x(2\pi),\qquad \int_{0}^{2\pi} x(t)\,dt=0. \end{aligned}$$

Then

$$\begin{aligned} \int_{0}^{2\pi} x^{2}(t)\,dt \leq \int_{0}^{2\pi} {\bigl(x'(t) \bigr)}^{2} \,dt. \end{aligned}$$

Proof

Expand \(x(t)\) as a Fourier series and substitute the expansion into both integrals; the inequality follows by comparing the Fourier coefficients. Thus, the proof is completed. □
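For completeness, here is the computation behind this step (a sketch, under the assumption that \(x'\in L^{2}[0,2\pi]\), so that Parseval’s identity applies). Write

$$x(t)\sim \frac{a_{0}}{2}+\sum_{n=1}^{\infty}\bigl(a_{n}\cos nt+b_{n}\sin nt\bigr),\qquad x'(t)\sim\sum_{n=1}^{\infty}\bigl(nb_{n}\cos nt-na_{n}\sin nt\bigr). $$

The condition \(\int_{0}^{2\pi}x(t)\,dt=0\) forces \(a_{0}=0\), so by Parseval’s identity

$$\int_{0}^{2\pi}x^{2}(t)\,dt=\pi\sum_{n=1}^{\infty}\bigl(a_{n}^{2}+b_{n}^{2}\bigr)\leq\pi\sum_{n=1}^{\infty}n^{2}\bigl(a_{n}^{2}+b_{n}^{2}\bigr)= \int_{0}^{2\pi}\bigl(x'(t)\bigr)^{2}\,dt. $$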

Define

$$ g_{0}(t,x,y)= \textstyle\begin{cases} \frac{f(t,x,y)}{x}, &\mbox{$ \vert x \vert \geq M_{1}$},\\ \frac{f(t,M_{1},y)}{M_{1}}, &\mbox{$ 0< x< M_{1}$}, \\ \frac{-f(t,-M_{1},y)}{M_{1}}, &\mbox{$ - M_{1}< x< 0 $}, \\ \frac{a(t)+b(t)}{2}, &\mbox{$ x=0 $}. \end{cases} $$
(6)

From Assumption \(A_{1}\), we have \(a\leq g_{0}\leq b\) for all \((t,x,y)\in[0,2\pi]\times R^{2}\). Let

$$ h_{0}(t,x,y) = f(t,x,y)-xg_{0}(t,x,y). $$
(7)

Denote

$$\begin{aligned} \mathcal {O}_{0}=\bigl\{ {(t,x,y)\in[0,2\pi] \times R^{2} | \vert x \vert \geq M_{1}}\bigr\} . \end{aligned}$$

It is easy to see

$$ h_{0}(t,x,y)= \textstyle\begin{cases} 0, &\mbox{$(t,x,y)\in\mathcal {O}_{0}$},\\ f(t,x,y)- \frac{x}{M_{1}}f(t,M_{1},y), &\mbox{$ 0< x< M_{1}$}, \\ f(t,x,y)+ \frac{x}{M_{1}}f(t,-M_{1},y), &\mbox{$ - M_{1}< x< 0 $}, \\ {f(t,0,y)},&\mbox{$ x=0 $}. \end{cases} $$
(8)

Likewise, we define

$$ g_{1}(t,x,y)=\textstyle\begin{cases} \frac{h_{0}(t,x,y)}{y}, &\mbox{$ \vert y \vert \geq M_{3}$},\\ \frac{h_{0}(t,x,M_{3})}{M_{3}}, &\mbox{$ 0< y< M_{3}$}, \\ \frac{-h_{0}(t,x,-M_{3})}{M_{3}}, &\mbox{$ - M_{3}< y< 0 $}, \\ {0}, &\mbox{$ y=0 $}. \end{cases} $$
(9)

From Assumption \(A_{2}\) and (9), we have \(\vert g_{1} \vert \leq 2M_{2}\) for all \((t,x,y)\in[0,2\pi]\times R^{2}\). Let

$$ h_{1}(t,x,y) = h_{0}(t,x,y)-yg_{1}(t,x,y). $$
(10)

Denote

$$\begin{aligned} \mathcal{O}_{1}=\bigl\{ {(t,x,y)\in[0,2\pi] \times R^{2}} | \vert y \vert \geq M_{3}\bigr\} . \end{aligned}$$

It is obvious that

$$\begin{aligned} &h_{1}(t,x,y) \\ &\quad =\textstyle\begin{cases} 0, &(t,x,y)\in \mathcal{O}_{0}\cup\mathcal{O}_{1},\\ f(t,x,y)- \frac{x}{M_{1}}f(t,M_{1},y) &0\leq x\leq M_{1}\\ \quad{}-\frac{y}{M_{3}}f(t,x,M_{3})+ \frac{xy}{M_{1}M_{3}}f(t,M_{1},M_{3}), &0\leq y\leq M_{3},\\ f(t,x,y)+ \frac{x}{M_{1}}f(t,-M_{1},y) &-M_{1}\leq x\leq0 \\ \quad{}-\frac{y}{M_{3}}f(t,x,M_{3})- \frac{xy}{M_{1}M_{3}}f(t,-M_{1},M_{3}), &0\leq y\leq M_{3},\\ f(t,x,y)- \frac{x}{M_{1}}f(t,M_{1},y) &0\leq x\leq M_{1}\\ \quad{}+\frac{y}{M_{3}}f(t,x,-M_{3})- \frac{xy}{M_{1}M_{3}}f(t,M_{1},-M_{3}), &-M_{3}\leq y\leq0,\\ f(t,x,y)+ \frac{x}{M_{1}}f(t,-M_{1},y) &-M_{1}\leq x\leq0\\ \quad{}+\frac{y}{M_{3}}f(t,x,-M_{3})+ \frac{xy}{M_{1}M_{3}}f(t,-M_{1},-M_{3}), &-M_{3}\leq y\leq0. \end{cases}\displaystyle \end{aligned}$$
(11)

From (11), we conclude

$$ \bigl\vert h_{1}(t,x,y) \bigr\vert \leq4\sup_{0\leq t\leq2\pi , \vert x \vert \leq M_{1}, \vert y \vert \leq M_{3}} \bigl\vert f(t,x,y) \bigr\vert . $$
(12)

From the above steps, we can deduce the following lemma.

Lemma 2

The function f can be written as \(f(t,x,y) = h_{1}(t,x,y)+xg_{0}(t,x,y)+yg_{1}(t,x,y)\), where \(\vert h_{1}(t,x,y) \vert \leq4\sup\limits_{0\leq t\leq2\pi, \vert x \vert \leq M_{1}, \vert y \vert \leq M_{3}} \vert f(t,x,y) \vert \), \(a\leq g_{0}\leq b\) and \(\vert g_{1} \vert \leq2M_{2}\).
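Remark. The construction (6)-(10) is explicit and can be checked numerically. The following sketch (an illustration only, assuming NumPy is available; it uses the nonlinearity of Example 1 from Section 6 together with the illustrative choices \(M_{1}=M_{3}=1\), \(a(t)=0\) and \(b(t)=t^{2}+2\)) verifies the identity \(f=h_{1}+xg_{0}+yg_{1}\) of Lemma 2 at randomly sampled points.

```python
import numpy as np

# Illustrative check of the decomposition (6)-(10); the nonlinearity f below is
# the one of Example 1, and M1 = M3 = 1, a(t) = 0, b(t) = t^2 + 2 are assumptions
# made only for this numerical experiment.
M1, M3 = 1.0, 1.0

def f(t, x, y):
    return x * (t**2 + np.sin(y**2) + 1) * np.sin(1.0 / x)**2 if x != 0 else 0.0

def g0(t, x, y):                       # definition (6)
    if abs(x) >= M1:
        return f(t, x, y) / x
    if x > 0:
        return f(t, M1, y) / M1
    if x < 0:
        return -f(t, -M1, y) / M1
    return (0.0 + (t**2 + 2.0)) / 2.0  # (a(t) + b(t)) / 2

def h0(t, x, y):                       # definition (7)
    return f(t, x, y) - x * g0(t, x, y)

def g1(t, x, y):                       # definition (9)
    if abs(y) >= M3:
        return h0(t, x, y) / y
    if y > 0:
        return h0(t, x, M3) / M3
    if y < 0:
        return -h0(t, x, -M3) / M3
    return 0.0

def h1(t, x, y):                       # definition (10)
    return h0(t, x, y) - y * g1(t, x, y)

rng = np.random.default_rng(0)
for _ in range(1000):
    t = rng.uniform(0.0, 2.0 * np.pi)
    x, y = rng.uniform(-3.0, 3.0, size=2)
    lhs = f(t, x, y)
    rhs = h1(t, x, y) + x * g0(t, x, y) + y * g1(t, x, y)
    assert abs(lhs - rhs) < 1e-12
print("f = h1 + x*g0 + y*g1 verified at 1000 sample points")
```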

3 Linear equation

Consider the following linear periodic-integrable boundary value problem:

$$\begin{aligned} \begin{aligned} &x''=\hat{g_{1}}(t)x'+ \hat{g_{0}}(t)x+\hat{h_{1}}(t), \\ &x(0)=x({2\pi}),\qquad { \int_{0}^{2\pi}}x(s)\,ds=0, \end{aligned} \end{aligned}$$
(13)

where \(\hat{h_{1}}, \hat{g_{0}}\) and \(\hat{g_{1}}\) satisfy the inequalities in Lemma 2. Furthermore, we consider the corresponding homogeneous linear equation.

Lemma 3

If \(\hat{g_{0}}(t)\geq0\) and \(\hat{g_{0}}(t)\not\equiv0\) on \([0,{2\pi}]\), then the following problem:

$$\begin{aligned} \begin{aligned} &x''=\hat{g_{1}}(t)x'+ \hat{g_{0}}(t)x, \\ &x(0)=x({2\pi}),\qquad { \int_{0}^{2\pi}} x(s)\,ds=0, \end{aligned} \end{aligned}$$
(14)

has only a trivial solution.

Proof

Assume, on the contrary, that there exists a nontrivial solution \(x(t)\), that is, \(x(t)\not\equiv0\). From the assumption on \(\hat{g_{0}}(t)\), we know that \(x(t)\) is not constant. We assert that there exist \(t_{0}\) and \(t_{1}\) with \(t_{0}< t_{1}\) such that

$$x(t)>0\quad \mbox{for all }t\in(t_{0},t_{1}),\qquad x'(t_{0})>0\quad \mbox{and}\quad x'(t_{1})=0. $$

We now prove this assertion. There are two cases:

Case 1. \(x(0)=\eta<0\). Let \(t_{0} =\inf\{{t\in[0,{2\pi}]} : x(t)=0\}\). Then \(x(t_{0})=0\) and, since \(x(t)<0\) on \([0,t_{0})\), we have \(x'(t_{0})\geq0\); in fact \(x'(t_{0})>0\), because \(x(t_{0})=x'(t_{0})=0\) would force \(x\equiv0\) by the uniqueness of solutions of the initial value problem for (14). Define \(t_{\star} =\inf\{{t \in(t_{0},{2\pi}]} : x(t)=0\}\). If \(t_{\star}= t_{0}\), then there exists a sequence \(\{t^{i}\}\) of zeros of \(x\) with \(t^{i}\rightarrow t_{0}\) as \(i\rightarrow\infty\). By Rolle’s theorem, there is a number \(\xi^{i}\) in \([t^{i-1},t^{i}]\) such that \(x'(\xi^{i})= 0\); since \(\xi^{i}\rightarrow t_{0}\), this gives \(x'(t_{0})= 0\), a contradiction. Therefore \(t_{0}\) is the first zero of \(x\) and \(t_{\star}\) is the next one. By the periodic-integral boundary conditions, there exists \(t_{1}\in[t_{0},t_{\star}]\) such that \(x(t)>0\) for \(t\in(t_{0},t_{1})\) and \(x'(t_{1})= 0\).

Case 2. \(x(0)=\eta>0\). By the linearity of the problem, \(-x(t)\) is also a solution of (14), and this case reduces to Case 1.

Multiplying both sides of (14) by \(\operatorname{exp}\{- {\int_{t_{0}}^{t}}{\widehat{g}_{1}(s)\,ds} \}\), noting that \(\bigl(x''-\widehat{g}_{1}(t)x'\bigr)\operatorname{exp}\{-\int_{t_{0}}^{t}\widehat{g}_{1}(s)\,ds\}=\bigl(x'(t)\operatorname{exp}\{-\int_{t_{0}}^{t}\widehat{g}_{1}(s)\,ds\}\bigr)'\), and integrating from \(t_{0}\) to \(t_{1}\), we derive

$$0>-x^{'}(t_{0})= \int_{t_{0}}^{t_{1}}{\widehat{g}_{0}(t)x(t)} \operatorname{exp}\biggl\{ - \int _{t_{0}}^{t}{\widehat{g}_{1}(s)\,ds}\biggr\} \,dt\geq0, $$

which is a contradiction. The proof of Lemma 3 is completed. □

Lemma 4

Problem (13) has a unique solution.

Proof

Let \(x_{1}(t)\) and \(x_{2}(t)\) be two linearly independent solutions of the linear homogeneous equation \(x''=\hat{g_{1}}(t)x'+\hat{g_{0}}(t)x\), so that \(x=c_{1}x_{1}(t)+c_{2}x_{2}(t)\) is its general solution. Then the PIBVP conditions give

$$\begin{aligned} \textstyle\begin{cases} (x_{1}(0)-x_{1}(2\pi)) c_{1}+(x_{2}(0)-x_{2}(2\pi))c_{2}=0,\\ \bigl({\int_{0}^{2\pi}} x_{1}(s)\,ds\bigr)c_{1}+ \bigl({\int_{0}^{2\pi}} x_{2}(s)\,ds\bigr)c_{2}=0. \end{cases}\displaystyle \end{aligned}$$

By Lemma 3, Problem (14) has only the trivial solution, so the above homogeneous system in \((c_{1},c_{2})\) has only the zero solution, which implies

$$ \left \vert \begin{matrix} x_{1}(0)-x_{1}(2\pi)& x_{2}(0)-x_{2}(2\pi)\\ {\int_{0}^{2\pi}} x_{1}(s)\,ds & {\int_{0}^{2\pi}} x_{2}(s)\,ds \end{matrix} \right \vert \neq0. $$
(15)

Let \(x_{*}(t)\) be a particular solution of the equation \(x''=\hat{g_{1}}(t)x'+\hat{g_{0}}(t)x+\hat{h_{1}}(t)\), so that \(x=c_{3}x_{1}(t)+c_{4}x_{2}(t)+x_{*}(t)\) is its general solution. The PIBVP conditions give

$$ \textstyle\begin{cases} (x_{1}(0)-x_{1}(2\pi)) c_{3}+(x_{2}(0)-x_{2}(2\pi)) c_{4}=-x_{*}(0)+x_{*}(2\pi),\\ \bigl({\int_{0}^{2\pi}} x_{1}(s)\,ds\bigr)c_{3}+ \bigl({\int_{0}^{2\pi}} x_{2}(s)\,ds\bigr)c_{4}= -{\int_{0}^{2\pi}}x_{*}(s)\,ds. \end{cases} $$
(16)

Since the determinant (15) is nonzero, system (16) determines \(c_{3}\) and \(c_{4}\) uniquely, and therefore Problem (13) has exactly one solution. The proof is completed. □
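Remark. The construction used in this proof is explicit, so it can also be carried out numerically. The following sketch (an illustration only, assuming SciPy is available; the coefficient functions below are arbitrary placeholders standing in for \(\hat{g_{0}}\geq0\), \(\hat{g_{1}}\) and \(\hat{h_{1}}\)) computes \(x_{1}\), \(x_{2}\) and \(x_{*}\) by numerical integration and then solves system (16) for \(c_{3}\) and \(c_{4}\).

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder coefficients standing in for \hat{g}_0 >= 0, \hat{g}_1 and \hat{h}_1;
# any functions satisfying the bounds of Lemma 2 could be used instead.
g0h = lambda t: 1.0 + 0.5 * np.sin(t)
g1h = lambda t: 0.3 * np.cos(t)
h1h = lambda t: np.sin(2.0 * t)

def integrate(x0, xp0, forcing):
    # State z = (x, x', X) with X(t) = int_0^t x(s) ds, so that X(2*pi)
    # carries the integral appearing in the boundary conditions.
    def rhs(t, z):
        x, y, _ = z
        return [y, g1h(t) * y + g0h(t) * x + forcing(t), x]
    return solve_ivp(rhs, [0.0, 2.0 * np.pi], [x0, xp0, 0.0],
                     rtol=1e-10, atol=1e-12)

zero = lambda t: 0.0
s1 = integrate(1.0, 0.0, zero)   # homogeneous solution x_1
s2 = integrate(0.0, 1.0, zero)   # homogeneous solution x_2
sp = integrate(0.0, 0.0, h1h)    # particular solution x_*

def bc_data(sol, x_at_0):
    x_end, _, X_end = sol.y[:, -1]
    return x_at_0 - x_end, X_end  # (x(0) - x(2*pi), int_0^{2*pi} x(s) ds)

# Coefficient matrix and right-hand side of system (16).
A = np.array([bc_data(s1, 1.0), bc_data(s2, 0.0)]).T
b = -np.array(bc_data(sp, 0.0))
c3, c4 = np.linalg.solve(A, b)    # unique because the determinant (15) is nonzero
print("c3 = %.6f, c4 = %.6f" % (c3, c4))
```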

4 The proof of Theorem 1

In this section, we prove Theorem 1 by means of the Schauder fixed point theorem. Define

$$\mathcal{\mathcal{C}}=\biggl\{ x \Big| x\in C^{1}\bigl([0,2\pi],R \bigr),x(0)=x(2\pi), \int_{0}^{2\pi}x(s)\,ds=0\biggr\} $$

with the norm \(\Vert \bullet \Vert \) defined as follows:

$$\Vert {x} \Vert =\max_{t\in[0,2\pi]} \bigl\vert {x}(t) \bigr\vert +\max_{t\in [0,2\pi]} \bigl\vert {x'}(t) \bigr\vert . $$

It is clear that \(\mathcal{\mathcal{C}}\) is a Banach space.

Applying Lemma 2, for any \({x}\in\mathcal{\mathcal{C}}\), consider

$$\begin{aligned} \begin{aligned} &y''=h_{1} \bigl(t,x,x'\bigr)+yg_{0}\bigl(t,x,x' \bigr)+y'g_{1}\bigl(t,x,x'\bigr), \\ &y(0)=y(2\pi),\quad { \int_{0}^{2\pi}}y(s)\,ds=0. \end{aligned} \end{aligned}$$
(17)

Define the operator \(\overline{P}:\mathcal{C}\rightarrow\mathcal{C}\) as follows: for each \({x}\in \mathcal{C}\), let \(\overline{P}[x](t)=y(t)\) be a solution of (17). Thus the existence of a solution of (1) is equivalent to the existence of a fixed point of \(\overline{P}\) in the Banach space \(\mathcal{C}\). We will prove that \(\overline{P}\) is continuous and compact, and that \(\overline{P}(\mathcal{C})\) is a bounded subset of \(\mathcal{C}\). The proof is divided into three steps.

Step 1: \(\overline{P}\) is continuous. Given any convergent sequence \(\{{x}_{k}\}\subset\mathcal{C}\) with \({x}_{k}\rightarrow{x}_{0}\) as \(k\rightarrow\infty\), let \({y}_{k}=\overline{P}{x}_{k}\); then

$$\begin{aligned} \begin{aligned} &y''_{k}=h_{1} \bigl(t,x_{k},x'_{k}\bigr)+y_{k}g_{0} \bigl(t,x_{k},x'_{k}\bigr)+y'_{k}g_{1} \bigl(t,x_{k},x'_{k}\bigr), \\ &y_{k}(0)=y_{k}(2\pi),\qquad { \int_{0}^{2\pi}}y_{k}(s)\,ds=0. \end{aligned} \end{aligned}$$
(18)

We assert that \(\{{y}_{k}\}\) is a bounded sequence in \(\mathcal{C}\). Otherwise, there exists a subsequence \(\{{y}_{k_{j}}\}\) such that \(\Vert {y}_{k_{j}} \Vert \rightarrow \infty\) as \(j\rightarrow\infty\). Let \(\omega_{k_{j}}= \frac{{y}_{k_{j}}}{ \Vert {y}_{k_{j}} \Vert }\). Then \(\{\omega_{k_{j}}\}\subset \mathcal{C}\) and \(\Vert \omega_{k_{j}} \Vert =1\). By Lemma 2 we have

$$\begin{aligned} \begin{aligned} &\omega''_{k_{j}}= \frac{h_{1}(t,x_{k_{j}},x'_{k_{j}})}{ \Vert {y}_{k_{j}} \Vert }+ \omega_{k_{j}}g_{0}\bigl(t,x_{k_{j}},x'_{k_{j}} \bigr)+\omega '_{k_{j}}g_{1}\bigl(t,x_{k_{j}},x'_{k_{j}} \bigr), \\ &\omega_{k_{j}}(0)=\omega_{k_{j}}(2\pi), \qquad{ \int_{0}^{2\pi}}\omega_{k_{j}}(s)\,ds=0. \end{aligned} \end{aligned}$$
(19)

So \(\Vert \omega''_{k_{j}} \Vert \leq 2M_{2}+\max\limits_{t\in[0,{2\pi}]}{b}(t)+C<\infty\), where C is a constant. Thus \(\{\omega''_{k_{j}}\}\) is bounded. Obviously,

$$\begin{aligned} &\omega'_{k_{j}}(t)=\omega'_{k_{j}}(0)+ \int_{0}^{t}\omega''_{k_{j}}(s)\,ds, \end{aligned}$$
(20)
$$\begin{aligned} &\omega_{k_{j}}(t)=\omega_{k_{j}}(0)+ \int_{0}^{t}\omega'_{k_{j}}(s)\,ds. \end{aligned}$$
(21)

Hence, \(\{\omega'_{k_{j}}\}\) and \(\{\omega_{k_{j}}\}\) are both uniformly bounded and equicontinuous. By the Ascoli-Arzela theorem, \(\{\omega_{k_{j}}\}\) and \(\{ \omega'_{k_{j}}\}\) each contain a uniformly convergent subsequence. For convenience, we keep the same notation for the subsequences, so that

$$\omega_{k_{j}}\stackrel{1}{\longrightarrow}\omega_{0},\qquad \omega '_{k_{j}}\stackrel{1}{\longrightarrow}\upsilon_{0}. $$

From (19) and (20), we obtain

$$\begin{aligned} \omega'_{k_{j}}={}&\omega'_{k_{j}}(0)+ \int_{0}^{t}\omega ''_{k_{j}}(s)\,ds \\ ={}&\omega'_{k_{j}}(0)+ \int_{0}^{t}\biggl( \frac {h_{1}(t,x_{k_{j}},x'_{k_{j}})}{ \Vert {y}_{k_{j}} \Vert }+\omega _{k_{j}}g_{0}\bigl(s,x_{k_{j}},x'_{k_{j}} \bigr) +\omega'_{k_{j}}g_{1}\bigl(s,x_{k_{j}},x'_{k_{j}} \bigr)\biggr)\,ds. \end{aligned}$$
(22)

Let \(j\rightarrow\infty\). From (21) and (22), we obtain

$$\begin{aligned} &\omega_{0}(t)=\omega_{0}(0)+ \int_{0}^{t}\upsilon_{0}(s)\,ds, \\ &\upsilon_{0}(t)=\upsilon_{0}(0)+ \int_{0}^{t}\bigl(\omega _{0}g_{0} \bigl(s,x_{0},x'_{0}\bigr)+ \upsilon_{0}g_{1}\bigl(s,x_{0},x'_{0} \bigr)\bigr)\,ds. \end{aligned}$$

Hence,

$$\begin{aligned} &\omega''_{0}= \omega_{0}g_{0}\bigl(t,x_{0},x'_{0} \bigr)+\omega '_{0}g_{1}\bigl(t,x_{0},x'_{0} \bigr), \\ &\omega_{0}(0)=\omega_{0}(2\pi),\qquad { \int_{0}^{2\pi}}\omega_{0}(s)\,ds=0. \end{aligned}$$

By Lemma 3, we conclude that \(\omega_{0}\equiv0\), which contradicts \(\Vert \omega_{0} \Vert =1\).

Hence \(\{{y}_{k}\}\) is bounded, and from (18) we derive that \(\{{y''_{k}}\}\) is also bounded. So \(\{ y_{k}\}\) and \(\{y'_{k}\}\) are both uniformly bounded and equicontinuous. By the Ascoli-Arzela theorem, \(\{y_{k}\}\) and \(\{y'_{k}\}\) each contain a uniformly convergent subsequence. For the sake of convenience, we use the same notation, such that

$$y_{k}\stackrel{1}{\longrightarrow}y_{0},\qquad y'_{k}\stackrel {1}{\longrightarrow}\overline{ \upsilon}_{0}. $$

Thus,

$$\begin{aligned} & y'_{k}=y'_{k}(0)+ \int_{0}^{t}y''_{k}(s)\,ds \\ &\phantom{y'_{k}}=y'_{k}(0)+ \int_{0}^{t}\bigl( h_{1}\bigl(s,x_{k},x'_{k}\bigr)+y_{k}g_{0} \bigl(s,x_{k},x'_{k}\bigr)+y'_{k}g_{1} \bigl(s,x_{k},x'_{k}\bigr)\bigr)\,ds, \end{aligned}$$
(23)
$$\begin{aligned} & y_{k}(0)=y_{k}({2\pi}),\qquad \int_{0}^{2\pi}y_{k}(s)\,ds=0,\qquad y_{k}=y_{k}(0)+ \int_{0}^{t}y'_{k}(s)\,ds. \end{aligned}$$
(24)

Let \(k\rightarrow\infty\). From (23) and (24), we obtain

$$\begin{aligned} &y''_{0}=h_{1} \bigl(t,x_{0},x'_{0}\bigr)+y_{0}g_{0} \bigl(t,x_{0},x'_{0}\bigr)+y'_{0}g_{1} \bigl(t,x_{0},x'_{0}\bigr), \\ &y_{0}(0)=y_{0}(2\pi),\qquad { \int_{0}^{2\pi}}y_{0}(s)\,ds=0. \end{aligned}$$

Hence, by the uniqueness of the solution of (17), we know that \({y}_{0}=\overline{P}{x}_{0}\). Thus the operator \(\overline{P}\) is continuous.

Step 2: \(\overline{P}\) is compact. For any bounded set \(S\subset{\mathcal{C}}\), we assert that \(\overline{P}(S)\) is a bounded set in \({\mathcal{C}}\); if not, an argument similar to that of Step 1 leads to a contradiction. For any \({x}\in{S}\), \(y=\overline{P}x\) is defined by (17). Since \(\vert {y} \vert \), \(\vert {y'} \vert \) and the functions \(h_{1}\), \(g_{0}\), \(g_{1}\) are all bounded, proceeding as in the proof of Step 1 we see that the families \(\{y\}\) and \(\{y'\}\), \(x\in S\), are uniformly bounded and equicontinuous. By the Ascoli-Arzela theorem, \(\overline{P}\) is a compact operator.

Step 3: \(\overline{P}(\mathcal{C})\) is a bounded set. If not, there exists a sequence \(\{x_{k}\}, k=1,2,\ldots \), such that \(\Vert \overline{P}(x_{k}) \Vert \rightarrow\infty\) as \(k\rightarrow\infty\). Let \(y_{k}=\overline{P}x_{k}\), so that (18) holds. Let \(\omega_{k}= \frac{{y}_{k}}{ \Vert {y}_{k} \Vert }\); then \(\Vert \omega_{k} \Vert =1\), \(\{\omega_{k}\}\subset{\mathcal{C}}\), and (19), (20), (21) and (22) hold. From Step 1, we know that \(\{\omega_{k}\}\) and \(\{\omega'_{k}\}\) are both uniformly bounded and equicontinuous and each contain a uniformly convergent subsequence. For the sake of convenience, we use the same notation, such that

$$\omega_{k}\stackrel{1}{\longrightarrow}\omega_{0},\qquad \omega'_{k}\stackrel {1}{\longrightarrow} \upsilon_{0}, \qquad \Vert \omega_{0} \Vert =1. $$

The sequences \(\{g_{0}(t,x_{k},x'_{k})\}\) and \(\{g_{1}(t,x_{k},x'_{k})\}\) are both bounded in \(L^{2}[0,2\pi] \) and hence each contain a weakly convergent subsequence, such that

$$g_{0}\bigl(t,x_{k},x'_{k}\bigr) \stackrel{\omega}{\longrightarrow}\overline {g}_{0}(t),\qquad g_{1}\bigl(t,x_{k},x'_{k}\bigr) \stackrel{\omega}{\longrightarrow }\overline{g}_{1}(t). $$

Obviously, the weak limits satisfy

$${a(t)}\leq{\overline{g}_{0}(t)}\leq {b(t)},\qquad \bigl\vert { \overline{g}_{1}(t)} \bigr\vert \leq {2M_{2}}, \quad\mbox{a.e. } t \in{[0,2\pi]}. $$

From (21) and (22), for a.e. \(t\in{[0,2\pi]}\), we have

$$v'_{0}(t)=\overline{g}_{0}(t)\omega_{0}(t)+ \overline{g}_{1}(t)v_{0}(t),\qquad \omega'_{0}(t)=v_{0}(t). $$

Hence,

$$\begin{aligned} &\omega''_{0}(t)= \overline{g}_{1}(t)\omega'_{0}(t)+\overline {g}_{0}(t)\omega_{0}(t), \\ &\omega_{0}(0)=\omega_{0}(2\pi), \qquad{ \int_{0}^{2\pi}}\omega_{0}(s)\,ds=0. \end{aligned}$$

By Lemma 3, \(\omega_{0}\equiv0\), which contradicts \(\Vert \omega_{0} \Vert =1\). Hence there exists a constant \(K>0\) such that \(\Vert \overline{P}x \Vert \leq{K}\) for all \({x}\in{\mathcal{C}}\).

Let \(E=\{{x}\in{\mathcal{C}} \vert \Vert x \Vert \leq{K}\}\). By the Schauder fixed point theorem, \(\overline{P}:E\rightarrow E\) has at least one fixed point, and thus the PIBVP (1) has at least one solution. The proof of Theorem 1 is completed.

5 The proof of Theorem 2

First, we prove the uniqueness part of Theorem 2. Let \(x_{1}(t)\) and \(x_{2}(t)\) be any two solutions of the PIBVP (1); then \(u(t)=x_{2}(t)-x_{1}(t)\) is a solution of the PIBVP

$$\begin{aligned} &u''=f\bigl(t,x_{2},x_{2}' \bigr)-f\bigl(t,x_{1},x_{1}'\bigr) \\ &\phantom{u''}=f_{y}\bigl(t,x_{2},x_{1}'+\theta_{2}\bigl(x_{2}'-x_{1}'\bigr) \bigr)u'+f_{x}\bigl(t,x_{1}+\theta _{1}(x_{2}-x_{1}),x_{1}' \bigr)u, \\ &u(0)=u({2\pi}),\qquad \int_{0}^{2\pi} u(s)\,ds=0. \end{aligned}$$

Here \({0}\leq\theta_{1}\leq {1}\) and \({0}\leq\theta_{2}\leq{1}\). According to Assumption \(A_{3}\), we know that

$$0 \leq\alpha(t)\leq{f_{x}\bigl(t,x_{1}+ \theta_{1}(x_{2}-x_{1}),x_{1}' \bigr)}\leq \beta(t). $$

Hence, by Lemma 3, \(u(t)\equiv0\) on \([0,2\pi]\), that is, \(x_{1}(t)=x_{2}(t)\).

Next, we prove the existence part of Theorem 2 by the Schauder fixed point theorem. Writing the increments of f in integral form, we rewrite the equation of PIBVP (1) in the equivalent form

$$\begin{aligned} x''={}&f\bigl(t,x,x'\bigr) \\ ={}&\bigl(f\bigl(t,x,x'\bigr)-f(t,x,0)\bigr)+\bigl(f(t,x,0)-f(t,0,0) \bigr)+f(t,0,0) \\ ={}& \int_{0}^{1}f_{x'}\bigl(t,x, \theta_{1}x'\bigr)\,d\theta_{1}x'+ \int _{0}^{1}f_{x}(t, \theta_{2}x,0)\,d\theta_{2}x+f(t,0,0). \end{aligned}$$
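Here the two integral factors arise from integrating in the parameter; for instance,

$$f\bigl(t,x,x'\bigr)-f(t,x,0)= \int_{0}^{1} \frac{d}{d\theta_{1}}f\bigl(t,x,\theta_{1}x' \bigr)\,d\theta_{1}= \biggl( \int_{0}^{1}f_{x'}\bigl(t,x, \theta_{1}x'\bigr)\,d\theta_{1}\biggr)x', $$

and similarly \(f(t,x,0)-f(t,0,0)=\bigl(\int_{0}^{1}f_{x}(t,\theta_{2}x,0)\,d\theta_{2}\bigr)x\); here we only use that the partial derivatives \(f_{x}\) and \(f_{x'}\) exist, as required by Assumptions \(A_{3}\) and \(A_{4}\).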

By Lemma 4, the following problem (25) has a unique solution for any \({x}\in {\mathcal{C}}\):

$$\begin{aligned} \begin{aligned} &y''= { \int_{0}^{1}}f_{x'}\bigl(t,x, \theta_{1}x'\bigr)\,d\theta_{1}y'+ { \int _{0}^{1}}f_{x}(t, \theta_{2}x,0)\,d\theta_{2}y+f(t,0,0), \\ &y(0)=y(2\pi),\qquad { \int_{0}^{2\pi}}y(s)\,ds=0. \end{aligned} \end{aligned}$$
(25)

Define the operator \({T}:{\mathcal{C}}\rightarrow{\mathcal{C}}\) as follows: for each \({x}\in {\mathcal{C}}\), let \({T}[x](t)=y(t)\) be the unique solution of (25). Thus, the existence of a solution of Problem (1) is equivalent to the existence of a fixed point of T in the Banach space \(\mathcal{C}\). We will prove that T is continuous and compact, and that \({T({\mathcal{C}})}\) is a bounded subset of \({\mathcal{C}}\).

Step 1: T is continuous. Let \(\{{x}_{j}\}\subset{\mathcal{C}}\) be any convergent sequence with \({x}_{j}\rightarrow{x}_{0}\) as \(j\rightarrow\infty\), and let \({y}_{j}={T}{x}_{j}\); then

$$\begin{aligned} \begin{aligned} &y''_{j}= { \int_{0}^{1}}f_{x'}\bigl(t,x_{j}, \theta_{1}x'_{j}\bigr)\,d\theta _{1}y'_{j}+ { \int_{0}^{1}}f_{x}(t, \theta_{2}x_{j},0)\,d\theta _{2}y_{j}+f(t,0,0), \\ &y_{j}(0)=y_{j}(2\pi), \qquad{ \int_{0}^{2\pi}}y_{j}(s)\,ds=0. \end{aligned} \end{aligned}$$
(26)

We will prove the existence of \({y}_{0}\), such that \({y}_{j}\rightarrow{y}_{0}\) as \(j\rightarrow\infty\), and

$$\begin{aligned} &y''_{0}= { \int_{0}^{1}}f_{x'}\bigl(t,x_{0}, \theta_{1}x'_{0}\bigr)\,d\theta _{1}y'_{0}+ { \int_{0}^{1}}f_{x}(t, \theta_{2}x_{0},0)\,d\theta _{2}y_{0}+f(t,0,0), \\ &y_{0}(0)=y_{0}(2\pi), \qquad{ \int_{0}^{2\pi}}y_{0}(s)\,ds=0. \end{aligned}$$

We assert that \(\{{y}_{j}\}\) is a bounded sequence in \({\mathcal{C}}\). If not, there exists a subsequence of \(\{{y}_{j}\}\), still denoted by \(\{{y}_{j}\}\) for convenience, such that \(\Vert {y}_{j} \Vert \rightarrow\infty\) as \(j\rightarrow\infty\). Take \(\omega_{j}= \frac{{y}_{j}}{ \Vert {y}_{j} \Vert }\). Then \(\{\omega_{j}\}\subset{\mathcal {C}}\) and \(\Vert \omega_{j} \Vert =1\). We have

$$\begin{aligned} \begin{aligned} &\omega''_{j}= { \int_{0}^{1}}f_{x'}\bigl(t,x_{j}, \theta_{1}x'_{j}\bigr)\,d\theta _{1} \omega_{j}'+ { \int_{0}^{1}}f_{x}(t, \theta_{2}x_{j},0)\,d\theta _{2}\omega_{j}+ \frac{f(t,0,0)}{ \Vert y_{j} \Vert }, \\ &\omega_{j}(0)=\omega_{j}(2\pi),\qquad { \int_{0}^{2\pi}}\omega_{j}(s)\,ds=0. \end{aligned} \end{aligned}$$
(27)

So \(\Vert \omega''_{j} \Vert \leq M+\max\limits_{t\in[0,{2\pi}]}{\beta}(t)+1<\infty\) for j large. Thus \(\{\omega''_{j}\}\) is bounded. It is easy to see that \(\{\omega'_{j}\}\) and \(\{\omega_{j}\}\) are both uniformly bounded and equicontinuous, and

$$\begin{aligned} &\omega'_{j}(t)=\omega'_{j}(0)+ \int_{0}^{t}\omega''_{j}(s)\,ds, \end{aligned}$$
(28)
$$\begin{aligned} &\omega_{j}(t)=\omega_{j}(0)+ \int_{0}^{t}\omega'_{j}(s)\,ds. \end{aligned}$$
(29)

By the Ascoli-Arzela theorem, \(\{\omega'_{j}\}\) and \(\{\omega_{j}\}\) contain a uniformly convergent subsequence, respectively, and satisfy

$$\omega_{j}\stackrel{1}{\longrightarrow}\omega_{0},\qquad \omega'_{j}\stackrel {1}{\longrightarrow} \nu_{0}. $$

Obviously \(\omega_{0}\) and \(\nu_{0}\in{\mathcal{C}}\). Let \(j\rightarrow \infty\). From (27) and (28), we obtain

$$\begin{aligned} &\omega''_{0}= { \int_{0}^{1}}f_{x'}\bigl(t,x_{0}, \theta_{1}x'_{0}\bigr)\,d\theta _{1} \omega_{0}'+ { \int_{0}^{1}}f_{x}(t, \theta_{2}x_{0},0)\,d\theta _{2}\omega_{0}, \\ &\omega_{0}(0)=\omega_{0}(2\pi),\qquad { \int_{0}^{2\pi}}\omega_{0}(s)\,ds=0. \end{aligned}$$

By Lemma 4, \(\omega_{0}\equiv0\), which contradicts \(\Vert \omega_{0} \Vert =1\).

Hence \(\{y_{j}\}\) is bounded, and by (26) \(\{y''_{j}\}\) is also bounded. So \(\{y_{j}\}\) and \(\{y'_{j}\}\) are both uniformly bounded and equicontinuous. By the Ascoli-Arzela theorem, \(\{y_{j}\}\) and \(\{y'_{j}\}\) each contain a uniformly convergent subsequence. For the sake of convenience, we use the same notation, thus

$$y_{j}\stackrel{1}{\longrightarrow}y_{0},\qquad y'_{j}\stackrel {1}{\longrightarrow}\overline{ \nu}_{0}. $$

We know

$$\begin{aligned} & y'_{j}=y'_{j}(0)+ \int_{0}^{t}y''_{j}(s)\,ds \\ &\phantom{y'_{j}}=y'_{j}(0)+ \int_{0}^{t}\biggl( \int_{0}^{1}f_{x'}\bigl(s,x_{j}, \theta _{1}x'_{j}\bigr)\,d\theta_{1}y'_{j} \\ &\phantom{y'_{j}=}{}+ \int_{0}^{1}f_{x}(s,\theta_{2}x_{j},0)\,d \theta_{2}y_{j}+f(s,0,0)\biggr)\,ds, \end{aligned}$$
(30)
$$\begin{aligned} & y_{j}(0)=y_{j}(2\pi),\qquad \int_{0}^{2\pi}y_{j}(s)\,ds=0,\qquad y_{j}(t)=y_{j}(0)+ \int _{0}^{t}y'_{j}(s)\,ds. \end{aligned}$$
(31)

Let \(j\rightarrow\infty\). From (30) and (31), we obtain

$$\begin{aligned} &y''_{0}= { \int_{0}^{1}}f_{x'}\bigl(t,x_{0}, \theta_{1}x'_{0}\bigr)\,d\theta _{1}y'_{0}+ { \int_{0}^{1}}f_{x}(t, \theta_{2}x_{0},0)\,d\theta _{2}y_{0}+f(t,0,0), \\ &y_{0}(0)=y_{0}(2\pi),\qquad { \int_{0}^{2\pi}}y_{0}(s)\,ds=0. \end{aligned}$$

Hence, by the uniqueness of the solution of (25), we know that \({y}_{0}={T}{x}_{0}\). Thus the operator T is continuous.

Step 2: T is compact. For any bounded set \(S\subset{\mathcal{C}}\), we assert that \({T}(S)\) is a bounded set in \({\mathcal{C}}\); if not, an argument similar to that of Step 1 leads to a contradiction. For any \({x}\in{S}\), \(y={T}x\) is defined by (25). Since \(\vert {y} \vert \), \(\vert {y'} \vert \), \(\vert {f}_{x} \vert \) and \(\vert {f}_{x'} \vert \) are all bounded, proceeding as in the proof of Step 1, we see that the families \(\{y\}\) and \(\{y'\}\), \(x\in S\), are uniformly bounded and equicontinuous. By the Ascoli-Arzela theorem, T is a compact operator.

Step 3: \({T}(\mathcal{C})\) is a bounded set. If not, there exists a sequence \(\{x_{j}\}, j=1,2,\ldots \), such that \(\Vert {T}(x_{j}) \Vert \rightarrow\infty\) as \(j\rightarrow\infty\). Let \(y_{j}={T}x_{j}\), so that (26) holds. Take \(\omega_{j}= \frac{{y}_{j}}{ \Vert {y}_{j} \Vert }\); then \(\Vert \omega_{j} \Vert =1\), \(\{\omega_{j}\}\subset{\mathcal{C}}\), and (27), (28), (29) and (30) hold. From Step 1, we know that \(\{\omega_{j}\}\) and \(\{\omega'_{j}\}\) are both uniformly bounded and equicontinuous and each contain a uniformly convergent subsequence. For the sake of convenience, we use the same notation, such that

$$\omega_{j}\stackrel{1}{\longrightarrow}\omega_{0},\qquad \omega'_{j}\stackrel {1}{\longrightarrow}\nu_{0},\qquad \Vert \omega_{0} \Vert =1. $$

The sequences \(\{\int_{0}^{1}f_{x'}(t,x_{j},\theta_{1}x'_{j})\,d\theta _{1}\}_{j=1}^{\infty}\) and \(\{\int_{0}^{1}f_{x}(t,\theta _{2}x_{j},0)\,d\theta_{2}\}_{j=1}^{\infty}\) are both bounded in \(L^{2}[0,2\pi] \) and hence each contain a weakly convergent subsequence, such that

$$\int_{0}^{1}f_{x'}\bigl(t,x_{j}, \theta_{1}x'_{j}\bigr)\,d\theta_{1} \stackrel{\omega }{\longrightarrow} {f}_{0}(t),\qquad \int_{0}^{1}f_{x}(t,\theta _{2}x_{j},0)\,d\theta_{2}\stackrel{\omega}{ \longrightarrow} {f}_{1}(t). $$

Obviously,

$$\bigl\vert {{f}_{0}(t)} \bigr\vert \leq{M},\qquad {\alpha(t)} \leq{{f}_{1}(t)}\leq {\beta(t)},\quad \mbox{a.e. } t\in{[0,2\pi]}. $$

Moreover,

$$\begin{aligned} &\omega'_{j}=\omega'_{j}(0)+ \int_{0}^{t}\omega''_{j}(s)\,ds \\ &\phantom{\omega'_{j}}=\omega'_{j}(0)+ \int_{0}^{t}\biggl( \int_{0}^{1}f_{x'}\bigl(s,x_{j}, \theta _{1}{x}'_{j}\bigr)\,d\theta_{1} \omega'_{j} \\ &\phantom{\omega'_{j}=}{}+ \int_{0}^{1}f_{x}(s,\theta_{2}x_{j},0)\,d \theta_{2}\omega_{j}+ \frac {f(s,0,0)}{ \Vert y_{j} \Vert }\biggr)\,ds, \end{aligned}$$
(32)
$$\begin{aligned} & \omega_{j}(0)=\omega_{j}(2\pi),\qquad \int_{0}^{2\pi}\omega_{j}(s)\,ds=0,\qquad \omega _{j}=\omega_{j}(0)+ \int_{0}^{t}\omega'_{j}(s)\,ds. \end{aligned}$$
(33)

Let \(j\rightarrow\infty\). From (32) and (33), for a.e. \(t\in{[0,2\pi]}\), we have

$$\nu'_{0}(t)={f}_{0}(t)\nu_{0}(t)+{f}_{1}(t) \omega_{0}(t),\qquad \omega '_{0}(t)= \nu_{0}(t). $$

Hence

$$\begin{aligned} &\omega''_{0}(t)={f}_{0}(t) \omega'_{0}(t)+{f}_{1}(t)\omega_{0}(t), \\ &\omega_{0}(0)=\omega_{0}(2\pi),\qquad { \int_{0}^{2\pi}}\omega_{0}(s)\,ds=0. \end{aligned}$$

By Lemma 3, \(\omega_{0}\equiv0\); this contradicts \(\Vert \omega_{0} \Vert =1\). Then there exists a constant \(K>0\) such that \(\Vert {T}x \Vert \leq{K}\) for all \({x}\in{\mathcal{C}}\).

Let \(\overline{E}=\{{x}\in{\mathcal{C}} \vert \Vert x \Vert \leq{K}\}\). By the Schauder fixed point theorem, \({T}:\overline{E}\rightarrow\overline{E}\) has at least one fixed point, and thus the PIBVP (1) has a solution. The proof of Theorem 2 is completed.

6 Examples

In this section, to illustrate the significance and effectiveness of the main results, we present two examples.

Example 1

Consider the PIBVP as follows:

$$\begin{aligned} \begin{aligned} &x''=f\bigl(t,x,x' \bigr), \\ &x(0)=x({2\pi}),\qquad { \int_{0}^{2\pi}} x(s)\,ds=0, \end{aligned} \end{aligned}$$
(34)

where

$$\begin{aligned} f(t,x,y)= \textstyle\begin{cases} x(t^{2}+\operatorname{sin}y^{2}+1)\sin^{2} \frac{1}{x},&\mbox{$x\neq0$},\\ 0, &\mbox{$x=0$}. \end{cases}\displaystyle \end{aligned}$$

It is clear that f is continuous on \([0,2\pi]\times R^{2}\). For \(\vert x \vert \geq1\), we have

$$ 0\leq \frac {f(t,x,y)}{x}=\bigl(t^{2}+ \operatorname{sin}y^{2}+1\bigr)\sin^{2} \frac{1}{x}\leq t^{2}+2. $$
(35)

Notice that \(x\sin^{2} \frac{1}{x}\) is a bounded function; that is, there is a positive constant M such that \(\vert x\sin^{2} \frac{1}{x} \vert \leq M\) for any \(x\in R \). When \(\vert y \vert \geq1\), for all \(t\in[0,2\pi]\) we get

$$ \biggl\vert \frac{f(t,x,y)}{y} \biggr\vert \leq \biggl\vert \frac {x\operatorname{sin}^{2} \frac{1}{x}}{y} \biggr\vert \bigl\vert \bigl(t^{2}+ \operatorname{sin}y^{2}+1\bigr) \bigr\vert \leq M\bigl(t^{2}+2 \bigr)\leq M\bigl(4\pi^{2}+2\bigr). $$
(36)

According to (35) and (36), f satisfies Assumptions \(A_{1}\) and \(A_{2}\) (with \(a(t)=0\), \(b(t)=t^{2}+2\), \(M_{1}=M_{3}=1\) and \(M_{2}=M(4\pi^{2}+2)\)). By Theorem 1, the PIBVP (34) has at least one solution.
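Purely as a numerical illustration (this is not needed for the conclusion above), one may search for solutions of (34) by shooting: the values \(x(0)\) and \(x'(0)\) are adjusted so that the periodicity and integral conditions are met at \(t=2\pi\). The sketch below assumes SciPy is available; the initial guess and tolerances are arbitrary, the right-hand side oscillates rapidly near \(x=0\), and the root found (if the iteration converges) may simply be the trivial solution \(x\equiv0\), which obviously solves (34).

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

def f(t, x, y):
    # nonlinearity of Example 1 (continuous, with f(t, 0, y) = 0)
    return x * (t**2 + np.sin(y**2) + 1) * np.sin(1.0 / x)**2 if x != 0 else 0.0

def rhs(t, z):
    # z = (x, x', X) with X(t) = int_0^t x(s) ds
    x, y, _ = z
    return [y, f(t, x, y), x]

def residual(p):
    # shooting unknowns: x(0) = p[0], x'(0) = p[1]
    sol = solve_ivp(rhs, [0.0, 2.0 * np.pi], [p[0], p[1], 0.0],
                    rtol=1e-10, atol=1e-12)
    x_end, _, X_end = sol.y[:, -1]
    return [x_end - p[0],   # periodicity: x(2*pi) = x(0)
            X_end]          # integral condition: int_0^{2*pi} x(s) ds = 0

p = fsolve(residual, [0.1, 0.1])
print("shooting parameters found: x(0) = %.6f, x'(0) = %.6f" % (p[0], p[1]))
```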

Example 2

Consider the following PIBVP:

$$\begin{aligned} \begin{aligned} &x''=\operatorname{sin}x' \operatorname{sin}x+x\bigl(t^{2}+1\bigr), \\ &x(0)=x({2\pi}),\qquad { \int_{0}^{2\pi}} x(s)\,ds=0. \end{aligned} \end{aligned}$$
(37)

Let \(f(t,x,y)=\operatorname{sin}y\operatorname{sin}x+x(t^{2}+1)\). Because

$$\begin{aligned} 0 \leq t^{2}\leq {f_{x}(t,x,y)}=\operatorname{sin}y \operatorname{cos}x+t^{2}+1\leq t^{2}+2 \end{aligned}$$

and

$$\begin{aligned} \bigl\vert {f_{y}(t,x,y)} \bigr\vert \leq \vert \operatorname{cos}y \operatorname{sin}x \vert \leq1, \end{aligned}$$

we see that f satisfies Assumptions \(A_{3}\) and \(A_{4}\) (with \(\alpha(t)=t^{2}\), \(\beta(t)=t^{2}+2\) and \(M=1\)). By Theorem 2, the PIBVP (37) has a unique solution.
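The two derivative computations above can also be reproduced symbolically. The following sketch (an illustration only, assuming SymPy is available) recomputes \(f_{x}\) and \(f_{y}\) for the f of Example 2.

```python
import sympy as sp

t, x, y = sp.symbols('t x y', real=True)
f = sp.sin(y) * sp.sin(x) + x * (t**2 + 1)

fx = sp.diff(f, x)   # partial derivative used in Assumption A3
fy = sp.diff(f, y)   # partial derivative used in Assumption A4

print(fx)  # sin(y)*cos(x) + t**2 + 1, hence t**2 <= f_x <= t**2 + 2
print(fy)  # sin(x)*cos(y), hence |f_y| <= 1
```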