1 Deterministic susceptible-infective-susceptible model

Figure 1 shows a deterministic susceptible-infective-susceptible model for an epidemic disease. In this figure, $S$ is the susceptible population, $I$ is the infective population, $\mu>0$ is the natural death rate, and $\gamma>0$ is the removal rate, which is a constant. Note that $S, I \ge 0$ because they represent numbers of people. The infection rate, $\lambda$, depends on the number of partners per individual per unit time ($r>0$) and the transmission probability per partner ($\beta>0$). In this system, susceptible individuals in class $S$ become infected, and infected individuals in class $I$ recover and become susceptible again. The following system of ODEs describes this susceptible-infective-susceptible model [1]:

$$\begin{cases} \dfrac{dS(t)}{dt} = \gamma I(t) - \mu - \lambda S(t),\\[2mm] \dfrac{dI(t)}{dt} = \lambda S(t) - \mu - \gamma . \end{cases}$$
(1)
Figure 1: A schematic of system (1).

Figure 1 illustrates system (1). This system is nonlinear because of the form of the infection rate, $\lambda=\beta I$.
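For illustration only, system (1) can be integrated numerically. The sketch below uses scipy with assumed parameter values and an assumed initial condition; its right-hand side follows the reconstruction of Eq. (1) given above, with $\lambda=\beta I$.

```python
from scipy.integrate import solve_ivp

# Illustrative (assumed) parameter values; the right-hand side follows Eq. (1)
# as reconstructed above, with the infection rate lambda = beta * I.
beta, mu, gamma = 0.3, 0.3, 0.1

def sis_rhs(t, y):
    S, I = y
    lam = beta * I                  # nonlinear infection rate
    dS = gamma * I - mu - lam * S   # susceptible class
    dI = lam * S - mu - gamma       # infective class
    return [dS, dI]

# Assumed initial condition: 99 susceptibles, 1 infective, integrated on 0 <= t <= 10
sol = solve_ivp(sis_rhs, (0.0, 10.0), [99.0, 1.0])
print(sol.y[:, -1])                 # (S, I) at the final time
```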

2 Generation of the stochastic susceptible-infective-susceptible model

In this section, we present the generation of the stochastic susceptible-infective-susceptible model [2–5]. The stochastic model is analogous to the deterministic one, but whereas an exact solution can be found for the deterministic model, no exact solution can be obtained for the stochastic model. Figure 2 shows the state diagram for the stochastic susceptible-infective-susceptible model [1, 6–10].

Figure 2: State diagram of the stochastic susceptible-infective-susceptible model.

At $t_0$, suppose that $m$ individuals are infective and $n$ are susceptible, namely $S(t_0)+I(t_0)=n+m$, and let $P_i(t)=P[S(t)=i \mid S(0)=n]$ be the probability of being in state $i$ at time $t$. Our goal is to determine $P_i(t)$. Table 1 shows the transition diagram for this model. To determine the $P_i(t)$, we construct the Kolmogorov equations. From Figure 2, we have

$$P[\text{staying at state } i]=1-\bigl(\beta i(m+(n-i))\,\Delta t+\mu\,\Delta t-\gamma\,\Delta t\bigr)$$
(2)

and

$$P[\text{moving from state } i \text{ to } i-1]=\beta i(m+(n-i))\,\Delta t+\mu\,\Delta t-\gamma\,\Delta t.$$
(3)
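To make the transition mechanism concrete, the following minimal sketch simulates the chain one small step at a time using the one-step probability in Eq. (3); the step size, parameters, and initial state are illustrative assumptions, and the probability is clipped to $[0,1]$ so that it remains valid.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) parameters and initial state
beta, mu, gamma = 0.3, 0.3, 0.1
n, m = 10, 1                 # initial susceptibles and infectives
dt, t_end = 1e-3, 5.0

def simulate_path():
    """One sample path of S(t) using the one-step probability of Eq. (3)."""
    i, t = n, 0.0
    while t < t_end and i > 0:
        # probability of moving from state i to i-1 during dt, per Eq. (3)
        p_move = (beta * i * (m + (n - i)) + mu - gamma) * dt
        p_move = min(max(p_move, 0.0), 1.0)   # keep it a valid probability
        if rng.random() < p_move:
            i -= 1
        t += dt
    return i

paths = [simulate_path() for _ in range(1000)]
print("mean number of susceptibles at t =", t_end, ":", np.mean(paths))
```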

Now, to produce the forward Kolmogorov equations, we have

$$\begin{aligned} P_i(t+\Delta t) &= P(0 \text{ contacts during } \Delta t \mid S(t)=i)\,P_i(t)+P(1 \text{ contact during } \Delta t \mid S(t)=i+1)\,P_{i+1}(t) \\ &= P(\text{staying at state } i)\,P_i(t)+P(\text{moving from state } i+1 \text{ to } i)\,P_{i+1}(t) \\ &= \bigl(1-\bigl(\beta i(m+(n-i))\Delta t+\mu\Delta t-\gamma\Delta t\bigr)\bigr)P_i(t)+\bigl(\beta(i+1)(m+(n-i-1))\Delta t+\mu\Delta t-\gamma\Delta t\bigr)P_{i+1}(t). \end{aligned}$$

So,

$$P_i(t+\Delta t)-P_i(t)=-\bigl(\beta i(m+n-i)\Delta t+\mu\Delta t-\gamma\Delta t\bigr)P_i(t)+\bigl(\beta(i+1)(m+n-i-1)\Delta t+\mu\Delta t-\gamma\Delta t\bigr)P_{i+1}(t),$$
(4)

then

$$\frac{P_i(t+\Delta t)-P_i(t)}{\Delta t}=-\bigl(\beta i(m+n-i)+\mu-\gamma\bigr)P_i(t)+\bigl(\beta(i+1)(m+n-i-1)+\mu-\gamma\bigr)P_{i+1}(t).$$
(5)

Taking the limit on both sides of Eq. (5) as $\Delta t \to 0$, we have

$$P_i'(t)=\lim_{\Delta t \to 0}\frac{P_i(t+\Delta t)-P_i(t)}{\Delta t}=-\bigl(\beta i(m+n-i)+\mu-\gamma\bigr)P_i(t)+\bigl(\beta(i+1)(m+n-i-1)+\mu-\gamma\bigr)P_{i+1}(t).$$
(6)

Therefore, the forward Kolmogorov equations for this model will be as follows:

$$P_i'(t)=-\bigl(\beta i(m+n-i)+\mu-\gamma\bigr)P_i(t)+\bigl(\beta(i+1)(m+n-i-1)+\mu-\gamma\bigr)P_{i+1}(t).$$
(7)
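Since Eq. (7) is a linear system of ordinary differential equations for the vector $(P_0(t),\ldots,P_n(t))$, it can also be integrated directly. The sketch below does this with scipy under the sign conventions of the reconstruction above; the parameter values and the initial state $S(0)=n$ are assumptions for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, mu, gamma = 0.3, 0.3, 0.1   # illustrative (assumed) parameters
n, m = 10, 1                      # initial susceptibles and infectives

def kolmogorov_rhs(t, P):
    """Forward Kolmogorov equations (7) for P_i(t), i = 0..n."""
    dP = np.zeros_like(P)
    for i in range(n + 1):
        rate_out = beta * i * (m + n - i) + mu - gamma
        dP[i] = -rate_out * P[i]
        if i + 1 <= n:
            rate_in = beta * (i + 1) * (m + n - i - 1) + mu - gamma
            dP[i] += rate_in * P[i + 1]
    return dP

P0 = np.zeros(n + 1)
P0[n] = 1.0                       # S(0) = n with probability 1
sol = solve_ivp(kolmogorov_rhs, (0.0, 5.0), P0, t_eval=[1.0, 5.0])
print(sol.y[:, 0])                # P_0(1), ..., P_n(1)
```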

The probability functions $P_i(t)$ can be found from Eq. (7). Alternatively, they can be obtained by means of probability generating functions (PGFs) and partial differential equations (PDEs). The probability generating function can be written as

$$y(x,t)=\sum_{i=0}^{n}P_i(t)x^i=P_0(t)+P_1(t)x+P_2(t)x^2+\cdots+P_n(t)x^n.$$
(8)
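The point of Eq. (8) is that the probabilities are exactly the Taylor coefficients of $y(x,t)$ in $x$. A small symbolic sketch, using a hypothetical set of probabilities at a fixed time, illustrates how they are recovered by differentiation at $x=0$, which is the same mechanism used later in Eq. (24).

```python
import sympy as sp

x = sp.symbols('x')
n = 3
# Hypothetical probability values P_0, ..., P_3 at some fixed time t
P = [sp.Rational(1, 8), sp.Rational(1, 4), sp.Rational(1, 2), sp.Rational(1, 8)]

# PGF as in Eq. (8): y(x) = sum_i P_i x^i
y = sum(P[i] * x**i for i in range(n + 1))

# Recover each P_k as (1/k!) * d^k y / dx^k evaluated at x = 0
recovered = [sp.diff(y, x, k).subs(x, 0) / sp.factorial(k) for k in range(n + 1)]
print(recovered)   # [1/8, 1/4, 1/2, 1/8]
```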

Now, the partial derivative of $y(x,t)$ with respect to $t$ is $\frac{\partial y}{\partial t}=\sum_{i=0}^{n}P_i'(t)x^i$, so one can write this partial derivative as follows:

$$\frac{\partial y}{\partial t}=\sum_{i=0}^{n}\Bigl[-\bigl(\beta i(m+n-i)+\mu-\gamma\bigr)P_i(t)+\bigl(\beta(i+1)(m+n-i-1)+\mu-\gamma\bigr)P_{i+1}(t)\Bigr]x^i.$$
(9)

Simplifying Eq. (9), we can write

$$\frac{\partial y(x,t)}{\partial t}=(\mu-\gamma)\frac{1}{x}+(\gamma-\mu)\,y(x,t)+\beta(m+n-1)(1-x)\frac{\partial y(x,t)}{\partial x}+\beta x(x-1)\frac{\partial^{2} y(x,t)}{\partial x^{2}}.$$
(10)

The method of separation of variables is employed to solve Eq. (10). If $y(x,t)=X(x)T(t)$, then we have

$$\frac{T'(t)}{T(t)}=(\gamma-\mu)+(\mu-\gamma)\frac{1}{x}\frac{1}{X(x)}+\beta(m+n-1)(1-x)\frac{X'(x)}{X(x)}+\beta x(x-1)\frac{X''(x)}{X(x)}.$$
(11)

The two sides of Eq. (11) depend on different variables, so each must equal a constant, say $-c$:

$$\frac{T'(t)}{T(t)}=-c \quad\text{and}\quad (\gamma-\mu)+(\mu-\gamma)\frac{1}{x}\frac{1}{X(x)}+\beta(m+n-1)(1-x)\frac{X'(x)}{X(x)}+\beta x(x-1)\frac{X''(x)}{X(x)}=-c.$$
(12)

To include a random catastrophe, as presented by Gani and Swift in 2006 [11], we develop the modified probability generating function (PGF) of the random process to solve Eq. (10) as follows:

$$G(x,t)=\sum_{j=0}^{n}G_j(x,t)=\sum_{j=0}^{n}\left[e^{-(\mu-\gamma)t}\,y(x,t)+\int_{0}^{t}(\mu-\gamma)\,e^{-(\mu-\gamma)v}\,y(x,v)\,dv\right],$$
(13)

with $\mu-\gamma\ge 0$; here $y(x,t)$ is the solution of Eq. (10) when $\mu=0$, $\gamma=0$. So we can set $\mu=0$, $\gamma=0$ with $m=1$ in Eq. (12):

$$x(x-1)X''(x)-n(x-1)X'(x)+\frac{c}{\beta}X(x)=0.$$
(14)
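For completeness, the step from Eq. (12) to Eq. (14) can be written out; this sketch assumes the sign convention used above, in which the separation constant is written as $-c$. Setting $\mu=0$, $\gamma=0$ and $m=1$ in Eq. (12) gives
$$\beta n(1-x)\frac{X'(x)}{X(x)}+\beta x(x-1)\frac{X''(x)}{X(x)}=-c.$$
Multiplying by $X(x)/\beta$ and using $n(1-x)=-n(x-1)$, we recover Eq. (14):
$$x(x-1)X''(x)-n(x-1)X'(x)+\frac{c}{\beta}X(x)=0.$$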

This equation was solved by Bailey in 1963 [12]. Solving Eq. (14) with Maple, we obtain

$$X(x)=C_1\;{}_2F_1\!\left[\frac{-\beta(1+n)+\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta},\ \frac{-\beta(1+n)-\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta};\ -n;\ x\right],$$
(15)

where ${}_2F_1[\,\cdot\,,\,\cdot\,;\,\cdot\,;\,\cdot\,]$ is a hypergeometric function.
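As a sanity check on this solution, one can verify symbolically that the terminating hypergeometric series satisfies Eq. (14). The sketch below assumes the reconstructed signs (third parameter $-n$) and the choice $c=j(N+m-j)\beta$ made later in the text, with $N$ approximated by $n$ and with illustrative values of $n$ and $j$.

```python
import sympy as sp

x = sp.symbols('x')
n, j, m = 5, 2, 1                      # illustrative (assumed) values; N is approximated by n
beta = sp.Rational(3, 10)
c = j * (n + m - j) * beta             # the choice of c used later, with N ~ n

# Terminating series for 2F1[-j, j-n-1; -n; x] built from Notation 1,
# using sympy's rising factorial rf(a, k) as the Pochhammer symbol (a)_k
a, b, cc = -j, j - n - 1, -n
X = sum(sp.rf(a, i) * sp.rf(b, i) / sp.rf(cc, i) * x**i / sp.factorial(i)
        for i in range(j + 1))

# Residual of Eq. (14): x(x-1)X'' - n(x-1)X' + (c/beta)X should vanish identically
residual = x*(x - 1)*sp.diff(X, x, 2) - n*(x - 1)*sp.diff(X, x) + (c/beta)*X
print(sp.simplify(residual))           # expected output: 0
```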

Table 1 Transition diagram for the stochastic susceptible-infective-susceptible model

Notation 1 The standard hypergeometric function ${}_2F_1[a,b;c;x]$ is defined as follows:

$${}_2F_1[a,b;c;x]=\sum_{i=0}^{\infty}\frac{(a)_i\,(b)_i}{(c)_i}\,\frac{x^i}{i!},$$
(16)

where $(a)_i=a(a+1)(a+2)(a+3)\cdots(a+i-1)$ with $(a)_0=1$ is the Pochhammer symbol. The derivatives of ${}_2F_1[a,b;c;x]$ are given by $\frac{d^{k}}{dx^{k}}\,{}_2F_1[a,b;c;x]=\frac{(a)_k(b)_k}{(c)_k}\,{}_2F_1[a+k,b+k;c+k;x]$.
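A quick numerical check of the series in Notation 1 and of the derivative relation just stated can be done with scipy; the parameter values below are arbitrary illustrations.

```python
from math import factorial
from scipy.special import hyp2f1, poch   # poch(a, i) is the Pochhammer symbol (a)_i

def hyp2f1_series(a, b, c, x, terms=60):
    """Truncated series from Notation 1 (adequate for |x| < 1)."""
    total = 0.0
    for i in range(terms):
        total += poch(a, i) * poch(b, i) / poch(c, i) * x**i / factorial(i)
    return total

a, b, c, x = 0.5, 1.5, 2.5, 0.3          # illustrative (assumed) parameters
print(hyp2f1_series(a, b, c, x), hyp2f1(a, b, c, x))   # should agree closely

# Derivative relation with k = 1: d/dx 2F1[a,b;c;x] = (a*b/c) * 2F1[a+1,b+1;c+1;x]
h = 1e-6
numeric = (hyp2f1(a, b, c, x + h) - hyp2f1(a, b, c, x - h)) / (2 * h)
print(numeric, a * b / c * hyp2f1(a + 1, b + 1, c + 1, x))
```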

Also, from the equation $\frac{T'(t)}{T(t)}=-c$, we have $T(t)=k_1 e^{-ct}$. So,

$$y(x,t)=e^{-ct}\,\frac{1}{{}_2F_1\!\left[\dfrac{-\beta(1+n)+\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta},\ \dfrac{-\beta(1+n)-\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta};\ -n;\ 1\right]}\;{}_2F_1\!\left[\dfrac{-\beta(1+n)+\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta},\ \dfrac{-\beta(1+n)-\sqrt{\beta^{2}(1+n)^{2}-4c\beta}}{2\beta};\ -n;\ x\right].$$
(17)

In Eq. (17) we can take $c=j(N+m-j)\beta$ with $N=n+\epsilon$, where $\epsilon$ is a small parameter. Then, from Eq. (17) and Eq. (13), one can write

$$\begin{aligned} G(x,t)=\sum_{j=0}^{n}\Biggl[&e^{-(\mu-\gamma)t}\,e^{-j(N+m-j)\beta t}\,\frac{1}{{}_2F_1[-j,\,j-N-1;\,-N;\,1]}\;{}_2F_1[-j,\,j-N-1;\,-N;\,x] \\ &+\int_{0}^{t}(\mu-\gamma)\,e^{-(\mu-\gamma)v}\,e^{-j(m+N-j)\beta v}\,\frac{1}{{}_2F_1[-j,\,j-N-1;\,-N;\,1]}\;{}_2F_1[-j,\,j-N-1;\,-N;\,x]\,dv\Biggr]. \end{aligned}$$
(18)

Thus,

$$\begin{aligned} G(x,t)&=\sum_{j=0}^{n}\lambda_j\;{}_2F_1[-j,\,j-N-1;\,-N;\,x]\left[e^{-[(\mu-\gamma)+j(N+m-j)\beta]t}+\int_{0}^{t}(\mu-\gamma)\,e^{-[(\mu-\gamma)+j(N+m-j)\beta]v}\,dv\right] \\ &=\sum_{j=0}^{n}\lambda_j\;{}_2F_1[-j,\,j-N-1;\,-N;\,x]\left[e^{-[(\mu-\gamma)+j(m+N-j)\beta]t}+\frac{1-e^{-(\mu+\gamma+j\beta N-j^{2}\beta+j\beta)t}}{\mu+\gamma+j\beta N-j^{2}\beta+j\beta}\right], \end{aligned}$$
(21)

where

$$\lambda_j=\frac{(-1)^{j}\,n!\,(N-2j+1)\,N!\,\Gamma(N+j-1)}{j!\,(n-j)!\,(N-n)!\,(-1)^{n+1}\,\Gamma(n-N+j)}.$$

Now, to find $P_0(t), P_1(t), P_2(t),\ldots,P_k(t)$, one can expand $G(x,t)$ as follows:

$$G(x,t)=\sum_{k=0}^{\infty}P_k(t)x^k=P_0(t)+P_1(t)x+P_2(t)x^2+\cdots.$$
(22)

So,

$$P_0(t)=G(0,t)=\sum_{j=0}^{n}\lambda_j\left[e^{-[(\mu-\gamma)+j(m+N-j)\beta]t}+\frac{1-e^{-(\mu+\gamma+j\beta N-j^{2}\beta+j\beta)t}}{\mu+\gamma+j\beta N-j^{2}\beta+j\beta}\right].$$
(23)

Hence, $P_1(t)$ and $P_2(t)$ can be obtained in the same way. More generally,

$$P_k(t)=\frac{1}{k!}\left.\frac{d^{k}G(x,t)}{dx^{k}}\right|_{x=0}=\sum_{j=0}^{n}\frac{1}{k!}\,\frac{(-j)_k\,(j-N-1)_k}{(-N)_k}\,\lambda_j\,\Psi_j(t),$$
(24)

with

$$\Psi_j(t)=\left[e^{-j\mu t-(n-j)\gamma t-(m+N-j-2)\beta t}+\frac{1-e^{-(\mu+\gamma+j\beta N-j^{2}\beta+j\beta)t}}{\mu+\gamma+j\beta N-j^{2}\beta+j\beta}\right].$$
(25)
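Putting the pieces together, the sketch below evaluates $P_k(t)$ from Eqs. (24)-(25) as reconstructed above. Because the forms of $\lambda_j$ and $\Psi_j$ had to be reconstructed from a damaged source, treat this strictly as an illustrative transcription of those formulas, not a validated implementation; the parameter values are assumptions in the spirit of Section 3.

```python
import numpy as np
from math import factorial, gamma as Gamma

# Illustrative (assumed) parameters, in the spirit of Section 3
beta, mu, gam = 0.3, 0.3, 0.1
n, m, eps = 10, 1, 1e-3
N = n + eps                          # N = n + epsilon, with epsilon small

def poch(a, k):
    """Pochhammer symbol (a)_k computed directly."""
    out = 1.0
    for i in range(k):
        out *= a + i
    return out

def lam(j):
    """lambda_j, transcribing the reconstructed coefficient; factorials of
    non-integer arguments are written as Gamma functions, e.g. N! = Gamma(N+1)."""
    num = (-1)**j * factorial(n) * (N - 2*j + 1) * Gamma(N + 1) * Gamma(N + j - 1)
    den = (factorial(j) * factorial(n - j) * Gamma(N - n + 1)
           * (-1)**(n + 1) * Gamma(n - N + j))
    return num / den

def psi(j, t):
    """Psi_j(t), transcribing the reconstructed Eq. (25)."""
    D = mu + gam + j*beta*N - j**2*beta + j*beta
    first = np.exp(-(j*mu + (n - j)*gam + (m + N - j - 2)*beta) * t)
    return first + (1.0 - np.exp(-D*t)) / D

def P(k, t):
    """P_k(t) from Eq. (24)."""
    total = 0.0
    for j in range(n + 1):
        coeff = poch(-j, k) * poch(j - N - 1, k) / poch(-N, k) / factorial(k)
        total += coeff * lam(j) * psi(j, t)
    return total

print(P(0, 1.0), P(8, 1.0))
```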

3 Numerical results

Some numerical examples illustrate the behavior of the probability function $P_k(t,\beta,\mu,\gamma)$. Unless a parameter is being varied, we fix $t=1$, $\beta=0.3$, $\mu=0.3$, and $\gamma=0.1$.

Figure 3(a) and (b) show the behavior of the probability functions $P_0(t,\beta)$ and $P_8(t,\beta)$ with $\mu=0.3$ and $\gamma=0.1$ when $0<t<5$ and $0<\beta<1$.

Figure 3: Probability function $P_k(t,\beta)$ with $\mu=0.3$ and $\gamma=0.1$; (a) $k=0$ and (b) $k=8$.

Figure 3(a) shows that as $t$ and $\beta$ increase, the probability function $P_0(t,\beta)$ increases rapidly, whereas Figure 3(b) shows that $P_8(t,\beta)$ decreases rapidly.
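Surfaces of the kind shown in Figure 3 can be reproduced approximately by sweeping $t$ and $\beta$ and integrating the Kolmogorov system (7) directly, which sidesteps the closed-form coefficients entirely. This is a sketch with assumed grid ranges and an assumed initial state, reusing the same right-hand side as the earlier Kolmogorov sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu, gam = 0.3, 0.1
n, m = 10, 1                         # assumed initial population split

def P_k(beta, t, k):
    """P_k(t) obtained by integrating the forward Kolmogorov equations (7)."""
    def rhs(_, P):
        dP = np.zeros_like(P)
        for i in range(n + 1):
            dP[i] = -(beta*i*(m + n - i) + mu - gam) * P[i]
            if i < n:
                dP[i] += (beta*(i + 1)*(m + n - i - 1) + mu - gam) * P[i + 1]
        return dP
    P0 = np.zeros(n + 1)
    P0[n] = 1.0                      # S(0) = n with probability 1
    sol = solve_ivp(rhs, (0.0, t), P0, t_eval=[t])
    return sol.y[k, -1]

ts = np.linspace(0.1, 5.0, 15)
betas = np.linspace(0.05, 1.0, 15)
surf_P0 = np.array([[P_k(b, t, 0) for b in betas] for t in ts])
surf_P8 = np.array([[P_k(b, t, 8) for b in betas] for t in ts])
print(surf_P0.shape, surf_P0.max(), surf_P8.max())
```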

Figure 4(a) and (b) show the behavior of the probability functions $P_0(t,\mu)$ and $P_8(t,\mu)$ with $\beta=0.3$ and $\gamma=0.1$ when $0<t<5$ and $0<\mu<1$. Figure 4(a) shows that as $t$ increases, the probability function $P_0(t,\mu)$ increases, but with an increase in $\mu$, the probability function $P_0(t,\mu)$ decreases. Figure 4(b) shows that as $t$ increases, the probability function $P_8(t,\mu)$ decreases, but with an increase in $\mu$, the probability function $P_8(t,\mu)$ increases.

Figure 4: Probability function $P_k(t,\mu)$ with $\beta=0.3$ and $\gamma=0.1$; (a) $k=0$ and (b) $k=8$.

Figure 5(a) and (b) display the probability functions $P_0(t,\gamma)$ and $P_8(t,\gamma)$ with $\beta=0.3$ and $\mu=0.3$. From Figure 5(a), when $0<t<1$ and $0<\gamma<1$, the probability function $P_0(t,\gamma)$ is nearly zero, but for $1<t<5$, as $t$ increases, $P_0(t,\gamma)$ slowly increases, and as $\gamma$ increases, $P_0(t,\gamma)$ sharply increases. In Figure 5(b), $P_8(t,\gamma)$ is almost constant for $1<t<5$, but as $\gamma$ decreases from 1 to 0, $P_8(t,\gamma)$ increases; Figure 5(b) also shows that $P_8(t,\gamma)$ attains its highest values when $0<t<1$ and $0<\gamma<1$.

Figure 5: Probability function $P_k(t,\gamma)$ with $\beta=0.3$ and $\mu=0.3$; (a) $k=0$ and (b) $k=8$.

Figure 6(a) and (b) show the probability functions $P_0(\beta,\mu)$ and $P_8(\beta,\mu)$ with $\gamma=0.1$ and $t=1$. Figure 6(a) depicts that for $0<\beta<1$, as $\mu$ increases from 0 to 1, the probability function $P_0(\beta,\mu)$ increases. Figure 6(b) shows that for $0<\mu<1$, with an increase in $\beta$ and $\mu$, the probability function $P_8(\beta,\mu)$ increases.

Figure 6: Probability function $P_k(\beta,\mu)$ with $\gamma=0.1$ and $t=1$; (a) $k=0$ and (b) $k=8$.

Figure 7(a) and (b) illustrate the probability functions $P_0(\mu,\gamma)$ and $P_8(\mu,\gamma)$ with $\beta=0.3$ and $t=1$. Figure 7(a) shows that the probability function $P_0(\mu,\gamma)$ decreases with an increase in $\mu$ but, conversely, increases with an increase in $\gamma$. In Figure 7(a) we observe a separation, which indicates that in the probability function $P_k(\mu,\gamma)$ we must have $\mu\neq\gamma$. Figure 7(b) depicts that the probability function $P_8(\mu,\gamma)$ increases when $\mu$ increases, although it decreases with an increase in $\gamma$.

Figure 7: Probability function $P_k(\mu,\gamma)$ with $\beta=0.3$ and $t=1$; (a) $k=0$ and (b) $k=8$.

4 Conclusions

We have presented the generation of a stochastic susceptible-infective-susceptible model. The method of separation of variables has been applied to solve the partial differential equation arising from this generation, which yields two ordinary differential equations, one in the time variable $t$ and one in the variable $x$. We then used the developed modified probability generating function (PGF) of a random process to take a random catastrophe into account. Numerical results showed the behavior of the probability function $P_k(t,\beta,\mu,\gamma)$ when $0<t<5$ and $0<\beta,\mu,\gamma<1$.