1 Introduction

The ubiquitous nature of the Internet of Things (IoT) creates a vast array of benefits, as well as a rich set of challenges. The vast number of deployed devices, together with their potential mobility, creates the possibility of using them to convey and route information among themselves without having to be connected to the Wide Area Network (WAN) at all times. The possibility of gossiping and routing updates in dynamic IoT networks, and more generally in networks of small devices, has been explored in the literature for many use cases [4, 14, 15, 17, 37, 40].

We consider a receiver (which we call the “sink”) interested only in the most recent update sent by an IoT monitor; the monitor routes its pieces of information via other IoT devices. Given the highly dynamic and unpredictable topology of an IoT network, updates may arrive out of order, resulting in a number of them being discarded by the receiver. Fig. 1 gives an example of this scenario. The variability in the order in which updates arrive at the receiver can be approximated well by a queueing system with an infinite number of servers, each serving generated updates with a service time drawn from a random distribution.

A widely adopted metric for timeliness is the Age of Information (AoI) [27]: the age, measured from its generation time at the sender, of the last update received by the sink. Most of the early literature focused on the average AoI in different queueing systems (for a thorough review of the initial literature we point the reader to the excellent work in [28]). The recent trend has instead been to study the entire distribution of the AoI at the receiver, since its Survival Function (i.e., one minus its Cumulative Distribution Function (CDF)) gives the probability of the AoI exceeding a certain threshold, thus allowing stricter control over Quality of Service (QoS). In particular, formulas for the Laplace transform of the AoI distribution were calculated for the G/G/1 First Come First Served (FCFS) system [23]; exact expressions for the AoI distribution were calculated for GI/GI/1/1 and GI/GI/1/2* systems with preemption in [6], and for M/G/1 Last Come First Served (LCFS) preemptive systems in [19]. Finally, alternative proofs for the AoI distribution in the time domain were developed for other queueing systems (M/M/1 and M/D/1 FCFS) in [20]. Other works involving the stationary distribution of the AoI in queueing systems can be found in [1, 5, 12, 13, 32, 35, 42].

Age-aware protocols have been introduced in the literature. Freshness-aware Medium Access Control (MAC) protocols were introduced in [18, 24], while IoT latency/age-aware protocols were introduced in [7, 10, 11, 26, 29, 36, 39, 41]. Additionally, in [3], a complete real-life implementation of an IoT network was carried out and AoI measurements were taken. In this work we instead study the timeliness performance of a protocol-independent multi-path IoT network.

Fig. 1

An IoT monitor sends updates routed by other IoT devices to a sink; different updates take different routes, thus bearing the possibility of arriving out of order

The main contribution of this work is the derivation of closed-form expressions for the AoI, peak AoI (pAoI) [8], effective service time and effective departure time distributions for an M/M/\(\infty \) queueing system, based on an alternative proof in the time domain. We then numerically study its performance based on these metrics. As recently argued in [20], the AoI being a time domain measure, it is important to have expressions in this domain. The expressions we derive give more immediate intuition on how QoS constraints can be met by varying the different parameters involved (e.g. the update generation rate). In previous work [22], the authors provided a formula for the Laplace transform of the AoI in a network of parallel infinite servers, but expressions in the time domain were not presented.

The rest of this paper is organized as follows. In Sect. 2 the scenario is described in detail. In Sect. 3 an expression for the update outage probability is derived. In Sect. 4 the expression of the outage probability for the effective departure rate is computed. In Sect. 5 the previous expressions are compared with simulations and the numerical results are analyzed. Finally, in Sect. 6 conclusions are drawn.

2 Scenario description

The source generates pieces of information (i.e. updates) with an average rate of \(\lambda \) updates per second; the servers all serve updates with an average rate of \(\mu \) updates per second. All the updates arrive at a sink that is interested only in the freshest update generated by the source. Hence, from the perspective of the sink, an update generated before the freshest update already received carries no informative value; from the AoI perspective, an outdated update does not contribute to its value.

Specifically, our system consists of an M/M/\(\infty \) queueing system sending updates to a sink. We will call an update that is not discarded an informative update, and an update that is discarded an obsolete update. As previously stated, the sink is interested only in the freshest updates, thus discarding obsolete updates (i.e. updates generated before the generation time of the last update received). The timestamp can be included in the payload of the packet at the application level on the source side (e.g. by using a format similar to an NTP timestamp [30], or a more precise one, depending on the QoS requirements); it is then passed transparently to the sink by the other nodes. The sink can then read the timestamp by the same means as the source. Both the inter-generation times and the service times follow an exponential distribution, i.e. their respective Probability Density Functions (PDFs) are:

$$\begin{aligned} f_X(t) = \lambda \mathrm {e}^{-\lambda t} \mathrm {H}\left( t\right) , \end{aligned}$$
(1)

and

$$\begin{aligned} f_S(t) = \mu \mathrm {e}^{-\mu t} \mathrm {H}\left( t\right) , \end{aligned}$$
(2)

where \( \mathrm {H}\left( t\right) \) is the Heaviside step function defined as:

$$\begin{aligned} \mathrm {H}\left( t\right) = {\left\{ \begin{array}{ll} 1 &{}, t \ge 0 \\ 0 &{}, t < 0 \end{array}\right. } . \end{aligned}$$

The source sends updates about a single information stream, i.e. there is only one class of updates. Each server serves updates by drawing service times from the same distribution (2). It is also worth mentioning that for the remainder of the paper the Probability Density Function (PDF) of a random variable X will be expressed as \(f_X(x)\), its Cumulative Distribution Function (CDF) as \(F_X(x) = \Pr \left\{ X \le x \right\} \) and its Survival Function as \(G_X(x) = \Pr \left\{ X > x \right\} = 1 - F_X(x)\). Also, we will indicate a multivariate random variable with components indexed from a to b as \({\mathbf {X}}_a^b\), where one or both of the extremes could be infinite, and its outcome as \({\mathbf {x}}_a^b\). Finally, unless stated otherwise, all the random variables have non-negative support.
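To make the model concrete, the following minimal Monte Carlo sketch (written in Python for illustration; the simulator used in Sect. 5 is OMNeT++) generates Poisson update instants, draws an independent exponential service time for each update as in an M/M/\(\infty \) system, and lets the sink discard obsolete updates. The rate values, the sample size and all helper names are illustrative assumptions, not part of the original system.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu = 100.0, 100.0          # generation and service rates (illustrative values)
n_updates = 200_000

gen = np.cumsum(rng.exponential(1.0 / lam, n_updates))   # generation times t_i
dep = gen + rng.exponential(1.0 / mu, n_updates)          # departure times tau_i

# Process departures in time order: an update is informative only if it was
# generated after every update already received by the sink.
order = np.argsort(dep)
freshest = -np.inf
recv_t, recv_gen = [], []
for i in order:
    if gen[i] > freshest:
        freshest = gen[i]
        recv_t.append(dep[i])
        recv_gen.append(gen[i])
recv_t, recv_gen = np.array(recv_t), np.array(recv_gen)

# Peak AoI samples: age just before the reception of each informative update.
peak_aoi = recv_t[1:] - recv_gen[:-1]
print("fraction of informative updates:", len(recv_t) / n_updates)
print("empirical mean peak AoI [s]:", peak_aoi.mean())
```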

Fig. 2

A typical time period for an M/M/\(\infty \) queueing system. Generation times are marked as \(t_i\), while the corresponding departure times are marked as \(\tau _i\)

In Fig. 2 a typical time period is shown, along with the AoI function \(\varDelta (t)\). Generation times are marked as \(t_i\), while the corresponding departure times (i.e. the times when the sink receives the updates) are marked as \(\tau _i\). Update 1 is generated at \(t_1\) and arrives at the sink after a time \(S_1\), at the instant \(\tau _1\). The AoI then drops to the service time experienced by update 1. It then grows with slope 1 until update 2 arrives at the sink, where it again drops to the corresponding service time \(S_2\). Notice that, since update 3 is generated before update 4 but arrives after the latter, it is discarded, i.e. it is an obsolete update. We call the time between two informative updates the effective inter-generation time, described by the random variable B, and the service time experienced by an informative update the effective service time, described by the random variable Z. The AoI just before the reception of an informative update is called the peak AoI (pAoI), marked as \(\varUpsilon _k\); as seen in Fig. 2, it is the sum of the effective inter-arrival time between two informative updates, \(B_k\), and the service time experienced by the second of the two, \(Z_{k}\). Since we consider steady state distributions:

$$\begin{aligned} \varUpsilon = B + Z , \end{aligned}$$

where the two random variables are not statistically independent, as we will see in Sect. 3.2.
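Continuing the sketch of Sect. 2 (and assuming the arrays recv_t, recv_gen and peak_aoi it produced), the effective inter-generation times \(B_k\) and effective service times \(Z_k\) can be extracted from the simulated path, giving a quick empirical check of this dependence.

```python
# Split each peak AoI sample into its two components (see Fig. 2).
B = recv_gen[1:] - recv_gen[:-1]     # effective inter-generation times B_k
Z = recv_t[1:] - recv_gen[1:]        # effective service times Z_k
assert np.allclose(B + Z, peak_aoi)  # peak AoI = B + Z
print("empirical corr(B, Z):", np.corrcoef(B, Z)[0, 1])
```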

Notice that, since the arrival process and the departure process of informative updates are two identically distributed Poisson point processes subject to the same thinning, they are statistically equivalent, and can be used interchangeably for the purpose of calculating the AoI process. Also, since our system is ergodic, and we are considering the steady state distributions, we can calculate the CDF of the AoI (represented by the random variable \(\varDelta \)) by using [23, Lemma 1]:

$$\begin{aligned} F_{\varDelta }(t) = \lambda _{e} \int _{0}^t \left[ F_Z(y) - F_{\varUpsilon }(y) \right] \mathrm {d}y, \end{aligned}$$
(3)

where \(\lambda _e\) is the effective departure rate, expressed in updates per second.
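An empirical counterpart of the stationary AoI distribution can also be computed by time-averaging the simulated sawtooth; the helper below is a sketch that assumes the recv_t and recv_gen arrays from the simulation sketch above.

```python
def empirical_aoi_survival(x, recv_t, recv_gen):
    # Between consecutive informative receptions the AoI grows linearly with
    # slope 1, starting from the effective service time; Pr{AoI > x} is the
    # fraction of time the sawtooth spends above x.
    starts, ends, gens = recv_t[:-1], recv_t[1:], recv_gen[:-1]
    time_above = np.maximum(0.0, ends - np.maximum(starts, gens + x))
    return time_above.sum() / (ends[-1] - starts[0])

print("empirical Pr{AoI > 20 ms}:", empirical_aoi_survival(0.02, recv_t, recv_gen))
```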

3 Distribution of the age of information

To find the update outage probability, we need the survival function \(G_\varDelta (t)\) of the AoI \(\varDelta \). To that end, we need the relevant statistics of the service times of informative updates Z. Moreover, since the pAoI \(\varUpsilon \) is the sum of the inter-arrival time between informative updates B and the service time of the informative update Z, we also need their joint distribution. Once found, these can be used to obtain the CDF of the AoI \(\varDelta \) via (3), and finally the update outage probability.

3.1 Distribution of the service times for informative updates

First, we notice that the joint PDF of n inter-arrivals \(f_{{\mathbf {X}}_{1}^n} (t)\) is just the product of the PDFs of n independent and identically distributed (i.i.d.) random variables distributed as (1), i.e.:

$$\begin{aligned} f_{{\mathbf {X}}_{1}^n} ({\mathbf {x}}_{1}^n) = \prod _{k=1}^{n} f_X(x_{k}) = \lambda ^n \mathrm {e}^{-\lambda \sum _{k=1}^{n} x_{k}} . \end{aligned}$$

The random variable describing the inter-arrival times after update i is a vector, with non-negative support, indicated as \({\mathbf {X}}_{i+1}^\infty = \left\{ X_{i+1}, X_{i + 2}, \ldots \right\} \), while \({\mathcal {E}} (n) = E_1(i) \cap E_2(n)\) is the event that update i is informative (event \(E_1(i)\)) and has rendered the previous n updates obsolete (event \(E_2(n)\)). Notice that, as found in [25, Appendix A], it is independent of the update number i. Further, we notice that the random variable \(Z = S_i |E_1(i)\), describing the service time experienced by informative updates, can be expressed as:

$$\begin{aligned} f_Z (t)&= \Pr \left\{ S_i = t |E_1(i)\right\} \nonumber \\&= \frac{f_S(t)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&\quad \times \underbrace{\int _{0}^\infty \cdots \int _{0}^\infty }_{|{\mathbf {x}}_{i+1}^\infty |~\text {times}} \Pr \left\{ E_1(i) |S_i = t, {\mathbf {X}}_{i+1}^\infty = {\mathbf {x}}_{i+1}^\infty \right\} \nonumber \\&\quad \times f_{{\mathbf {X}}_{i+1}^\infty }\left( {\mathbf {x}}_{i+1}^\infty \right) \mathrm {d} {\mathbf {x}}_{i+1}^\infty . \end{aligned}$$
(4)

where \(|{\mathbf {x}} |\) represents the cardinality of the (possibly infinite) set \({\mathbf {x}}\) and [25, Section III-D]:

$$\begin{aligned}&\Pr \left\{ E_1(i) \right\} = \frac{1}{\rho + 1} \nonumber \\&\quad + \sum _{r=1}^\infty \left[ \frac{\rho ^r}{(r+1) \prod _{k=1}^r (\rho + k)} \left( 1 - \frac{\rho }{\rho + r + 1} \right) \right] \nonumber \\&\quad = {}_2 F_2 \left( 1,1;2,\rho + 1; \rho \right) - \frac{\rho }{\rho + 1} {}_2 F_2 \left( 1,1;2,\rho + 2; \rho \right) , \end{aligned}$$
(5)

where we used the definition of the generalized hypergeometric function \({}_2 F_2 \left( a,b;c,d; e \right) \) [31] to solve the series, and \(\rho = \lambda / \mu \) is the load per server. From [25, Eq. (5)] we know:

$$\begin{aligned}&\Pr \left\{ E_1 (i) |S_i = t, {\mathbf {X}}_{i+1}^\infty = {\mathbf {x}}_{i+1}^\infty \right\} \nonumber \\&\quad = \mathbbm {1} \left\{ t< x_{i+1} \right\} + \sum _{r=1}^\infty \left( \mathrm {e}^{-\mu \left( r t - \sum _{k=1}^r (r-k+1) x_{i+k} \right) }\right. \nonumber \\&\left. \qquad \times \mathbbm {1} \left\{ \sum _{k=1}^r x_{i+k}< t < \sum _{k=1}^{r+1} x_{i+k} \right\} \right) , \end{aligned}$$
(6)

where \(\mathbbm {1} \{ E \}\) is the indicator function defined as:

$$\begin{aligned} \mathbbm {1} \{ E \} = {\left\{ \begin{array}{ll} 1 &{}, E\text { is true} \\ 0 &{}, E\text { is false} \end{array}\right. } . \end{aligned}$$

By using (6) in (4), after some algebraic manipulations, we obtain:

$$\begin{aligned} f_Z (t)&= \frac{\mu \mathrm {e}^{-(\lambda + \mu ) t}}{\Pr \left\{ E_1(i)\right\} } \left[ 1 + \sum _{r=1}^\infty \frac{ \left[ \rho \left( 1 - \mathrm {e}^{-\mu t} \right) \right] ^r }{r!} \right] \\&= \frac{\mu \mathrm {e}^{-(\lambda + \mu ) t}}{\Pr \left\{ E_1(i)\right\} } \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu t}} , \end{aligned}$$

where we used the fact that the future arrivals are all i.i.d.; the associated CDF is then:

$$\begin{aligned} F_Z(t)&= \frac{\mu \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \int _{0}^{t} \mathrm {e}^{-(\lambda + \mu ) t' - \rho \mathrm {e}^{-\mu t'}} \mathrm {d} t' \nonumber \\&= \frac{\rho ^{-(\rho + 1)} \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \int _{\rho \mathrm {e}^{-\mu t}}^{\rho } q^{\rho } \mathrm {e}^{-q} \mathrm {d} q \nonumber \\&= \frac{\rho ^{-(\rho + 1)} \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \left[ \gamma (\rho + 1, \rho ) - \gamma \left( \rho + 1, \rho \mathrm {e}^{-\mu t} \right) \right] , \end{aligned}$$
(7)

where \(\gamma (s,x)\) is the lower incomplete gamma function defined as:

$$\begin{aligned} \gamma (s,x)= \int _{0}^{x}t^{s-1}\,\mathrm {e} ^{-t}\,\mathrm{{d}}t . \end{aligned}$$
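For numerical evaluation, (5) and (7) can be implemented directly; the sketch below (Python, with the series in (5) truncated at a fixed number of terms, adequate for moderate loads) recovers \(\gamma (s,x)\) from SciPy's regularized incomplete gamma. The helper names are ours, introduced only for illustration.

```python
import numpy as np
from scipy.special import gammainc, gamma

def lower_gamma(s, x):
    # gammainc is the regularized lower incomplete gamma P(s, x) = gamma(s, x) / Gamma(s).
    return gammainc(s, x) * gamma(s)

def prob_informative(rho, terms=200):
    # Truncated series form of Pr{E_1(i)} in (5), accumulated incrementally.
    total = 1.0 / (rho + 1.0)
    ratio = 1.0                        # rho^r / prod_{k=1}^r (rho + k)
    for r in range(1, terms):
        ratio *= rho / (rho + r)
        total += ratio / (r + 1) * (1.0 - rho / (rho + r + 1))
    return total

def cdf_Z(t, lam, mu):
    # CDF of the effective service time, Eq. (7).
    rho = lam / mu
    coef = rho**(-(rho + 1.0)) * np.exp(rho) / prob_informative(rho)
    return coef * (lower_gamma(rho + 1, rho)
                   - lower_gamma(rho + 1, rho * np.exp(-mu * t)))
```

For example, cdf_Z(0.02, 100.0, 100.0) returns the probability that the effective service time of an informative update stays below 20 ms for the illustrative rates used above.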

3.2 Distribution of the peak age of information

We first need the joint distribution of the inter-arrival times of the previous n updates and the service time of update i, given that update i is informative and has rendered the previous n updates obsolete, i.e.:

$$\begin{aligned}&f_{{\mathbf {X}}_{i-n}^i, S_i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n}^i, s_i \right) \nonumber \\&\quad = \frac{f_S(s_i) f_{{\mathbf {X}}_{i-n}^i}\left( {\mathbf {x}}_{i-n}^i \right) }{\Pr \left\{ {\mathcal {E}} (n)\right\} } \nonumber \\&\qquad \times \Pr \left\{ {\mathcal {E}} (n) |S_i = s_i, {\mathbf {X}}_{i-n}^i = {\mathbf {x}}_{i-n}^i \right\} \nonumber \\&\quad = \frac{f_S(s_i) f_{{\mathbf {X}}_{i-n}^i}\left( {\mathbf {x}}_{i-n}^i \right) }{\Pr \left\{ {\mathcal {E}} (n)\right\} } \underbrace{\int \limits _{0}^\infty \cdots \int \limits _{0}^\infty }_{|{\mathbf {x}}_{i + 1}^\infty |~\text {times}} f_{{\mathbf {X}}_{i+1}^\infty } \left( {\mathbf {x}}_{i + 1}^\infty \right) \nonumber \\&\qquad \times \Pr \left\{ {\mathcal {E}} (n) |S_i = s_i, {\mathbf {X}}_{i-n}^\infty = {\mathbf {x}}_{i-n}^\infty \right\} \mathrm {d}{\mathbf {x}}_{i + 1}^\infty , \end{aligned}$$
(8)

where \(\Pr \left\{ {\mathcal {E}} (n)\right\} \) is given in [25, Eq. (9)]:

$$\begin{aligned} \Pr \left\{ {\mathcal {E}} (n)\right\}&= \frac{\lambda ^n \mu }{\prod _{k=1}^{n+1} (\lambda + k \mu )} = \rho ^n \frac{\varGamma (\rho + 1)}{\varGamma (\rho + n + 2)} , \end{aligned}$$
(9)

where we used the recurrence relation of the Gamma function:

$$\begin{aligned} \varGamma (z+1) = z \varGamma (z) \end{aligned}$$

to solve the product. By using conditional independence given \(S_i = s_i\) [25], we can write:

$$\begin{aligned}&\Pr \left\{ {\mathcal {E}} (n) |S_i = s_i, {\mathbf {X}}_{i-n}^\infty = {\mathbf {x}}_{i-n}^\infty \right\} \\&\quad = \Pr \left\{ E_1 (i) |S_i = s_i, {\mathbf {X}}_{i+1}^\infty = {\mathbf {x}}_{i+1}^\infty \right\} \\&\qquad \times \Pr \left\{ E_2(n) |S_i = s_i, {\mathbf {X}}_{i-n}^i = {\mathbf {x}}_{i-n}^i\right\} . \end{aligned}$$

By inserting the previous expression into (8):

$$\begin{aligned}&f_{{\mathbf {X}}_{i-n}^i, S_i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n}^i, s_i \right) = f_S(s_i) f_{{\mathbf {X}}_{i-n}^i}\left( {\mathbf {x}}_{i-n}^i \right) \\&\quad \times \frac{ \Pr \left\{ E_2(n) |S_i = s_i, {\mathbf {X}}_{i-n}^i = {\mathbf {x}}_{i-n}^i\right\} }{\Pr \left\{ {\mathcal {E}} (n)\right\} } \\&\quad \times \underbrace{\int \limits _{0}^\infty \cdots \int \limits _{0}^\infty }_{|{\mathbf {x}}_{i + 1}^\infty |~\text {times}} \Pr \left\{ E_1 (i) |S_i = s_i, {\mathbf {X}}_{i+1}^\infty = {\mathbf {x}}_{i+1}^\infty \right\} \\&\quad \times f_{{\mathbf {X}}_{i+1}^\infty } \left( {\mathbf {x}}_{i + 1}^\infty \right) \mathrm {d}{\mathbf {x}}_{i + 1}^\infty . \end{aligned}$$

We notice that the integral in the previous expression is the same as the integral in (4); thus, together with (6) and [25, Eq. (6)], we obtain:

$$\begin{aligned}&f_{{\mathbf {X}}_{i-n}^i, S_i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n}^i, s_i \right) \\&\quad = \lambda \mu ^{n+1} \frac{\varGamma (\rho + n + 2)}{\varGamma (\rho + 1)} \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu s_i} -\left[ \lambda + (n+1) \mu \right] s_i} \\&\qquad \times \left[ \mathrm {e}^{ - \sum _{k=0}^{n} (\lambda + k \mu ) x_{i-n + k}} \right. \\&\left. \qquad - \mathrm {e}^{ - \mu s_i - \sum _{k=1}^{n+1} (\lambda + k \mu ) x_{i-n + k-1}} \right] . \end{aligned}$$

We call \(B'(i,n)\) the random variable describing the sum of the previous n inter-arrivals with respect to update i, and B(n) the sum of the inter-arrival times of the previous n updates given that the update i is informative and has rendered the previous n updates obsolete. We notice that:

$$\begin{aligned} B(n) = B'(i,n) |{\mathcal {E}} (n) = \sum _{k=i-n}^i X_k |{\mathcal {E}} (n) \, , \end{aligned}$$

so we use the previous expression to find the joint PDF of B(n) and \(Z(n) = S_i |{\mathcal {E}} (n)\) as:

$$\begin{aligned}&f_{B(n), Z(n)} (t, s) = f_{B'(i,n), S_i |{\mathcal {E}} (n)} (t, s)\nonumber \\&\quad = \underbrace{ \int \limits _{0}^{+ \infty } \cdots \int \limits _{0}^{+ \infty } }_\text {n - 1 times} f_{{\mathbf {X}}_{i-n}^i, S_i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n+1}^i , t - \sum _{k=0}^{n-1} x_{i-k}, s_i \right) \nonumber \\&\quad \mathrm {d}{\mathbf {x}}_{i-n+1}^i = \frac{\lambda \mu \varGamma (\rho + n + 2) }{ n! \varGamma (\rho + 1)} \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu s} -\left[ \lambda + (n+1) \mu \right] s } \nonumber \\&\qquad \times \mathrm {e}^{- \lambda t} \left( 1 - \mathrm {e}^{-\mu t} \right) ^n \left( 1 - \mathrm {e}^{-\mu (s + t)} \right) . \end{aligned}$$
(10)

Notice that, in the above, the dependency between the two random variables rests entirely on the last term, i.e. it tends to disappear as \(\mu \) increases. The effective inter-arrival time B is the sum of the inter-arrival times between two informative updates, i.e. \(B = \sum _{k=i-N}^i X_k |E_1(i) = B(N) |E_1(i)\), where N is the random variable describing the number of previous updates rendered obsolete by the informative update i. Reasoning in the same way, the effective service time is \(Z = Z(N) |E_1(i)\). We notice:

$$\begin{aligned} f_{B,Z} (t,s)&= f_{B(N), Z(N) |E_1(i)} (t,s) \nonumber \\&= \frac{f_{B(N) \cap Z(N) \cap E_1(i)} (t,s)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B'(i,n) \cap S_i \cap E_1(i) \cap E_2(n)} (t,s)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B'(i,n) \cap S_i \cap {\mathcal {E}} (n)} (t,s)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B'(i,n), S_i |{\mathcal {E}} (n)} (t,s) \Pr \left\{ {\mathcal {E}}(n)\right\} }{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B(n), Z(n)} (t,s) \Pr \left\{ {\mathcal {E}}(n)\right\} }{\Pr \left\{ E_1(i)\right\} } . \end{aligned}$$
(11)

By combining (10) and (9), after some algebraic manipulations we obtain:

$$\begin{aligned}&\sum _{n=0}^\infty f_{B(n), Z(n)} (t,s) \Pr \left\{ {\mathcal {E}}(n)\right\} \\&\quad = \lambda \mu \left( 1 - \mathrm {e}^{-\mu (s + t)} \right) \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu (s+t)} - (\lambda + \mu ) s - \lambda t} . \end{aligned}$$

Then, by using the previous result in (11) we obtain:

$$\begin{aligned} f_{B,Z} (t,s) = \frac{\lambda \mu \left( 1 - \mathrm {e}^{-\mu (s + t)} \right) }{\Pr \left\{ E_1(i)\right\} } \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu (s+t)} - (\lambda + \mu ) s - \lambda t} . \end{aligned}$$

We now need to find the PDF of \(\varUpsilon = B + Z\). By using the previous expression:

$$\begin{aligned} f _{\varUpsilon } (t)&= \int _{0}^{\infty } f_{B,Z} (s,t-s) \mathrm {d}s \\&= \frac{\lambda \mathrm {e}^{ - \lambda t}}{\Pr \left\{ E_1(i)\right\} } \left( 1 - \mathrm {e}^{-\mu t} \right) ^2 \mathrm {e}^{\rho - \rho \mathrm {e}^{-\mu t}} . \end{aligned}$$

Finally, the CDF will be:

$$\begin{aligned} F_{\varUpsilon } (t)&= \frac{\lambda \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \int _0^t \mathrm {e}^{ - \lambda t'} \left( 1 - \mathrm {e}^{-\mu t'} \right) ^2 \mathrm {e}^{ - \rho \mathrm {e}^{-\mu t'}} \mathrm {d}t' \nonumber \\&= \frac{\rho ^{-(\rho + 1)} \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \int _{\rho \mathrm {e}^{-\mu t}}^{\rho } q^{\rho - 1} (\rho - q)^2 \mathrm {e}^{-q} \mathrm {d} q \nonumber \\&= \frac{\rho ^{-(\rho + 1)} \mathrm {e}^{\rho }}{\Pr \left\{ E_1(i)\right\} } \left[ \chi (\rho ) - \chi (\rho \mathrm {e}^{-\mu t}) \right] , \end{aligned}$$
(12)

where:

$$\begin{aligned} \chi (x) = \rho ^2 \gamma (\rho , x) - 2 \rho \gamma (\rho + 1, x) + \gamma (\rho + 2, x) . \end{aligned}$$
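For completeness, a numerical sketch of (12) follows, reusing lower_gamma and prob_informative from the sketch at the end of Sect. 3.1; the function names are again ours.

```python
import numpy as np

def chi(x, rho):
    # chi(x) as defined after Eq. (12); lower_gamma is from the Sect. 3.1 sketch.
    return (rho**2 * lower_gamma(rho, x)
            - 2.0 * rho * lower_gamma(rho + 1, x)
            + lower_gamma(rho + 2, x))

def cdf_peak_aoi(t, lam, mu):
    # CDF of the peak AoI, Eq. (12).
    rho = lam / mu
    coef = rho**(-(rho + 1.0)) * np.exp(rho) / prob_informative(rho)
    return coef * (chi(rho, rho) - chi(rho * np.exp(-mu * t), rho))
```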

3.3 Distribution of the age of information

The effective rate \(\lambda _e\) is simply the arrival rate \(\lambda \) multiplied by the probability (5) of an update being informative. By using this observation, and substituting (7) and (12) in (3), we obtain:

$$\begin{aligned} G _{\varDelta } (t)&= 1 - \lambda \rho ^{-(\rho + 1)} \mathrm {e}^{\rho } \int _0^t (1 + 2 \rho ) \gamma (\rho + 1, \rho ) \nonumber \\&\quad - \rho ^2 \gamma (\rho , \rho ) - (1 + 2 \rho ) \gamma \left( \rho + 1, \rho \mathrm {e}^{-\mu t'} \right) \nonumber \\&\quad + \rho ^2 \gamma \left( \rho , \rho \mathrm {e}^{-\mu t'}\right) + \gamma \left( \rho + 2, \rho \mathrm {e}^{-\mu t'}\right) \nonumber \\&\quad - \gamma (\rho + 2, \rho ) \mathrm {d} t' \nonumber \\&= 1 - \lambda \rho ^{-(\rho + 1)} \mathrm {e}^{\rho } \left\{ \left[ (1 + 2 \rho ) \gamma (\rho + 1, \rho ) - \rho ^2 \gamma (\rho , \rho ) \right. \right. \nonumber \\&\left. \left. \quad - \gamma (\rho + 2, \rho ) \right] t - (1 + 2 \rho ) \omega (1) + \rho ^2 \omega (0) + \omega (2) \right\} , \end{aligned}$$
(13)

where, by using [34, Eq. (2.10.1.1)] in order to simplify the integral:

$$\begin{aligned} \zeta (a,b)&= \mu ^{-1} \int _{0}^b q^{-1} \gamma (\rho + a, \rho q ) \mathrm {d}q \\&= \mu ^{-1} (\rho b)^{\rho + a} \sum _{k=0}^\infty \frac{(-b \rho )^{k}}{k! (\rho + a + k)^2} , \end{aligned}$$

and:

$$\begin{aligned} \omega (a) = \zeta (a,1) - \zeta (a,\mathrm {e}^{-\mu t}) . \end{aligned}$$
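The update outage probability can then be evaluated directly from (13); the sketch below truncates the series defining \(\zeta (a,b)\) (the number of terms is our assumption, ample for the loads considered in Sect. 5) and reuses lower_gamma from the Sect. 3.1 sketch.

```python
import numpy as np
from scipy.special import factorial

def zeta_series(a, b, rho, mu, terms=80):
    # Truncated series form of zeta(a, b) given after Eq. (13).
    k = np.arange(terms)
    s = np.sum((-b * rho)**k / (factorial(k) * (rho + a + k)**2))
    return (rho * b)**(rho + a) * s / mu

def survival_aoi(t, lam, mu):
    # Survival function of the AoI, Eq. (13): the update outage probability
    # for an age threshold t.
    rho = lam / mu
    omega = lambda a: (zeta_series(a, 1.0, rho, mu)
                       - zeta_series(a, np.exp(-mu * t), rho, mu))
    const = ((1.0 + 2.0 * rho) * lower_gamma(rho + 1, rho)
             - rho**2 * lower_gamma(rho, rho)
             - lower_gamma(rho + 2, rho))
    brace = const * t - (1.0 + 2.0 * rho) * omega(1) + rho**2 * omega(0) + omega(2)
    return 1.0 - lam * rho**(-(rho + 1.0)) * np.exp(rho) * brace
```

Comparing survival_aoi with the empirical estimate empirical_aoi_survival from Sect. 2 gives a quick sanity check of (13).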

4 Distribution of the inter-arrival times for informative updates

The random variable describing the inter-arrival times of informative updates is statistically identical to the one describing their inter-departure times. It allows us to compute not only the average rate, already known in the literature (9), but also all the statistics of the effective departure process of the system.

We make use of the distribution, derived in [25, Appendix C], of the inter-arrival times of the previous n updates given that update i is informative and has rendered the previous n updates obsolete. The random variable describing the previous n inter-arrival times is a vector, with non-negative support, indicated as \({\mathbf {X}}_{i-n}^i = \left\{ X_{i-n}, \ldots ,X_{i} \right\} \). After some algebraic manipulations we have:

$$\begin{aligned}&f _{{\mathbf {X}}_{i-n}^i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n}^i \right) \nonumber \\&\quad = \lambda \mu ^n \frac{\varGamma (n + \rho + 2)}{\varGamma (\rho + 1)} \left( \frac{1}{\rho + n + 1} + \sigma (n) \right) \nonumber \\&\qquad \times \mathrm {e}^{-\sum _{k=0}^{n-1} [\lambda + \mu (n-k)] x_{i-k}} \mathrm {e}^{-\lambda x_{i-n}} \nonumber \\&\qquad - \lambda \mu ^n \frac{ (\rho + n + 1) \varGamma (n + \rho + 2)}{\rho \varGamma (\rho + 1)} \sigma (n) \nonumber \\&\qquad \times \mathrm {e}^{-\sum _{k=0}^{n-1}[\lambda + \mu (n-k+1)] x_{i-k}} \mathrm {e}^{-(\lambda + \mu ) x_{i-n}} . \end{aligned}$$
(14)

Also:

$$\begin{aligned} \sigma (n)&= \varGamma (\rho + n + 1) \sum _{r=1}^\infty \left[ \frac{\rho ^r}{\varGamma (r + \rho + n + 2)} \right] \\&= \mathrm {e}^{\rho } \frac{\rho ^{-(\rho + n + 1)} }{\rho + n + 1} \gamma (\rho + n + 2, \rho ) \, , \end{aligned}$$

where we used [33, Eq. (5.2.7.20)] to solve the sum.
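As a quick numerical consistency check of this identity (reusing lower_gamma from the Sect. 3.1 sketch), the truncated series and the closed form of \(\sigma (n)\) can be compared; the helper names are ours.

```python
import numpy as np
from scipy.special import gamma

def sigma_series(n, rho, terms=100):
    # Truncated series definition of sigma(n), starting from r = 1.
    r = np.arange(1, terms)
    return gamma(rho + n + 1) * np.sum(rho**r / gamma(r + rho + n + 2))

def sigma_closed(n, rho):
    # Closed form of sigma(n) given above.
    return (np.exp(rho) * rho**(-(rho + n + 1))
            * lower_gamma(rho + n + 2, rho) / (rho + n + 1))

print(sigma_series(3, 1.0), sigma_closed(3, 1.0))  # the two should agree
```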

Using the same reasoning as in Sect. 3.2, we use (14) in order to find the PDF of B(n) as:

$$\begin{aligned} f_{B(n)} (t)&= \underbrace{ \int \limits _{0}^{+ \infty } \cdots \int \limits _{0}^{+ \infty } }_{|{\mathbf {x}}_{i-n+1}^i |~\text {times}} f _{{\mathbf {X}}_{i-n}^i |{\mathcal {E}} (n)} \left( {\mathbf {x}}_{i-n+1}^i , t - \sum _{k=0}^{n-1} x_{i-k} \right) \mathrm {d}{\mathbf {x}}_{i-n+1}^i \nonumber \\&= \frac{\lambda \varGamma (\rho + n + 2)}{n! \, \varGamma (\rho + 1)} \left( 1 - \mathrm {e}^{-\mu t} \right) ^n \left[ \left( \frac{1}{\rho + n + 1} + \sigma (n) \right) \mathrm {e}^{-\lambda t} \right. \nonumber \\&\left. \quad - \frac{\rho + n + 1}{\rho } \, \sigma (n) \, \mathrm {e}^{-(\lambda + \mu ) t} \right] . \end{aligned}$$
(15)

Proceeding as in Sect. 3.2, we notice:

$$\begin{aligned} f_B (t)&= f_{B(N) |E_1(i)} (t) \nonumber \\&= \frac{f_{B(N) \cap E_1(i)} (t)}{\Pr \left\{ E_1(i)\right\} } = \frac{\sum _{n=0}^\infty f_{B'(i,n) \cap E_1(i) \cap E_2(n)} (t)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B'(i,n) \cap {\mathcal {E}} (n)} (t)}{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B'(i,n) |{\mathcal {E}} (n)} (t) \Pr \left\{ {\mathcal {E}}(n)\right\} }{\Pr \left\{ E_1(i)\right\} } \nonumber \\&= \frac{\sum _{n=0}^\infty f_{B(n)} (t) \Pr \left\{ {\mathcal {E}}(n)\right\} }{\Pr \left\{ E_1(i)\right\} } . \end{aligned}$$
(16)

By combining (15) and (9), after some algebraic manipulations we obtain:

$$\begin{aligned} f_{B(N) \cap E_1(i)} (t)&= \sum _{n=0}^\infty f_{B(n)} (t) \Pr \left\{ {\mathcal {E}}(n)\right\} \\&= \lambda \mathrm {e}^{- \lambda t} \sum _{n=0}^\infty \frac{\left[ \rho (1 - \mathrm {e}^{-\mu t}) \right] ^n}{n! (n + \rho + 1)} \\&\quad + \lambda \mathrm {e}^{\rho - \lambda t} \rho ^{-\rho - 1} \sum _{n=0}^\infty \frac{(1 - \mathrm {e}^{-\mu t})^n}{n! (n + \rho + 1)} \gamma (\rho + n + 2, \rho ) \\&\quad - \lambda \mathrm {e}^{\rho - (\lambda + \mu ) t} \rho ^{-\rho - 2} \sum _{n=0}^\infty \frac{(1 - \mathrm {e}^{-\mu t})^n}{n!} \gamma (\rho + n + 2, \rho ) \\&= \lambda \mathrm {e}^{- \lambda t} \sum _{n=0}^\infty \frac{\left[ \rho (1 - \mathrm {e}^{-\mu t}) \right] ^n}{n! (n + \rho + 1)} \\&\quad + \lambda \mathrm {e}^{\rho - \lambda t} \rho ^{-\rho - 1} \sum _{n=0}^\infty \frac{(1 - \mathrm {e}^{-\mu t})^n}{n!} \gamma (\rho + n + 1, \rho ) \\&\quad - \lambda \mathrm {e}^{- \lambda t} \sum _{n=0}^\infty \frac{\left[ \rho (1 - \mathrm {e}^{-\mu t}) \right] ^n}{n! (n + \rho + 1)} \\&\quad - \lambda \rho ^{-\rho - 2} \mathrm {e}^{\rho + \mu t} \gamma \left( \rho + 2, \rho \mathrm {e}^{-\mu t} \right) \\&= \lambda \mathrm {e}^{\rho + \mu t} \rho ^{-\rho - 1} \left[ \gamma \left( \rho + 1, \rho \mathrm {e}^{-\mu t} \right) \right. \\&\left. \quad - \frac{1}{\rho } \gamma \left( \rho + 2, \rho \mathrm {e}^{-\mu t} \right) \right] , \end{aligned}$$

where we used the recurrence relation for the incomplete gamma function:

$$\begin{aligned} \gamma \left( a+1, x \right) = a \gamma \left( a, x \right) - x^a \mathrm {e}^{-x} , \end{aligned}$$

and, subsequently, [34, Eq. (5.2.3.1)] for the two surviving sums. Then, by using the previous in (16) we obtain:

$$\begin{aligned} f_B (t)&= \frac{\lambda \mathrm {e}^{\rho + \mu t} \rho ^{-\rho - 1}}{\Pr \left\{ E_1(i)\right\} } \left[ \gamma \left( \rho + 1, \rho \mathrm {e}^{-\mu t} \right) \right. \\&\left. \quad - \frac{1}{\rho } \gamma \left( \rho + 2, \rho \mathrm {e}^{-\mu t} \right) \right] . \end{aligned}$$

Finally, its survival function is:

$$\begin{aligned} G_B (t)&= 1 - \frac{\lambda \mathrm {e}^{\rho } \rho ^{-\rho - 1}}{\Pr \left\{ E_1(i)\right\} } \left[ \int _{0}^t \mathrm {e}^{\mu t'} \gamma \left( \rho + 1, \rho \mathrm {e}^{-\mu t'} \right) \mathrm {d}t' \right. \nonumber \\&\left. \quad - \rho ^{-1} \int _{0}^t \mathrm {e}^{\mu t'} \gamma \left( \rho + 2, \rho \mathrm {e}^{-\mu t'} \right) \mathrm {d}t' \right] \nonumber \\&= 1 - \frac{\lambda \mathrm {e}^{\rho } \rho ^{-\rho - 1}}{\Pr \left\{ E_1(i)\right\} } \left[ \alpha (1,1) - \alpha (1,\mathrm {e}^{-\mu t}) \right. \nonumber \\&\left. \quad - \rho ^{-1} \alpha (2,1) + \rho ^{-1} \alpha (2,\mathrm {e}^{-\mu t}) \right] \, , \end{aligned}$$
(17)

where, by using [34, Eq. (2.10.1.1)] in order to simplify the integral:

$$\begin{aligned} \alpha (a,b)&= \mu ^{-1} \int _{0}^b q^{-2} \gamma (\rho + a, \rho q ) \mathrm {d}q \\&= \mu ^{-1} \rho ^{\rho + a} b^{\rho + a - 1} \\&\quad \times \sum _{k=0}^\infty \frac{(-b \rho )^{k}}{k! (\rho + a + k) (\rho + a + k - 1)} \, . \end{aligned}$$
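Analogously, the threshold violation probability for the effective inter-departure times follows from (17); the sketch below truncates the \(\alpha (a,b)\) series and reuses prob_informative from the Sect. 3.1 sketch.

```python
import numpy as np
from scipy.special import factorial

def alpha_series(a, b, rho, mu, terms=80):
    # Truncated series form of alpha(a, b) given after Eq. (17).
    k = np.arange(terms)
    s = np.sum((-b * rho)**k / (factorial(k) * (rho + a + k) * (rho + a + k - 1)))
    return rho**(rho + a) * b**(rho + a - 1) * s / mu

def survival_B(t, lam, mu):
    # Survival function of the effective inter-departure time, Eq. (17).
    rho = lam / mu
    e = np.exp(-mu * t)
    coef = lam * np.exp(rho) * rho**(-rho - 1.0) / prob_informative(rho)
    val = (alpha_series(1, 1.0, rho, mu) - alpha_series(1, e, rho, mu)
           - (alpha_series(2, 1.0, rho, mu) - alpha_series(2, e, rho, mu)) / rho)
    return 1.0 - coef * val
```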

5 Numerical results

We conducted simulation studies using OMNeT++ [38]. We fixed \(\lambda = 100\) updates per second and let \(\mu \) vary between 50 and 200 updates per second. We allowed for a sufficient warm-up period before taking measurements, and all plots involving simulations are presented with 95% confidence intervals; at some points the intervals are too tight to be visible. All the plots make use of a black and white printer-friendly and accessible color scheme [9]. As we can see in Fig. 3 and Fig. 4, the analytical findings all agree with the simulations.

We then investigated the effects of the update generation rate \(\lambda \) for thresholds varying from 10 to 100 ms and for different service rates \(\mu \) (Fig. 5); the boundaries for the threshold are chosen to lie between the reaction time required by the haptic internet [16] and the update time needed in Personal Area Networks of energy-harvesting sensors for medical applications [2, 21]. Given the large span of values on the Z axis, we applied a logarithmic scale to the latter. As we can see, all the involved parameters have an exponential effect on the update outage probability. Finally, as an example, we plotted the outage probability versus both the average generation rate \(\lambda \) and the threshold (Fig. 6), with \(\mu \) fixed at 100 updates per second. It is a contour plot, with isolevel lines of the update outage probability at constraint values of interest. A designer faced with a constrained average service time can use Fig. 6 as a tool for choosing an average update generation rate that meets a given QoS target.
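In the same spirit as Fig. 6, a designer could scan the generation rate for a fixed service rate and pick the smallest \(\lambda \) meeting a target outage constraint. The helper below is a hypothetical illustration built on survival_aoi from Sect. 3.3; the scanned grid and the example numbers are arbitrary assumptions.

```python
import numpy as np

def min_generation_rate(mu, threshold, target, lams=np.arange(10.0, 500.0, 5.0)):
    # Smallest generation rate (within the scanned grid) whose update outage
    # probability at the given age threshold does not exceed the target.
    for lam in lams:
        if survival_aoi(threshold, lam, mu) <= target:
            return lam
    return None  # constraint not attainable within the scanned range

# Example (illustrative numbers): rate needed for Pr{AoI > 50 ms} <= 1e-2 at mu = 100.
print(min_generation_rate(100.0, 0.05, 1e-2))
```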

Fig. 3

Update outage probability for an M/M/\(\infty \) system (13); simulation vs analytical

Fig. 4

Threshold violation for the inter-departure times of an M/M/\(\infty \) system (17); simulation vs analytical

Fig. 5

Effects of the update generation rate on the update outage probability (13); notice the logarithmic scale on the Z axis

Fig. 6

Contour plot for varying \(\lambda \) and threshold; \(\mu \) is fixed at 100 updates per second

6 Conclusions

In this work, we studied the update violation probability in an IoT routing scenario in the time domain. We highlighted that in such a scenario, the dynamic nature of the network can very well result in updates sent by an IoT monitor arriving out of order at the receiver. We thus argued the importance of having an expression for the update outage probability in the time domain, to better understand the interplay between the various parameters involved and to ensure a sufficient QoS.

In particular, we obtained exact expressions for the AoI, peak AoI (pAoI), effective service time and effective departure time distributions for an M/M/\(\infty \) queueing system and, from their survival functions, derived the corresponding violation probabilities. Numerical results were obtained, providing the designer of IoT update systems with a tool to estimate QoS parameters under statistical constraints.