1 Introduction

This paper deals with the well-studied GI/G/1 queue. In many ways, this stochastic model lies at the centre of queueing theory and applied probability, as it marks the border between explicitly tractable models (e.g. M/G/1) and models for which asymptotic approximations are needed. Living on the explicitly intractable side of this border, GI/G/1 has motivated much early work in asymptotic approximate queueing theory, such as [5, 13, 15]. Indeed, GI/G/1 is an extensively studied model (see [2, Chapter X] for an overview). Nevertheless, in this paper, we add new results to the body of knowledge about GI/G/1.

There are several stochastic processes and random variables associated with GI/G/1. Of key interest to us are the queue length process (including the customer in service), \(\{Q(t),~t\ge 0\}\), the departure counting process (indicating the number of service completions during [0, t]), \(\{D(t),~t \ge 0\}\), and the busy-period random variable, B, as well as several other processes which we describe in the sequel. The processes \(Q(\cdot )\), \(D(\cdot )\) and the random variable B are constructed in a standard way (see, e.g. [2, Chapter X]) on a probability space supporting two independent i.i.d. sequences of strictly positive random variables. Namely, \(\{U_i\}_{i=1}^\infty \) denotes the inter-arrival times and \(\{V_i\}_{i=1}^\infty \) denotes the service times. Note that in this paper we assume that an arrival to an empty system occurs at time 0 with service duration \(V_1\); the next arrival then occurs after time \(U_1\) with service duration \(V_2\), and so forth.
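To make this construction concrete, the following minimal simulation sketch (ours, not part of the model description) builds one sample path of \(Q(\cdot )\) and \(D(\cdot )\) from the two sequences; the exponential distributions, the function name simulate_gig1 and all parameter values are illustrative assumptions only.

```python
import numpy as np

def simulate_gig1(n_customers, rng, lam=1.0):
    """Build one GI/G/1 sample path from i.i.d. inter-arrival and service times.

    Exponential(lam) distributions are used purely for illustration; any strictly
    positive distributions with equal means give a critical example.
    """
    U = rng.exponential(1.0 / lam, n_customers)  # inter-arrival times U_1, U_2, ...
    V = rng.exponential(1.0 / lam, n_customers)  # service times V_1, V_2, ...

    # Customer 1 arrives to an empty system at time 0; customer i+1 arrives U_i later.
    arrivals = np.concatenate(([0.0], np.cumsum(U[:-1])))

    # Single-server FIFO recursion: customer i starts service once it has arrived
    # and customer i-1 has departed.
    departures = np.empty(n_customers)
    departures[0] = V[0]
    for i in range(1, n_customers):
        departures[i] = max(arrivals[i], departures[i - 1]) + V[i]
    return arrivals, departures

rng = np.random.default_rng(1)
arr, dep = simulate_gig1(10_000, rng)
t = 100.0
D_t = np.sum(dep <= t)        # D(t): number of service completions in [0, t]
Q_t = np.sum(arr <= t) - D_t  # Q(t): number in system, including the customer in service
print(D_t, Q_t)
```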

As in many, but not all, GI/G/1 studies, our focus is on the very special critical case, i.e. we assume,

$$\begin{aligned} \mathbb {E}[U_1] = \mathbb {E}[V_1]:=\lambda ^{-1}. \end{aligned}$$
(1)

We also assume that \(\mathbb {E}[U_1^2],\, \mathbb {E}[V_1^2] < \infty \), that is, we are in the case where the inter-arrival and service sequences obey a Gaussian central limit law.

The critical GI/G/1 queue has been an exciting research topic for many reasons. It lies on the border between stability (\(\mathbb {E}[U_1] > \mathbb {E}[V_1]\)) and instability (\(\mathbb {E}[U_1] < \mathbb {E}[V_1]\)). In the near-critical but stable case, sojourn time random variables are approximately exponentially distributed (under some regularity assumptions). Further, the queue length and/or workload processes converge to reflected Brownian motion when viewed through the lens of diffusion scaling. Such scaling can be applied either in the near-critical case or exactly at criticality. The limit is then a much more tractable object for further analysis, in comparison with the associated random walks that are used in the non-critical case. See, for example, [20] for a comprehensive treatment of diffusion scaling limits; a useful survey is also given in [11]. Thus, in general, analysis of the critical case has attracted much attention.

A further specialty arising in the critical case is the well-known fact that while the busy-period random variable is finite w.p. 1, it has infinite expectation. One of the contributions of this paper is that we further establish, under the aforementioned finite second-moment assumptions, that the busy period is a regularly varying random variable with index 1/2, where the associated slowly varying function is bounded from below. This property of the critical busy period is well known and essentially elementary to show for the M/M/1 case. Further, for the M/G/1 case it was established as a side result in [21] with exact tail asymptotics. One of the main results of the current paper is that we find similar exact asymptotics for the GI/G/1 case. Tails and related properties of the busy period for GI/G/1 and related models have received much attention during the past 15 years. In addition to [21], the busy period has been studied under various conditions in [3, 4, 6, 7, 10, 14, 16, 18]. Our results add to this body of knowledge in the critical case.

One of the important reasons for studying the tail behaviour of the critical busy period is its relation to the BRAVO effect (Balancing Reduces Asymptotic Variance of Outputs). This is a phenomenon occurring in a variety of queueing systems (see, for example, [17]), and specifically in the critical GI/G/1 queue. For systems without buffer limitations, such as GI/G/1, it was first presented in [1]. The (somewhat counterintuitive) BRAVO effect is that the long-term variability of the output counts is less than the average variability of the arrival and service processes. This is with respect to the asymptotic variance,

$$\begin{aligned} \overline{v}:= \lim _{t \rightarrow \infty } \frac{\mathrm{var}\big (D(t)\big )}{t} = \lambda (c_a^2+c_s^2) \left( 1- \frac{2}{\pi } \right) , \end{aligned}$$

where \(c_a^2\) and \(c_s^2\) are the squared coefficients of variation of the building block random variables, i.e.

$$\begin{aligned} c_a^2:= \frac{\mathrm{var}(U_1)}{\mathbb {E}[U_1]^2}, \qquad c_s^2:= \frac{\mathrm{var}(V_1)}{\mathbb {E}[V_1]^2}. \end{aligned}$$

It thus follows that the variability function \(\lim _{t\rightarrow \infty } \mathrm{var}\, D(t)/\mathbb {E} D(t)\), when considered as a function of the system load, has a singular point when the system load equals 1, which can be regarded as a manifestation of the BRAVO phenomenon. More specifically, \(\overline{v}\) is essentially determined by either the arrival or the service process when the system load is not equal to 1, whereas when the system load equals 1 it is determined by both the arrival and the service processes.
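For instance, in the critical M/M/1 case, \(c_a^2=c_s^2=1\), so the value of the variability function is

$$\begin{aligned} \lim _{t\rightarrow \infty } \frac{\mathrm{var}\big (D(t)\big )}{\mathbb {E}[D(t)]} = 2\left( 1-\frac{2}{\pi }\right) \approx 0.73, \end{aligned}$$

whereas for a stable M/M/1 queue the output process is Poisson (Burke's theorem) and the corresponding limit equals 1.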

In [1], the BRAVO effect was established for M/M/1 from first principles, and for GI/G/1 it was derived via classic diffusion limits for \(D(\cdot )\), [13]. Following the development of [1], a key technical component for a complete proof of GI/G/1 BRAVO is uniform integrability (UI) of the family,

$$\begin{aligned} \left\{ \frac{Q(t)^2}{t},~t \ge t_0 \right\} , \end{aligned}$$

for some non-negative \(t_0\). This UI property was established in [1] only under the simplifying assumption that \(\mathbb {E}[U_1^4],~ \mathbb {E}[V_1^4] < \infty \) and further under the assumption that the tail of the busy period is regularly varying with index 1/2 (as we establish in the current paper).

It was further conjectured in [1] that GI/G/1 BRAVO persists under the more relaxed assumptions that we consider here. By using new tail behaviour results for the busy period, and by further considering properties of renewal processes that generalise some results in [1], we settle this conjecture in the current paper under the weaker assumption that the generic inter-arrival and service times have finite moments of order \(2+\epsilon \).

The structure of the remainder of the paper is as follows: In Sect. 2, we study the busy-period tail behaviour and compare it with the M/M/1 and M/G/1 cases. In Sect. 3, we establish GI/G/1 BRAVO under the relaxed assumptions.

2 Busy period

For analysis of the busy period, it is useful to denote \(\xi _i:= V_i-U_{i}\) for \(i=1,2,\ldots \) and

$$\begin{aligned} S_n := \sum _{i=1}^n \xi _i\quad \text {with }S_0=0. \end{aligned}$$
(2)

The random walk \(S_n\) is embedded within the well-known workload process. Within the first busy period, \(S_n\) is the workload of the system immediately before the arrival of customer \(n+1\). Then the number of customers served during this busy period is \(N:= \inf \{ n \ge 1: S_n \le 0\}\). With N at hand, we can then define the busy-period (duration) random variable as,

$$\begin{aligned} B := \sum _{i=1}^N V_i. \end{aligned}$$
(3)

Note of course that N and the sequence \(\{V_i\}\) are generally dependent, and further note that N is a stopping time with infinite expectation. We will also need the generic idle period that follows the busy period, which equals

$$\begin{aligned} I:= -S_N. \end{aligned}$$
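As an illustration of these definitions, here is a simulation sketch that samples \(N\), B and I for one busy period by running the random walk (2) until it drops to or below zero. The exponential inter-arrival and service times (i.e. the critical M/M/1 case), the function name sample_busy_period and the truncation cap are our illustrative assumptions, not part of the model.

```python
import numpy as np

def sample_busy_period(rng, lam=1.0, max_customers=10**6):
    """Sample (N, B, I) for one busy period of a critical queue.

    Runs S_n = sum_{i<=n} (V_i - U_i) until S_n <= 0; exponential(lam) times are
    an illustrative choice.  The cap max_customers truncates the very rare
    extremely long busy periods (recall that E[N] is infinite at criticality).
    """
    S, B = 0.0, 0.0
    for n in range(1, max_customers + 1):
        V = rng.exponential(1.0 / lam)
        U = rng.exponential(1.0 / lam)
        S += V - U
        B += V
        if S <= 0.0:
            return n, B, -S   # N, busy period B, idle period I = -S_N
    raise RuntimeError("busy period truncated at max_customers")

rng = np.random.default_rng(7)
N, B, I = sample_busy_period(rng)
print(N, B, I)
```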

We recall that throughout this paper we assume the critical case, i.e. (1) holds. We will write \(f(x)\sim g(x)\) when \(\lim _{x\rightarrow \infty } f(x)/g(x)=1\). The main result of this section is given by the following theorem.

Theorem 2.1

Consider the case where \(\mathbb {E}[U_1] = \mathbb {E}[V_1]\) and \(\mathbb {E}[U_1^2],~ \mathbb {E}[V_1^2]< \infty \). Then,

$$\begin{aligned} \mathbb {P}\big ( B > x \big ) \sim \mathbb {E}I \sqrt{\frac{2\lambda }{\pi (c_a^2+c_s^2)}} x^{-1/2}. \end{aligned}$$

Proof

From [9, Thm. XVIII.5.1, p. 612], we know that the series

$$\begin{aligned} \sum _{n=1}^\infty \frac{1}{n}\left[ \mathbb {P}(S_n<0)-\frac{1}{2} \right] \end{aligned}$$

converges to some finite \(b\) and

$$\begin{aligned} \mathbb {E}I=-\mathbb {E}S_N=\frac{\sigma }{\sqrt{2}}e^{-b} \end{aligned}$$
(4)

where

$$\begin{aligned} \sigma ^2=\mathrm{var}(V_1-U_1)=(c_a^2+c_s^2)\lambda ^{-2}. \end{aligned}$$

Moreover, from [9, Thm. XII.7.1a, p. 415], we have

$$\begin{aligned} \mathbb {P}(N>n)\sim \frac{1}{\sqrt{\pi }}e^{-b}n^{-1/2}. \end{aligned}$$
(5)

Combining the above facts gives

$$\begin{aligned} \mathbb {P}(N>n)\sim -\mathbb {E}S_N\sqrt{\frac{2}{\pi \sigma ^2}}n^{-1/2}= \mathbb {E}I \sqrt{\frac{2\lambda ^2}{\pi (c_a^2+c_s^2)}}n^{-1/2}. \end{aligned}$$
(6)

For the last step, we apply [19, Thm. 3.1] together with the representation (3) (intuitively, by the law of large numbers, \(B \approx N\, \mathbb {E}[V_1] = N/\lambda \) when N is large). Namely, observe that by (6) the tail of N is regularly varying and hence of consistent variation. Moreover, \(\mathbb {E}V_1^2<\infty \) and \(x\mathbb {P}(V_1>x)=o(\mathbb {P}(N>x))\). Thus

$$\begin{aligned} \mathbb {P}(B>x)\sim \mathbb {P}(N>\lambda x) \end{aligned}$$

which completes the proof in view of (6). \(\square \)

From Theorem 2.1, one can recover the known result for the M/G/1 queue. Indeed, in this case the idle period has an exponential distribution with parameter \(\lambda >0\) and therefore \(\mathbb {E}I=\frac{1}{\lambda }\). Furthermore, \(c_a=1\) and hence

$$\begin{aligned} \mathbb {P}(B > x ) \sim \lambda ^{-1/2} \sqrt{\frac{2}{( 1 + c_s^2) \pi }}{x}^{-1/2}. \end{aligned}$$

The M/G/1 result appeared in [21], but with a small typo in the constant in front of \(x^{-1/2}\).

In the case of the G/M/1 queue, we have \(c_s=1\), and the first increasing ladder height \(H_1\) of the random walk (2) has an exponential distribution with parameter \(\lambda >0\); therefore \(\mathbb {E}H_1=\frac{1}{\lambda }\). Moreover, by [8, eq. (4c)] we have

$$\begin{aligned} (c_a^2+c_s^2)\lambda ^{-2}=2\mathbb {E}H_1\mathbb {E}I=\frac{2}{\lambda }\mathbb {E}I \end{aligned}$$

and therefore

$$\begin{aligned} \mathbb {E}I=(c_a^2+c_s^2)\frac{1}{2\lambda }. \end{aligned}$$

Theorem 2.1 then gives

$$\begin{aligned} \mathbb {P}(B > x ) \sim \lambda ^{-1/2} \sqrt{\frac{( c_a^2+1)}{2 \pi }}{x}^{-1/2}. \end{aligned}$$

Finally, we mention that the above agrees with the M/M/1 case, where \(\mathbb {P}(B > x ) \sim \lambda ^{-1/2} {(\pi x)}^{-1/2}\); this may also be obtained via asymptotics of Bessel functions, since the distribution of B is known exactly.
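As a rough numerical sanity check of these asymptotics, one may compare the empirical tail of B with \(\lambda ^{-1/2}(\pi x)^{-1/2}\). The sketch below does this under the illustrative M/M/1 assumption; the sample size, truncation cap and function name busy_period are arbitrary choices of ours, and only approximate agreement should be expected since the result is asymptotic in x.

```python
import numpy as np

def busy_period(rng, lam=1.0, cap=10**5):
    """One busy period B of the critical M/M/1 queue (illustrative choice).

    The cap truncates extremely long busy periods; truncated samples are still
    far longer than the x values probed below, so the bias there is negligible.
    """
    S, B = 0.0, 0.0
    for _ in range(cap):
        V = rng.exponential(1.0 / lam)
        U = rng.exponential(1.0 / lam)
        S += V - U
        B += V
        if S <= 0.0:
            break
    return B

rng = np.random.default_rng(0)
lam = 1.0
samples = np.array([busy_period(rng, lam) for _ in range(20_000)])
for x in (10.0, 100.0, 1000.0):
    empirical = np.mean(samples > x)
    predicted = lam ** -0.5 * (np.pi * x) ** -0.5
    print(f"x={x:7.1f}  empirical={empirical:.4f}  predicted={predicted:.4f}")
```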

3 BRAVO

We now establish BRAVO for the GI/G/1 with what we believe to be the minimal possible set of assumptions. The result below unifies Theorem 2.1 and Theorem 2.2 of [1] and also settles Conjecture 2.2 of that paper for the single-server case.

Theorem 3.1

Consider the GI/G/1 queue with \(\mathbb {E}[U_1] = \mathbb {E}[V_1]\) and \(\mathbb {E}[U_1^{2+\epsilon }],~ \mathbb {E}[V_1^{2+\epsilon }]< \infty \) for some \(\epsilon >0\). Then,

$$\begin{aligned} \lim _{t \rightarrow \infty } \frac{\mathrm{Var}\big (D(t)\big )}{\mathbb {E}[D(t)]} = (c_a^2+c_s^2) \big (1- \frac{2}{\pi } \big ). \end{aligned}$$

Proof

Following Theorem 2.1 of [1], we need to show uniform integrability (UI) of

$$\begin{aligned} {{\mathcal {Q}}}:= \Bigg \{ \frac{Q(t)^2}{t},~t \ge t_0 \Bigg \} \end{aligned}$$

for some \(t_0>0\). Once we establish this UI, the result follows.

We follow the same idea as in the proof of [1, Thm. 2.2]. Namely, we apply [1, Lem. 2.1] with \(r=2+\epsilon \) for \(\epsilon >0\), and [1, Prop. 4.1, Thm. 4.1, Thm. 4.2(ii) and Thm. 4.3] with fourth moments replaced by moments of order \(2+\epsilon \) and with \(8=2^3\) replaced by \(2^{1+\epsilon }\). There is one crucial difference, though, in the proof of [1, Thm. 4.1]. Namely, instead of the Wald identity used in [1], we have to show there that \(\mathbb {E}[|M_{V(t)}|^{2+\epsilon }] = O(t^{1+\epsilon /2})\), where \(M_n = \sum _{i=1}^n \frac{1}{\mathbb {E}[\zeta _1]} \zeta _i - n\) is a mean-zero random walk with \(\zeta _i\) being i.i.d. random variables with finite moments of order \(2+\epsilon \), and \(V(t) = \inf \{n: \sum _{i=1}^n \zeta _i \ge t \}\) is the first passage time.

From the Marcinkiewicz–Zygmund inequality for a stopped random walk given in [12, Thm. I.5.1(iii), p. 22] with \(r=2+\epsilon \), we can conclude that

$$\begin{aligned} \mathbb {E} |M_{V(t)}|^{2+\epsilon }\le K\; \mathbb {E}\left| \frac{\zeta _1}{\mathbb {E}[\zeta _1]}-1\right| ^{2+\epsilon }\, \mathbb {E}\big [V(t)^{(2+\epsilon )/2}\big ]= \mathrm{O}\left( \mathbb {E}\big [V(t)^{1+\epsilon /2}\big ]\right) \end{aligned}$$

for some constant K. Moreover, from [12, Thm. III.8.1, p. 98] we know that \(\lim _{t\rightarrow \infty } \mathbb {E}V(t)^{1+\epsilon /2}/t^{1+\epsilon /2} <\infty \). Hence

$$\begin{aligned} \mathbb {E} |M_{V(t)}|^{2+\epsilon }=\mathrm{O}(t^{1+\epsilon /2}) \end{aligned}$$

which completes the proof. \(\square \)
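As an informal numerical illustration of Theorem 3.1 (not part of the proof), the following sketch estimates \(\mathrm{var}(D(t))/\mathbb {E}[D(t)]\) by independent replications for one critical example: uniform inter-arrival and exponential service times, both with mean 1. These distributional choices, the horizon, the replication count and all names are our illustrative assumptions, and convergence in t is slow, so only rough agreement with \((c_a^2+c_s^2)(1-2/\pi )\) should be expected.

```python
import numpy as np

def departures_by_t(t_horizon, rng, draw_U, draw_V):
    """Count D(t): service completions in [0, t] for one GI/G/1 replication."""
    n = int(3 * t_horizon) + 100          # generous number of customers to cover [0, t]
    U = draw_U(rng, n)
    V = draw_V(rng, n)
    arrivals = np.concatenate(([0.0], np.cumsum(U[:-1])))
    dep = np.empty(n)
    dep[0] = V[0]
    for i in range(1, n):
        dep[i] = max(arrivals[i], dep[i - 1]) + V[i]
    return np.sum(dep <= t_horizon)

# Illustrative critical example: uniform(0, 2) inter-arrivals and exponential(1)
# services, both with mean 1, so lambda = 1, c_a^2 = 1/3 and c_s^2 = 1.
draw_U = lambda rng, n: rng.uniform(0.0, 2.0, n)
draw_V = lambda rng, n: rng.exponential(1.0, n)
c_a2, c_s2 = 1.0 / 3.0, 1.0

rng = np.random.default_rng(42)
t_horizon = 500.0
reps = np.array([departures_by_t(t_horizon, rng, draw_U, draw_V) for _ in range(2000)])
print("empirical var/mean :", reps.var() / reps.mean())
print("BRAVO limit        :", (c_a2 + c_s2) * (1.0 - 2.0 / np.pi))
```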