Abstract
We develop an exact sampling algorithm for the all-time maximum of Gaussian processes with negative drift and general covariance structures. In particular, our algorithm can handle non-Markovian processes, even those with long-range dependence. Our development combines a milestone-event construction with rare-event simulation techniques, which allows us to find a random time beyond which the running maximum will never be reached again. The complexity of the algorithm is random but has finite moments of all orders. We also test the performance of the algorithm numerically.
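To see what is at stake, a naive truncation baseline can be sketched for the simplest Markovian case. This is an illustration only, not the chapter's exact algorithm: truncating the horizon introduces a bias that the milestone-event construction is designed to eliminate. All constants below are illustrative.

```python
import numpy as np

# Naive truncation baseline (NOT the chapter's exact algorithm): estimate
# M = max_{k >= 0} S_k for a negative-drift Gaussian random walk with
# i.i.d. N(-mu, sigma^2) increments (the Markovian H = 1/2 case) by
# truncating the horizon at T steps. The chapter's contribution is to
# sample M exactly, with no truncation bias, even for non-Markovian
# covariance structures.
rng = np.random.default_rng(0)
mu, sigma, T, reps = 1.0, 1.0, 1000, 2000

paths = np.cumsum(rng.normal(-mu, sigma, size=(reps, T)), axis=1)
M_hat = np.maximum(paths.max(axis=1), 0.0)  # include S_0 = 0 in the max
```

Because the drift is negative, paths beyond the truncation point are exponentially unlikely to set a new record, but the baseline gives no guarantee of exactness; the exact algorithm replaces the fixed horizon T with a random milestone time.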
Appendix
1.1 Proof of Lemma 1
Proof
Note that \(S_{k}\) conditional on \(\mathcal {S}_{n}\) is still a Gaussian random variable with conditional mean \(\mu _{n}(k)=\mathbb {E}[S_{k}|\mathcal {S}_{n}]=-k\mu +\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\tilde{\mathcal {S}}_{n}\) and conditional variance \(\sigma _{n}(k)^{2}={\text {Var}}[S_{k}|\mathcal {S}_{n}]=\sigma ^{2}k^{2H}-\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\boldsymbol{U}_{nk}\).
The proof of the lemma is divided into three steps. We first establish bounds for the conditional mean \(\mu _{n}(k)\). Let \(\tilde{\mu }_{n}(k)=\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\tilde{\mathcal {S}}_{n}\). As \(\tilde{\mu }_{n}(k)\) is a linear combination of \(\tilde{\mathcal {S}}_{n}\), it follows a Normal distribution with mean 0 and variance \(\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\boldsymbol{U}_{nk}\). By the law of total variance, \(\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\boldsymbol{U}_{nk}<\sigma ^{2}k^{2H}\). In this case, for any fixed \(\delta \in (0,\mu )\),
Then,
By the Borel-Cantelli Lemma, there exists a random number \(L_{0}\ge n\), which is finite almost surely, such that when \(k>L_{0}\), \(\tilde{\mu }_{n}(k)\le \delta k\), which further implies that \(\mu _{n}(k)\le -(\mu -\delta )k\).
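The conditional mean and variance formulas and the law-of-total-variance bound \(\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\boldsymbol{U}_{nk}<\sigma ^{2}k^{2H}\) admit a quick numeric sanity check. The sketch below assumes the centered part of \(S_k\) is a fractional Brownian motion with covariance \(\textrm{Cov}(X_i,X_j)=\tfrac{\sigma ^2}{2}(i^{2H}+j^{2H}-|i-j|^{2H})\), which matches the marginal variance \(\sigma ^2 k^{2H}\); the constants are illustrative, not taken from the chapter.

```python
import numpy as np

# Numeric sanity check of the conditional mean/variance formulas, assuming
# the centered part of S_k is a fractional Brownian motion X with
# Cov(X_i, X_j) = sigma^2/2 * (i^{2H} + j^{2H} - |i - j|^{2H}).
# The constants below are illustrative, not values from the chapter.
mu, sigma, H, n, k = 1.0, 1.0, 0.7, 5, 12

def cov(i, j):
    return 0.5 * sigma**2 * (i**(2*H) + j**(2*H) - abs(i - j)**(2*H))

idx = np.arange(1, n + 1)
Sigma_n = np.array([[cov(i, j) for j in idx] for i in idx])  # Cov of (X_1, ..., X_n)
U_nk = np.array([cov(i, k) for i in idx])                    # (Cov(X_i, X_k))_{i <= n}

w = np.linalg.solve(Sigma_n, U_nk)          # Sigma_n^{-1} U_nk
cond_var = sigma**2 * k**(2*H) - U_nk @ w   # sigma_n(k)^2

# Law-of-total-variance bound used in the proof:
assert 0.0 <= cond_var < sigma**2 * k**(2*H)
```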
We next establish bounds for \(\sum _{n=1}^{\infty }q(n)\). For \(k>L_{0}\), we have \(\mu _{n}(k)\le -(\mu -\delta )k\) and \(\sigma _{n}(k)^2\le \sigma ^{2}k^{2H}\). Thus, for any \(b\ge 0\),
Based on the analysis above, let \(b=\max _{1\le l\le n} S_{l}\). We decompose \(\sum _{n=1}^{\infty }q(n)\) into three parts:
Part (I) only involves a finite number of terms. For part (II), from (12), we have
Similarly, for part (III), from (12), we have
Putting parts (I)–(III) together, we have \(\sum _{n=1}^{\infty }q(n)<\infty \). By the Borel-Cantelli Lemma, there exists L, which is finite almost surely, such that for any \(n>L\), \(q(n)<a\).
Lastly, we show that \(\mathbb {E}[L^{\eta }]<\infty \) for any \(\eta >0\). Let \(L_1\) denote a large enough constant, such that \(\sum _{k=L_1}^{\infty }\bar{\Phi }\left( \frac{\mu -\delta }{\sigma }k^{1-H}\right) <a\). Then, \(L\le \max \{L_{0},L_1\}\). Thus, to prove \(\mathbb {E}[L^{\eta }]<\infty \), we only need to show that \(\mathbb {E}[L_{0}^{\eta }]<\infty \). Define \(\mathcal {A}_{n}=\bigcup _{k=n}^{\infty }\{\tilde{\mu }_{n}(k)>\delta k\}\). Then \(L_{0}^{\eta }\le \sum _{n=1}^{\infty }1\{\mathcal {A}_{n}\}n^{\eta }\), and
where the last inequality follows from (11). \(\square \)
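The summability that drives the Borel-Cantelli argument can also be checked numerically: the Gaussian tail \(\bar{\Phi }(x)\) decays like \(e^{-x^{2}/2}\), so \(\sum _{k}\bar{\Phi }\left( \frac{\mu -\delta }{\sigma }k^{1-H}\right) \) is finite whenever \(H<1\). A minimal sketch with illustrative constants (not values from the chapter):

```python
import math

# Illustrative constants; any mu > delta > 0, sigma > 0, H in (0, 1) work.
mu, delta, sigma, H = 1.0, 0.1, 1.0, 0.7
c = (mu - delta) / sigma

def Phi_bar(x):
    """Standard normal tail probability P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Partial sums of sum_k Phi_bar(c * k^{1-H}) at increasing horizons.
partials = {}
s = 0.0
for k in range(1, 10_001):
    s += Phi_bar(c * k ** (1 - H))
    if k in (100, 1_000, 10_000):
        partials[k] = s

# The partial sums plateau, evidencing convergence of the infinite sum.
assert partials[10_000] - partials[1_000] < 1e-6
```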
1.2 Proof of Lemma 2
Proof
With a slight abuse of notation, we denote by \(\mathbb {Q}_n\) the measure induced by the TBS procedure. First note that
Thus, \(\frac{\textrm{d}\mathbb {P}_{n}}{\textrm{d}\mathbb {Q}_n}(S_{n+1},...,S_{\kappa _n}, \kappa _n<\infty )=\frac{\sum _{\ell =n+1}^{\infty }\mathbb {P}_{n}(S_{\ell }>b)}{\sum _{m=\kappa _n}^{\infty }\mathbb {P}_{\kappa _n}(S_{m}>b)}\). \(\square \)
1.3 Proof of Lemma 3
Proof
Let \(\mathbb {E}_{\mathbb {Q}}\) denote the expectation under measure \(\mathbb {Q}\). Suppose \(M_n=b\). First note that by Lemma 2,
Next, by Bayes rule,
As \(\mathbb {Q}_n(I=1|\kappa _n,(S_{n+1},...,S_{\kappa _n}))=\frac{1}{\sum _{\ell =\tau (b)}^{\infty }\mathbb {P}_{\kappa _n}(S_{\ell }>b)}\), plugging (13) in (14), we have
\(\square \)
1.4 Proof of Lemma 4
Proof
Given \(\mathcal {S}_n\), suppose \(M_n=b\). We also define
Note that for any \(k>n\), \(S_{k}\) conditional on \(\mathcal {S}_{n}\) is still a Gaussian random variable with conditional mean \(\mu _{n}(k)=\mathbb {E}[S_{k}|\mathcal {S}_{n}]=-k\mu +\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\tilde{\mathcal {S}}_{n}\), and conditional variance \(\sigma _{n}(k)^{2}={\text {Var}}[S_{k}|\mathcal {S}_{n}]=\sigma ^{2}k^{2H}-\boldsymbol{U}_{nk}^{\top }\boldsymbol{\Sigma }_{n}^{-1}\boldsymbol{U}_{nk}\).
We first establish the sequence of bounds. The lower bound is straightforward. For the upper bound, note that for \(k\ge N_{1}\),
Next, note that for \(k\ge \max \{N_{1}, N_{2}\}\),
To see the second inequality, note that when \(k\ge N_{1}\), \(\mu _{n}(k)\le -k\mu /2\) and \(\sigma _{n}(k)\le \sigma k^{H}\). Thus, \(\frac{b-\mu _{n}(k)}{\sigma _{n}(k)} \ge \frac{b+k\mu }{2\sigma k^{H}}\ge \frac{\mu }{2\sigma }k^{1-H}\). And for \(k\ge N_{2}\), \(\frac{1}{\sqrt{2\pi }}\left( \frac{\mu }{2\sigma }k^{1-H}\right) ^{-1} \le 1\).
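The step above rests on the standard Gaussian tail (Mill's-ratio) bound \(\bar{\Phi }(x)\le \phi (x)/x\) for \(x>0\): once the prefactor \(\frac{1}{\sqrt{2\pi }}x^{-1}\) drops below 1, this yields \(\bar{\Phi }(x)\le e^{-x^{2}/2}\). Both inequalities are easy to verify numerically:

```python
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Phi_bar(x):
    """Standard normal tail probability P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

for x in (0.5, 1.0, 2.0, 5.0):
    assert Phi_bar(x) <= phi(x) / x              # Mill's-ratio bound
    assert Phi_bar(x) <= math.exp(-x * x / 2.0)  # prefactor <= 1 at these x
```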
Lastly, we have for \(\ell \ge \max \{N_{1},N_{2},N_{3}\}\),
For \(\mathbb {E}[B(n)^{\eta }]\), we first note that \(N_{2}\) and \(N_{3}\) are finite constants. Thus, we only need to show that \(\mathbb {E}[N_{1}^{\eta }]<\infty \). For any fixed n,
\(\square \)
1.5 Proof of Lemma 5
Proof
Given \(\kappa _n\) and \(\mathcal {S}_{\kappa _n}\), suppose \(M_n=b\). First note that
Next, following the proof of Lemma 4, we have for \(\ell \ge B(\kappa _n)\),
Since \(\mathbb {P}_{k}(S_{k}>b)=1\), \(p(k)=(1+\sum _{i=k+1}^{\infty } \mathbb {P}_{k}(S_{i}>b))^{-1}\), and for \(\ell \ge B(\kappa _n)\), \((1+\tilde{q}(n,\ell )+h(\ell ))^{-1} \le p(\kappa _n)\le (1+\tilde{q}(n,\ell ))^{-1}\). The rest of the results follow similarly. \(\square \)
1.6 Proof of Lemma 6
Proof
We first note that in Step 2.1 in Algorithm 2, \(\mathbb {P}_{n}(N(n)=\ell , J=1)= \mathbb {P}_{n}(S_{\ell }>M_{n})\). Next, following the same lines of analysis as the proof of Lemma 1, we have for any \(\delta >0\), there exists \(L_{0}>0\) such that for \(\ell >L_{0}\), \(\mathbb {P}_{n}(S_{\ell }>M_{n})\le \bar{\Phi }\left( \frac{\mu -\delta }{\sigma } \ell ^{1-H}\right) \), and for any \(\eta >0\), \(\mathbb {E}[L_{0}^{\eta }]<\infty \). Then for any \(\eta >0\),
Thus,
\(\square \)
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Blanchet, J., Chen, L., Dong, J. (2022). Exact Sampling for the Maximum of Infinite Memory Gaussian Processes. In: Botev, Z., Keller, A., Lemieux, C., Tuffin, B. (eds) Advances in Modeling and Simulation. Springer, Cham. https://doi.org/10.1007/978-3-031-10193-9_3
Print ISBN: 978-3-031-10192-2
Online ISBN: 978-3-031-10193-9