Subordination of trees and the Brownian map

Abstract

We discuss subordination of random compact \({\mathbb R}\)-trees. We focus on the case of the Brownian tree, where the subordination function is given by the past maximum process of Brownian motion indexed by the tree. In that particular case, the subordinate tree is identified as a stable Lévy tree with index 3/2. As a more precise alternative formulation, we show that the maximum process of the Brownian snake is a time change of the height process coding the Lévy tree. We then apply our results to properties of the Brownian map. In particular, we recover, in a more precise form, a recent result of Miller and Sheffield identifying the metric net associated with the Brownian map.

References

  1. Abraham, C., Le Gall, J.-F.: Excursion theory for Brownian motion indexed by the Brownian tree. J. Eur. Math. Soc. (to appear). arXiv:1509.06616

  2. Addario-Berry, L., Albenque, M.: The scaling limit of random simple triangulations and random simple quadrangulations. Ann. Probab. (to appear). arXiv:1306.5227

  3. Bertoin, J., Le Gall, J.-F., Le Jan, Y.: Spatial branching processes and subordination. Can. Math. J. 49, 24–54 (1997)

  4. Curien, N., Kortchemski, I.: Random stable looptrees. Electron. J. Probab. 19(108), 1–35 (2014)

  5. Curien, N., Le Gall, J.-F.: The hull process of the Brownian plane. Probab. Theory Relat. Fields 166, 187–231 (2016)

  6. Duquesne, T.: The coding of compact real trees by real-valued functions. Preprint, arXiv:math/0604106

  7. Duquesne, T., Le Gall, J.-F.: Random trees, Lévy processes and spatial branching processes. Astérisque 281, vi+147 (2002)

  8. Duquesne, T., Le Gall, J.-F.: Probabilistic and fractal aspects of Lévy trees. Probab. Theory Relat. Fields 131, 553–603 (2005)

  9. Dynkin, E.B.: Branching particle systems and superprocesses. Ann. Probab. 19, 1157–1195 (1991)

  10. Evans, S.N.: Probability and Real Trees. Lectures from the 35th Saint-Flour Summer School on Probability Theory. Lecture Notes in Mathematics, vol. 1920. Springer, Berlin (2008)

  11. Feller, W.: An Introduction to Probability Theory and Its Applications, vol. II. Wiley, New York (1971)

  12. Grey, D.R.: Asymptotic behaviour of continuous time, continuous state-space branching processes. J. Appl. Probab. 11, 669–677 (1974)

  13. Le Gall, J.-F.: The Brownian snake and solutions of \(\Delta u = u^2\) in a domain. Probab. Theory Relat. Fields 102, 393–432 (1995)

  14. Le Gall, J.-F.: Spatial Branching Processes, Random Snakes and Partial Differential Equations. Lectures in Mathematics, ETH Zürich. Birkhäuser, Basel (1999)

  15. Le Gall, J.-F.: Random trees and applications. Probab. Surv. 2, 245–311 (2005)

  16. Le Gall, J.-F.: Geodesics in large planar maps and in the Brownian map. Acta Math. 205, 287–360 (2010)

  17. Le Gall, J.-F.: Uniqueness and universality of the Brownian map. Ann. Probab. 41, 2880–2960 (2013)

  18. Le Gall, J.-F.: Brownian disks and the Brownian snake. Preprint, arXiv:1704.08987

  19. Le Gall, J.-F., Paulin, F.: Scaling limits of bipartite planar maps are homeomorphic to the \(2\)-sphere. Geom. Funct. Anal. 18, 893–918 (2008)

  20. Miermont, G.: The Brownian map is the scaling limit of uniform random plane quadrangulations. Acta Math. 210, 319–401 (2013)

  21. Miller, J., Sheffield, S.: An axiomatic characterization of the Brownian map. Preprint, arXiv:1506.03806

  22. Miller, J., Sheffield, S.: Liouville quantum gravity and the Brownian map I: the QLE(8/3,0) metric. Preprint, arXiv:1507.00719

  23. Miller, J., Sheffield, S.: Liouville quantum gravity and the Brownian map II: geodesics and continuity of the embedding. Preprint, arXiv:1605.03563

  24. Miller, J., Sheffield, S.: Liouville quantum gravity and the Brownian map III: the conformal structure is determined. Preprint, arXiv:1608.05391

  25. Pardo, J.C., Rivero, V.: Self-similar Markov processes. Bol. Soc. Mat. Mexicana 19, 201–235 (2013)

  26. Sato, K.-I.: Lévy Processes and Infinitely Divisible Distributions. Cambridge University Press, Cambridge (1999)

  27. Weill, M.: Regenerative real trees. Ann. Probab. 35, 2091–2121 (2007)

Acknowledgements

I thank the referee for a careful reading of the manuscript and for several useful suggestions.

Author information

Correspondence to Jean-François Le Gall.

Appendix: On the special Markov property

In this appendix, we derive a more precise and more general form of the special Markov property for the Brownian snake, which was first stated in [13]. This result is closely related to the special Markov property for superprocesses as stated by Dynkin [9, Theorem 1.6], but the formulation in terms of the Brownian snake, although less general, gives additional information that is crucial for our purposes.

We consider the setting of [14, Chapter V]. We let \(\xi \) be a Markov process with continuous sample paths and values in a Polish metric space \((E,d)\). For every \(x\in E\), the process \(\xi \) starts from x under the probability measure \(P_x\). Analogously to (19), we assume that the following strong continuity assumption holds for every \(x\in E\) and \(t\ge 0\),

$$\begin{aligned} E_x\left[ \left( \sup _{r\le t} d(x,\xi _r)\right) ^p\right] \le C\,t^{2+\varepsilon }, \end{aligned}$$
(22)

where \(C>0\), \(p>0\) and \(\varepsilon >0\) are constants. According to [14, Section IV.4], this continuity assumption allows us to construct the Brownian snake \((W_s)_{s\ge 0}\) with continuous sample paths with values in the space \(\mathcal {W}_E\) of all finite continuous paths in E (the set \(\mathcal {W}_E\) is defined by the obvious generalization of the beginning of Sect. 4, and we keep the notation \(\widehat{\mathrm {w}}\) for the tip of a path \(\mathrm {w}\in \mathcal {W}_E\)). The strong Markov property holds for \((W_s)_{s\ge 0}\), even without assuming that it holds for the underlying spatial motion \(\xi \). We again write \({\mathbb P}_x\) for the probability measure under which the Brownian snake starts from (the trivial path equal to) x, and \({\mathbb N}_x\) for the excursion measure of the Brownian snake away from x. It will also be useful to introduce conditional distributions of the Brownian snake given its lifetime process. If \(g:[0,\infty )\rightarrow [0,\infty )\) is a continuous function such that \(g(0)=0\) and g is locally Hölder with exponent \(\frac{1}{2}-\delta \) for every \(\delta >0\), we write \({\mathbb Q}^{(g)}_x\) for the conditional distribution under \({\mathbb P}_x\) of \((W_s)_{s\ge 0}\) knowing that \(\zeta _s=g(s)\) for every \(s\ge 0\). See [14, Chapter IV], and note that these conditional distributions are easily defined using the analog in our general setting of property (b) stated at the beginning of Sect. 4.
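
As a quick illustration of (22) (a standard verification, not needed in the sequel), suppose that \(\xi \) is standard Brownian motion in \({\mathbb R}^d\) equipped with the Euclidean distance. By the scaling property of Brownian motion, for every \(p>4\),

$$\begin{aligned} E_x\left[ \left( \sup _{r\le t} |\xi _r-x|\right) ^p\right] = t^{p/2}\, E_0\left[ \left( \sup _{r\le 1} |\xi _r|\right) ^p\right] \le C_p\, t^{2+(p-4)/2}, \end{aligned}$$

where \(C_p<\infty \) by Gaussian tail estimates, so that (22) holds with \(\varepsilon =(p-4)/2\).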

We now fix a connected open subset D of E and \(x\in D\). We use the notation \(\tau =\inf \{t\ge 0:\xi _t\notin D\}\), and, for every \(\mathrm {w}\in \mathcal {W}_E\), \(\tau (\mathrm {w})=\inf \{t\ge 0:\mathrm {w}(t)\notin D\}\), where in both cases \(\inf \varnothing =\infty \). We assume that

$$\begin{aligned} P_x(\tau <\infty )>0, \end{aligned}$$

and note that this implies that

$$\begin{aligned} \int _0^\infty \mathbf {1}_{\{\tau (W_s)<\zeta _s\}}\,\mathrm {d}s=\infty ,\quad {\mathbb P}_x\hbox { a.s.} \end{aligned}$$

We set, for every \(s\ge 0\),

$$\begin{aligned} \eta _s:=\inf \left\{ t\ge 0: \int _0^t \mathbf {1}_{\{\zeta _r\le \tau (W_r)\}}\,\mathrm {d}r >s\right\} ,\quad W^D_s:=W_{\eta _s}. \end{aligned}$$

This definition makes sense \({\mathbb P}_x\) a.s. We let \(\mathcal {F}^D\) be the \(\sigma \)-field generated by the process \((W^D_{s})_{s\ge 0}\) and the collection of all \({\mathbb P}_x\)-negligible sets. Informally, \(\mathcal {F}^D\) represents the information provided by the paths \(W_s\) before they exit D.
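
For intuition only, here is a minimal numerical sketch of the time change \(\eta _s\) on a discretization grid; the inputs indicator (a grid sampling of \(\mathbf {1}_{\{\zeta _r\le \tau (W_r)\}}\)), the mesh dt and the values s_values are hypothetical and not part of the construction above.

```python
import numpy as np

def eta(indicator, dt, s_values):
    """Grid approximation of eta_s = inf{t >= 0 : int_0^t 1_{zeta_r <= tau(W_r)} dr > s}."""
    clock = np.cumsum(indicator) * dt                 # clock[k] ~ int_0^{(k+1)dt} 1_{...} dr
    idx = np.searchsorted(clock, s_values, side="right")
    return idx * dt                                   # eta_s on the grid; then W^D_s := W_{eta_s}
```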

Lemma 18

For every \(s\ge 0\), set \(\gamma _s=(\zeta _s-\tau (W_s))^+\), and

$$\begin{aligned} \sigma _s=\inf \left\{ t\ge 0: \int _0^t \mathbf {1}_{\{\gamma _r>0\}}\,\mathrm {d}r \ge s\right\} . \end{aligned}$$

Under the probability measure \({\mathbb P}_x\), we have \(\sigma _s<\infty \) for every \(s\ge 0\), a.s., and the process \(\Gamma _s:=\gamma _{\sigma _s}\) is distributed as a reflected Brownian motion in \({\mathbb R}_+\) and is independent of the \(\sigma \)-field \(\mathcal {F}^D\).

This is essentially Lemma V.2 in [14], except that the independence property is not stated in that lemma. However a close look at the proof in [14] shows that the process \(\Gamma _s\) is obtained as the limit of approximating processes (denoted by \(\gamma _{\sigma ^\varepsilon _s}\) in [14]) which are independent of \(\mathcal {F}^D\) thanks to the strong Markov property of the Brownian snake.
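
On a discretization grid, the time change of Lemma 18 simply erases the time that \(\gamma \) spends at 0 and concatenates its excursions above 0. The following sketch (where gamma is a hypothetical grid sampling of \(\gamma _r=(\zeta _r-\tau (W_r))^+\)) only illustrates this mechanism; Lemma 18 asserts that, in the limit, the resulting process is a reflected Brownian motion independent of \(\mathcal {F}^D\).

```python
import numpy as np

def Gamma(gamma):
    """Grid version of Gamma_s = gamma_{sigma_s}: erase the time spent by gamma at 0."""
    gamma = np.asarray(gamma)
    return gamma[gamma > 0]      # the excursions of gamma above 0, concatenated in order
```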

We write \(\ell ^D(s)\) for the local time at 0, at time s, of the process \(\Gamma \), and define a process with continuous nondecreasing sample paths by setting

$$\begin{aligned} L^D_s=\ell ^D\left( \int _0^s \mathbf {1}_{\{\gamma _r>0\}}\,\mathrm {d}r\right) . \end{aligned}$$

Then (see [14, Section V.1]),

$$\begin{aligned} L^D_s=\lim _{\varepsilon \rightarrow 0} \frac{1}{\varepsilon }\int _0^s \mathbf {1}_{\{\tau (W_r)<\zeta _r<\tau (W_r)+\varepsilon \}}\,\mathrm {d}r, \end{aligned}$$
(23)

for every \(s\ge 0\), \({\mathbb P}_x\) a.s. The process \((L^D_s)_{s\ge 0}\) is called the exit local time process from D. Notice that the measure \(\mathrm {d}L^D_s\) is supported on \(\{s\ge 0: \tau (W_s)=\zeta _s\}\).
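
The approximation (23) is straightforward to mimic numerically. In the sketch below, zeta and tau_W are hypothetical grid samplings of \(\zeta _r\) and \(\tau (W_r)\) (with the convention that tau_W equals np.inf when the path does not exit D), and eps should be thought of as small but large compared to the mesh dt.

```python
import numpy as np

def exit_local_time(zeta, tau_W, dt, eps):
    """Grid version of (23): L^D_s ~ (1/eps) * int_0^s 1_{tau(W_r) < zeta_r < tau(W_r) + eps} dr."""
    in_layer = (tau_W < zeta) & (zeta < tau_W + eps)
    return np.cumsum(in_layer) * dt / eps             # the whole path s -> L^D_s on the grid
```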

We also set, for every \(s\ge 0\),

$$\begin{aligned} \widetilde{L}^D_s:= L^D_{\eta _s}. \end{aligned}$$

Lemma 19

The random process \((\widetilde{L}^D_s)_{s\ge 0}\) is measurable with respect to the \(\sigma \)-field \(\mathcal {F}^D\).

This follows from the proof of [13, Proposition 2.3], which deals with the special case where \(\xi \) is d-dimensional Brownian motion. The argument, however, is easily adapted to our more general setting, and we omit the details.

Before stating the special Markov property, we need some additional notation. For every \(r\ge 0\), we set

$$\begin{aligned} \theta _r=\inf \left\{ s\ge 0: \widetilde{L}^D_s>r \right\} . \end{aligned}$$

We note that the process \((\theta _r)_{r\ge 0}\) is \(\mathcal {F}^D\)-measurable by Lemma 19.

We now define the excursions of the Brownian snake outside D. We observe that, \({\mathbb P}_x\) a.e., the set

$$\begin{aligned} \{s\ge 0: \tau (W_s)<\zeta _s\}=\{s\ge 0: \gamma _s>0\} \end{aligned}$$

is a countable union of disjoint open intervals, which we enumerate as \((a_i,b_i)\), \(i\in {\mathbb N}\). Here we can fix the enumeration by saying that we enumerate first the excursion intervals with length at least \(2^{-1}\) whose initial time is smaller than 2, then the excursion intervals with length at least \(2^{-2}\) whose initial time is smaller than \(2^2\) that have not yet been listed, and so on. If we choose this enumeration procedure, the variables \(a_i,b_i\) are measurable with respect to the \(\sigma \)-field generated by \((\gamma _s)_{s\ge 0}\), and the variables \(\int _0^{a_i} \mathbf {1}_{\{\gamma _r>0\}}\,\mathrm {d}r\) and \(L^D_{a_i}\) are measurable with respect to the \(\sigma \)-field generated by \((\Gamma _s)_{s\ge 0}\).
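
The enumeration just described is deterministic given the collection of intervals, as the following sketch illustrates; it assumes, as a simplification, that a finite list of intervals is given.

```python
def enumerate_intervals(intervals):
    """List intervals (a, b) as above: at stage n = 1, 2, ..., append (by increasing
    left endpoint) those with b - a >= 2**(-n) and a < 2**n not listed earlier."""
    remaining = sorted(intervals)
    listed, n = [], 1
    while remaining:
        stage = [(a, b) for (a, b) in remaining
                 if b - a >= 2.0 ** (-n) and a < 2.0 ** n]
        listed.extend(stage)
        remaining = [iv for iv in remaining if iv not in stage]
        n += 1
    return listed
```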

From the properties of the Brownian snake, one has, \({\mathbb P}_x\) a.e. for every \(i\in {\mathbb N}\) and every \(s\in [a_i,b_i]\),

$$\begin{aligned} \tau (W_s)=\tau (W_{a_i})=\zeta _{a_i}, \end{aligned}$$

and more precisely all paths \(W_s\), \(s\in [a_i,b_i]\) coincide up to their exit time from D. For every \(i\in {\mathbb N}\), we then define an element \(W^{(i)}\) of the space of all continuous functions from \({\mathbb R}_+\) into \(\mathcal {W}_E\) by setting, for every \(s\ge 0\),

$$\begin{aligned} W^{(i)}_s(t) := W_{(a_i+s)\wedge b_i}(\zeta _{a_i}+t),\quad \hbox {for } 0\le t\le \zeta ^{(i)}_s:=\zeta _{(a_i+s)\wedge b_i}-\zeta _{a_i}. \end{aligned}$$

By definition, the random variables \(W^{(i)}\), \(i\in {\mathbb N}\), are the excursions of the Brownian snake outside D (the word “outside” is a bit misleading here). Notice that, for every \(i\in {\mathbb N}\), \((\zeta ^{(i)}_s)_{s\ge 0}\) is a measurable function of \((\Gamma _s)_{s\ge 0}\). Indeed the processes \((\zeta ^{(i)}_s)_{s\ge 0}\), \(i\in {\mathbb N}\) are just the excursions of \(\Gamma \) away from 0 enumerated as explained above.

Theorem 20

Under \({\mathbb P}_x\), conditionally on the \(\sigma \)-field \(\mathcal {F}^D\), the point measure

$$\begin{aligned} \sum _{i\in {\mathbb N}} \delta _{(L^D_{a_i}, W^{(i)})}(\mathrm {d}\ell ,\mathrm {d}\omega ) \end{aligned}$$

is Poisson with intensity

$$\begin{aligned} \mathbf {1}_{[0,\infty )}(\ell )\,\mathrm {d}\ell \,{\mathbb N}_{\widehat{W}^D_{\theta _\ell }}(\mathrm {d}\omega ). \end{aligned}$$
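
Equivalently (this is just the conditional Laplace functional of the point measure, spelled out here for convenience), for every nonnegative measurable function F on \({\mathbb R}_+\times C({\mathbb R}_+,\mathcal {W}_E)\),

$$\begin{aligned} {\mathbb E}_x\left[ \exp \left( -\sum _{i\in {\mathbb N}} F(L^D_{a_i},W^{(i)})\right) \,\Big |\, \mathcal {F}^D\right] = \exp \left( -\int _0^\infty \mathrm {d}\ell \,{\mathbb N}_{\widehat{W}^D_{\theta _\ell }}\left( 1-e^{-F(\ell ,\cdot )}\right) \right) . \end{aligned}$$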

Proof

It is convenient to introduce the auxiliary Markov process defined by \(\xi ^*_t=\xi _{t\wedge \tau }\). We observe that the Brownian snake associated with \(\xi ^*\) can be obtained by the formula

$$\begin{aligned} W^*_s(t)= W_s(t\wedge \tau (W_s)),\qquad 0\le t\le \zeta ^*_s=\zeta _s. \end{aligned}$$

Notice that \(\gamma _s=(\zeta _s-\tau (W_s))^+ = (\zeta ^*_s-\tau (W^*_s))^+\) is a measurable function of \(W^*\), and recall that the intervals \((a_i,b_i)\) are just the connected components of the complement of the zero set of \(\gamma \). Consider then, independently for every \(i\in {\mathbb N}\), a process \((\bar{W}^{(i)}_s)_{s\ge 0}\) which conditionally on \(W^*\) is distributed according to the probability measure

$$\begin{aligned} {\mathbb Q}_{\widehat{W}^*_{a_i}}^{(\zeta ^{(i)})}, \end{aligned}$$

where we recall our notation \({\mathbb Q}^{(g)}_x\) for the conditional distribution under \({\mathbb P}_x\) of \((W_s)_{s\ge 0}\) knowing that \(\zeta _s=g(s)\) for every \(s\ge 0\). Define \(\bar{W}_s\) for every \(s\ge 0\) by setting \(\bar{W}_s=W^*_s\) if \(\gamma _s=0\) and, for every \(i\in {\mathbb N}\), for every \(s\in (a_i,b_i)\),

$$\begin{aligned} \bar{W}_s(t)=\left\{ \begin{array}{l@{\quad }l} W^*_s(t)&{}\hbox {if }0\le t\le \tau (W^*_s),\\ \bar{W}^{(i)}_{s-a_i}(t-\tau (W^*_s))\quad &{}\hbox {if } \tau (W^*_s)\le t\le \zeta _s. \end{array} \right. \end{aligned}$$

A tedious but straightforward verification shows that the finite-dimensional marginal distributions of the process \((\bar{W}_s)_{s\ge 0}\) are the same as those of the process \((W_s)_{s\ge 0}\). It follows that, conditionally on \((W^*_s)_{s\ge 0}\), the “excursions” \(W^{(i)}\) are independent and the conditional distribution of \(W^{(i)}\) is \({\mathbb Q}_{\widehat{W}^*_{a_i}}^{(\zeta ^{(i)})}\).

At this point, we claim that we have a.s. for every \(i\in {\mathbb N}\),

$$\begin{aligned} W^*_{a_i}=W_{a_i}= W^D_{\theta _{L^D_{a_i}}}. \end{aligned}$$
(24)

The first equality in (24) is immediate. To get the second one, set

$$\begin{aligned} A_{a_i}:=\int _0^{a_i}\mathbf {1}_{\{\zeta _r\le \tau (W_r)\}}\mathrm {d}r \end{aligned}$$

to simplify notation. We first note that

$$\begin{aligned} W_{a_i}= W_{b_i}= W^D_{A_{a_i}}, \end{aligned}$$

because \(\eta _{A_{a_i}}=b_i\) (the strong Markov property of the Brownian snake ensures that in each interval \([b_i,b_i+\varepsilon ]\), \(\varepsilon >0\), we can find a set of positive Lebesgue measure of values of s such that \(\tau (W_s)=\infty \)) and we know that \(W_{a_i}=W_{b_i}\). Thus our claim will follow if we can verify that

$$\begin{aligned} A_{a_i}=\theta _{L^D_{a_i}}. \end{aligned}$$

On one hand the condition \(s<A_{a_i}\) implies \(\eta _s<a_i\) and \(\widetilde{L}^D_s=L^D_{\eta _s}\le L^D_{a_i}\). It follows that \(A_{a_i}\le \theta _{L^D_{a_i}}\). On the other hand, suppose that \(\theta _{L^D_{a_i}}> A_{a_i}\). We first note that

$$\begin{aligned} \widetilde{L}^D_{A_{a_i}}= L^D_{\eta _{A_{a_i}}}= L^D_{b_i}=L^D_{a_i}, \end{aligned}$$

where the last equality holds by the support property of \(\mathrm {d}L^D_s\). Furthermore, the left limit of \(r\mapsto \theta _r\) at \(\widetilde{L}^D_{A_{a_i}}\) is smaller than or equal to \(A_{a_i}\) by construction. So the condition \(\theta _{L^D_{a_i}}> A_{a_i}\) means that \(L^D_{a_i}=\widetilde{L}^D_{A_{a_i}}\) is a discontinuity point of \(r\mapsto \theta _r\). However, we noticed that the random variables \(L^D_{a_i}\) are measurable functions of \((\Gamma _s)_{s\ge 0}\) and therefore independent of \(\mathcal {F}^D\). Since \((\theta _r)_{r\ge 0}\) is measurable with respect to \(\mathcal {F}^D\), and since \(r\mapsto \theta _r\) has only countably many discontinuity times, the (easy) fact that the law of \(L^D_{a_i}\) has no atoms implies that, with \({\mathbb P}_x\)-probability one, \(L^D_{a_i}\) cannot be a discontinuity time of \(r\mapsto \theta _r\). This contradiction completes the proof of our claim.

Next, let U be a bounded \(\mathcal {F}^D\)-measurable real random variable, and let g and G be nonnegative measurable functions defined respectively on \({\mathbb R}_+\) and on the space of continuous functions from \({\mathbb R}_+\) into \(\mathcal {W}_E\). By conditioning first with respect to \(W^*\), we get

$$\begin{aligned} {\mathbb E}_x\left[ U\times \exp \left( -\sum _{i\in {\mathbb N}} g(L^D_{a_i}) G(W^{(i)})\right) \right] = {\mathbb E}_x\left[ U\times \prod _{i\in {\mathbb N}} {\mathbb Q}_{\widehat{W}_{a_i}}^{(\zeta ^{(i)})}\left( e^{-g(L^D_{a_i}) G(\cdot )} \right) \right] , \end{aligned}$$
(25)

noting that an \(\mathcal {F}^D\)-measurable real variable coincides \({\mathbb P}_x\) a.s. with a function of \(W^*\). Using (24), we see that, for every \(i\in {\mathbb N}\),

$$\begin{aligned} {\mathbb Q}_{\widehat{W}_{a_i}}^{(\zeta ^{(i)})}\Big (e^{-g(L^D_{a_i}) G(\cdot )} \Big )= {\mathbb Q}_{\widehat{W}^D_{\theta _{L^D_{a_i}}}}^{(\zeta ^{(i)})}\Big (e^{-g(L^D_{a_i}) G(\cdot )} \Big ) \end{aligned}$$

is a measurable function (that does not depend on i) of the pair \((L^D_{a_i}, (\zeta ^{(i)}_s)_{s\ge 0})\) and of the process \((W^D_{\theta _r})_{r\ge 0}\), which is \(\mathcal {F}^D\)-measurable. The point measure

$$\begin{aligned} \sum _{i\in {\mathbb N}} \delta _{(L^D_{a_i},(\zeta ^{(i)}_s)_{s\ge 0})}, \end{aligned}$$

which is just the point measure of excursions of the reflected Brownian motion \(\Gamma \), is Poisson with intensity \(\mathrm {d}\ell \,\mathbf{n}(\mathrm {d}e)\), where \(\mathbf{n}(\mathrm {d}e)\) is as previously the Itô excursion measure. Since the latter point measure is independent of \(\mathcal {F}^D\) (by Lemma 18), we can now condition with respect to \(\mathcal {F}^D\), applying the exponential formula for Poisson measures, to get that the quantities in (25) are equal to

$$\begin{aligned} {\mathbb E}_x\left[ U\times \exp \left( -\int _0^\infty \mathrm {d}\ell \, {\mathbb N}_{\widehat{W}^D_{\theta _\ell }}\left( 1- e^{-g(\ell )G(\cdot )}\right) \right) \right] . \end{aligned}$$

The statement of the theorem follows. \(\square \)
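
The last step of the proof uses the exponential formula for Poisson point measures, which we recall for convenience: if \(\mathcal {N}\) is a Poisson point measure with \(\sigma \)-finite intensity \(\mu \), then, for every nonnegative measurable function f,

$$\begin{aligned} E\left[ \exp \left( -\int f\,\mathrm {d}\mathcal {N}\right) \right] = \exp \left( -\int (1-e^{-f})\,\mathrm {d}\mu \right) . \end{aligned}$$

Here it is applied, conditionally on \(\mathcal {F}^D\), to the point measure of excursions of \(\Gamma \), with \(f(\ell ,e)\) depending on the \(\mathcal {F}^D\)-measurable process \((\widehat{W}^D_{\theta _r})_{r\ge 0}\).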

In the preceding sections, we use a version of Theorem 20 under the excursion measure \({\mathbb N}_x\), which we will now state as a corollary. We set

$$\begin{aligned} T_D=\inf \{s\ge 0: \tau (W_s)<\infty \}, \end{aligned}$$

and observe that

$$\begin{aligned} 0<{\mathbb N}_x(T_D<\infty )<\infty \end{aligned}$$

(if the quantity \({\mathbb N}_x(T_D<\infty )\) were infinite, excursion theory would give a contradiction with the fact that the Brownian snake has continuous paths under \({\mathbb P}_x\)). Then we note that the conditional probability measure \({\mathbb N}_x^D:={\mathbb N}_x(\cdot \mid T_D<\infty )\) can be interpreted as the law under \({\mathbb P}_x\) of the first Brownian excursion away from x that exits D. Thanks to this observation, we can make sense of the exit local time process \((L^D_s)_{s\ge 0}\) under \({\mathbb N}^D_x\) by formula (23). The definition of \((\eta _s)_{s\ge 0}\) and \((W^D_s)_{s\ge 0}\) remains the same, and we can set \(\widetilde{L}^D_s=L^D_{\eta _s}\) as previously. The difference, of course, is that \(\widetilde{L}^D_\infty =L^D_\infty = L^D_\sigma \) is now finite \({\mathbb N}^D_x\) a.s. So we may define \(\theta _s=\inf \{r>0: \widetilde{L}^D_r>s\}\) only for \(s<L^D_\sigma \).

After these observations, we may state the version of the special Markov property under \({\mathbb N}^D_x\). We slightly abuse notation by still writing \((W^{(i)})_{i\in {\mathbb N}}\) and \((a_i,b_i)_{i\in {\mathbb N}}\) for the excursions outside D and the associated intervals, which are defined in exactly the same way as previously. The \(\sigma \)-field \(\mathcal {F}^D\) is again the \(\sigma \)-field generated by \((W^D_s)_{s\ge 0}\) but should be completed here with the \({\mathbb N}_x\)-negligible sets.

Corollary 21

Under \({\mathbb N}^D_x\), conditionally on the \(\sigma \)-field \(\mathcal {F}^D\), the point measure

$$\begin{aligned} \sum _{i\in {\mathbb N}} \delta _{(L^D_{a_i}, W^{(i)})}(\mathrm {d}\ell ,\mathrm {d}\omega ) \end{aligned}$$

is Poisson with intensity

$$\begin{aligned} \mathbf {1}_{[0,L^D_\sigma ]}(\ell )\,\mathrm {d}\ell \,{\mathbb N}_{\widehat{W}^D_{\theta _\ell }}(\mathrm {d}\omega ). \end{aligned}$$

Note that \(L^D_\sigma =\widetilde{L}^D_\infty \) is \(\mathcal {F}^D\)-measurable. The corollary is a straightforward consequence of Theorem 20, interpreting \({\mathbb N}^D_x\) as the law under \({\mathbb P}_x\) of the first Brownian excursion away from x that exits D. We omit the details.

If we are only interested in the point measure \(\sum _{i\in {\mathbb N}}\delta _{W^{(i)}}\), we can state the preceding corollary in a slightly different form, by introducing the notion of the exit measure: the exit measure from D is the random measure \(\mathcal {Z}^D\) on \(\partial D\) defined (under \({\mathbb N}_x\) or under \({\mathbb N}^D_x\)) by the formula

$$\begin{aligned} \langle \mathcal {Z}^D,g\rangle =\int _0^\infty \mathrm {d}L^D_s\, g(\widehat{W}_s). \end{aligned}$$

See [14, Section V.1]. Note that the total mass of \(\mathcal {Z}^D\) is \(L^D_\sigma \). A change of variables shows that we have as well

$$\begin{aligned} \langle \mathcal {Z}^D,g\rangle = \int _0^\infty \mathrm {d}\widetilde{L}^D_s\,g(\widehat{W}^D_s)= \int _0^{L^D_\sigma } \mathrm {d}\ell \,g(\widehat{W}^D_{\theta _\ell }). \end{aligned}$$
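
In slightly more detail (a routine verification), the measure \(\mathrm {d}L^D_s\) gives no mass to the set \(\{s:\tau (W_s)<\zeta _s\}\), which is exactly the set of times erased by the time change \(\eta \); hence the substitution \(s=\eta _u\) gives

$$\begin{aligned} \int _0^\infty \mathrm {d}L^D_s\,g(\widehat{W}_s) = \int _0^\infty \mathrm {d}\widetilde{L}^D_u\,g(\widehat{W}^D_u), \end{aligned}$$

and the second equality of the preceding display then follows from the further substitution \(u=\theta _\ell \), \(\ell =\widetilde{L}^D_u\).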

In particular, \(\mathcal {Z}^D\) is \(\mathcal {F}^D\)-measurable by Lemma 19. The following result is now an immediate consequence of Corollary 21.

Corollary 22

Under \({\mathbb N}^D_x\), conditionally on the \(\sigma \)-field \(\mathcal {F}^D\), the point measure

$$\begin{aligned} \sum _{i\in {\mathbb N}} \delta _{ W^{(i)}}(\mathrm {d}\omega ) \end{aligned}$$

is Poisson with intensity

$$\begin{aligned} \int \mathcal {Z}^D(\mathrm {d}y)\,{\mathbb N}_y(\mathrm {d}\omega ). \end{aligned}$$

This is the form of the special Markov property that appears in [13].

About this article

Cite this article

Le Gall, JF. Subordination of trees and the Brownian map. Probab. Theory Relat. Fields 171, 819–864 (2018). https://doi.org/10.1007/s00440-017-0794-9
