## 1 Introduction

The cost of a free, probability measure preserving (p.m.p.) action of a group is an orbit-equivalence invariant that was introduced by Levitt [29] and studied extensively by Gaboriau [13, 14, 18]. Gaboriau used the notion of cost to prove several remarkable theorems, including that free groups of different ranks cannot have orbit equivalent free, ergodic, p.m.p. actions. This result is in stark contrast with the amenable case, in which Ornstein and Weiss [36] proved that any two free, ergodic p.m.p. actions are orbit equivalent. These results sparked a surge of interest in the cost of group actions, the fruits of which are summarised in the monographs and surveys [11, 18, 26, 27].

The cost of a group is defined to be the infimal cost of all free, ergodic p.m.p. actions of the group. We will employ here the following probabilistic definition, which is shown to be equivalent to the classical definition in [27, Proposition 29.5]. Let $$\Gamma$$ be a countable group. We define $${\mathcal {S}}(\Gamma )$$ to be the set of connected spanning graphs on $$\Gamma$$, that is, the set of connected, undirected, simple graphs with vertex set $$\Gamma$$. Formally, we define $${\mathcal {S}}(\Gamma )$$ to be the set of $$\omega \in \{0,1\}^{\Gamma \times \Gamma }$$ such that $$\omega (a,b)=\omega (b,a)$$ for every $$a,b \in \Gamma$$ and such that for each $$a,b \in \Gamma$$ there exists $$n\ge 0$$ and a sequence $$a=a_0,a_1,\ldots ,a_n =b$$ in $$\Gamma$$ such that $$\omega (a_{i-1},a_{i})=1$$ for every $$1\le i \le n$$. We equip $$\{0,1\}^{\Gamma \times \Gamma }$$ with the product topology and associated Borel $$\sigma$$-algebra, and equip $${\mathcal {S}}(\Gamma )$$ with the subspace topology and Borel $$\sigma$$-algebra. Note that $${\mathcal {S}}(\Gamma )$$ is not closed in $$\{0,1\}^{\Gamma \times \Gamma }$$ when $$\Gamma$$ is infinite. For each $$\omega \in {\mathcal {S}}(\Gamma )$$ and $$\gamma \in \Gamma$$ we define $$\gamma \omega$$ by setting $$\gamma \omega (u,v) = \omega (\gamma ^{-1} u, \gamma ^{-1} v)$$. We say that a probability measure $$\mu$$ on $${\mathcal {S}}(\Gamma )$$ is $$\Gamma$$-invariant if $$\mu (\mathscr {A})= \mu (\gamma ^{-1} \mathscr {A})$$ for every $$\gamma \in \Gamma$$ and every Borel set $$\mathscr {A}\subseteq {\mathcal {S}}(\Gamma )$$, and write $$M(\Gamma ,{\mathcal {S}}(\Gamma ))$$ for the set of $$\Gamma$$-invariant probability measures on $${\mathcal {S}}(\Gamma )$$. The cost of the group $$\Gamma$$ can be defined to be

\begin{aligned} {\text {cost}}(\Gamma ) = \frac{1}{2} \inf \left\{ \int _{\omega \in {\mathcal {S}}(\Gamma )} \deg _\omega (o) \hbox {d}\mu (\omega ) : \mu \in M(\Gamma ,{\mathcal {S}}(\Gamma )) \right\} , \end{aligned}
(1.1)

where o is the identity element of $$\Gamma$$ and $$\deg _\omega (o)$$ is the degree of o in the graph $$\omega \in {\mathcal {S}}(\Gamma )$$. Note that for nonamenable groups with cost 1, and more generally for any non-treeable group, the infimum in (1.1) is not attained [27, Propositions 30.4 and 30.6].
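To illustrate definition (1.1) with a standard example (not taken from the text): for $$\Gamma =\mathbb {Z}$$ with identity o, the Dirac measure on the nearest-neighbour spanning graph is $$\mathbb {Z}$$-invariant and gives

```latex
% omega has edge set {{n, n+1} : n in Z}, so deg_omega(o) = 2 and
\mathrm{cost}(\mathbb{Z}) \le \tfrac{1}{2} \deg_\omega(o) = 1 .
% Since every countably infinite group has cost at least 1,
% this witnesses cost(Z) = 1.
```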

Every countably infinite amenable group has cost 1 by the Ornstein–Weiss theorem [36] (see [6, Section 5] for a probabilistic proof), while the free group $${\mathbb {F}}_k$$ has cost k [12]. There are, however, also many nonamenable groups with cost 1, including the direct product $$\Gamma _1 \times \Gamma _2$$ of any two countably infinite groups $$\Gamma _1$$ and $$\Gamma _2$$ [27, Theorem 33.3] (which is nonamenable if at least one of $$\Gamma _1$$ or $$\Gamma _2$$ is nonamenable), and $$\mathrm {SL}_d(\mathbb {Z})$$ with $$d\ge 3$$ [13]. See [12] for many further examples.

In general, computing the cost of a group is not an easy task. Nevertheless, one possible approach is suggested by the following question of Gaboriau, which connects the cost to the first $$\ell ^2$$-Betti number $$\beta _1(\Gamma )$$ of the group. This is a measure-equivalence invariant of the group that can be defined to be the von Neumann dimension of the space of harmonic Dirichlet functions of any Cayley graph of the group. Equivalently, $$\beta _1(\Gamma )$$ can be defined in terms of the expected degree of the free uniform spanning forest in any Cayley graph of $$\Gamma$$ by the equality $$\mathbb {E}\deg _{\mathsf {FUSF}}(o)=2+2\beta _1(\Gamma )$$, see [32, Section 10.8]. Gaboriau [14] proved that $${\text {cost}}(\Gamma ) \ge 1 + \beta _1(\Gamma )$$ and asked whether this inequality is ever strict.
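For example (a standard fact, not from the text), Gaboriau's inequality is an equality for finitely generated free groups:

```latex
\beta_1(\mathbb{F}_k) = k - 1
\qquad\text{and}\qquad
\mathrm{cost}(\mathbb{F}_k) = k = 1 + \beta_1(\mathbb{F}_k) .
```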

### Question 1.1

(Gaboriau) Is $${\text {cost}}(\Gamma ) = 1 + \beta _1(\Gamma )$$ for every countably infinite group $$\Gamma$$?

For groups with Kazhdan’s property (T), defined below, it was proven by Bekka and Valette that $$\beta _1(\Gamma )=0$$ [5]. However, in spite of several works connecting property (T), cost, and percolation theory [17, 21, 31, 33], the cost of Kazhdan groups has remained elusive [18, Question 6.4], and has thus become a famous test case for Question 1.1. This paper settles that test case.

### Theorem 1.2

Let $$\Gamma$$ be a countably infinite Kazhdan group. Then $$\Gamma$$ has cost 1.

In fact, our proof gives slightly more. It is a classical result of Kazhdan that every countable Kazhdan group is finitely generated [4, Theorem 1.3.1]. The proof of Theorem 1.2 establishes that for every $$\varepsilon >0$$ and every finite symmetric generating set S of $$\Gamma$$, there is a $$\Gamma$$-invariant measure on connected, spanning subgraphs of the associated Cayley graph with average degree at most $$2+\varepsilon$$.

Our proof will apply the following probabilistic characterization of property (T) due to Glasner and Weiss [19], which the reader may take as the definition of property (T) for the purposes of this paper. Let $$\Gamma$$ be a countable group, and let $$\Gamma \curvearrowright X$$ be an action of $$\Gamma$$ by homeomorphisms on a topological space X. We write $$M(\Gamma ,X)$$ for the space of $$\Gamma$$-invariant Borel probability measures on X, which is equipped with the $$\hbox {weak}^*$$ topology, and write $$E(\Gamma ,X) \subseteq M(\Gamma ,X)$$ for the subspace of ergodic $$\Gamma$$-invariant Borel probability measures on X. Here, we recall that an event $$\mathscr {A}\subseteq X$$ is said to be invariant if $$\gamma \mathscr {A}= \mathscr {A}$$ for every $$\gamma \in \Gamma$$, and that a measure $$\mu \in M(\Gamma ,X)$$ is said to be ergodic if $$\mu (\mathscr {A}) \in \{0,1\}$$ for every invariant event $$\mathscr {A}$$.

### Theorem 1.3

(Glasner and Weiss 1997) Let $$\Gamma$$ be a countably infinite group, and consider the natural action of $$\Gamma$$ on $$\Omega =\{0,1\}^\Gamma$$. Then the following are equivalent.

1. $$\Gamma$$ has Kazhdan’s property (T).

2. $$E(\Gamma ,\Omega )$$ is closed in $$M(\Gamma ,\Omega )$$.

3. $$E(\Gamma ,\Omega )$$ is not dense in $$M(\Gamma ,\Omega )$$.

See e.g. [4] for further background on Kazhdan groups.

It remains open whether Kazhdan groups have fixed price 1, i.e., whether every free, ergodic p.m.p. action of a Kazhdan group has cost 1. (In contrast, Theorem 1.2 implies that there exists a free, ergodic p.m.p. action with cost 1, see [27, Proposition 29.1].) Abért and Weiss [2] proved that Bernoulli actions have maximal cost among all free, ergodic p.m.p. actions of a given group; probabilistically, this means that the maximal cost of the free, ergodic p.m.p. actions of a countable group $$\Gamma$$ is equal to

\begin{aligned} {\text {cost}}^*(\Gamma ) = \frac{1}{2} \inf \left\{ \int _{\omega \in {\mathcal {S}}(\Gamma )} \deg _\omega (o) \hbox {d}\mu (\omega ) : \mu \in F_{\mathrm {IID}}(\Gamma ,{\mathcal {S}}(\Gamma )) \right\} , \end{aligned}
(1.2)

where $$F_{\mathrm {IID}}(\Gamma ,{\mathcal {S}}(\Gamma ))\subseteq M(\Gamma ,{\mathcal {S}}(\Gamma ))$$ is the set of $$\Gamma$$-invariant measures on $${\mathcal {S}}(\Gamma )$$ that arise as factors of i.i.d. processes on $$\Gamma$$. Our construction is very far from being a factor of i.i.d., and therefore seems unsuitable for studying $${\text {cost}}^*(\Gamma )$$. See Remark 4.4 for further discussion. The question of fixed price 1 for Kazhdan groups is of particular interest due to its connection to the Abért-Nikolov rank gradient conjecture [1, Conjecture 17].

An extension of our results to groups with relative property (T) is sketched in Sect. 3.

## 2 Proof

### 2.1 A reduction

We begin our proof with the following proposition, which shows that it suffices for us to find sparse random graphs on $$\Gamma$$ that have a unique infinite connected component. We define $${\mathcal {U}}(\Gamma ) \subseteq \{0,1\}^{\Gamma \times \Gamma }$$ to be the set of graphs on $$\Gamma$$ that have a unique infinite connected component.

### Proposition 2.1

Let $$\Gamma$$ be an infinite, finitely generated group. Then

\begin{aligned} {\text {cost}}(\Gamma ) \le 1 + \frac{1}{2} \inf \left\{ \int _{\omega \in {\mathcal {U}}(\Gamma )} \deg _\omega (o) \hbox {d}\mu (\omega ) : \mu \in M(\Gamma ,{\mathcal {U}}(\Gamma )) \right\} . \end{aligned}

Proposition 2.1 can be easily deduced from the induction formula of Gaboriau [13, Proposition II.6]. We provide a direct proof for completeness.

### Proof

Take a Cayley graph G corresponding to a finite symmetric generating set of $$\Gamma$$. Let $$\mu \in M(\Gamma ,{\mathcal {U}}(\Gamma ))$$, let $$\omega$$ be a random variable with law $$\mu$$, and let $$\eta _0$$ be the set of vertices of its unique infinite connected component. For each $$i\ge 1$$, let $$\eta _i$$ be the set of vertices in G that have graph distance exactly i from $$\eta _0$$ in G. Note that $$\bigcup _{i\ge 0} \eta _i = \Gamma$$, and that if $$i\ge 1$$ then every vertex in $$\eta _i$$ has at least one neighbour in $$\eta _{i-1}$$. For each $$i\ge 1$$ and each vertex $$v \in \eta _i$$, let $$e^\rightarrow (v)$$ be chosen uniformly at random from among those oriented edges of G that begin at v and end at a vertex of $$\eta _{i-1}$$, and let e(v) be the unoriented edge obtained by forgetting the orientation of $$e^\rightarrow (v)$$. These choices are made independently conditional on $$\omega$$. We define $$\zeta =\{e(v) : v\in V\setminus \eta _0\}$$ and define $$\nu$$ to be the law of $$\xi =\omega \cup \zeta$$. We clearly have that $$\xi$$ is in $${\mathcal {S}}(\Gamma )$$ whenever $$\omega \in {\mathcal {U}}(\Gamma )$$, and hence that $$\nu \in M(\Gamma ,{\mathcal {S}}(\Gamma ))$$. On the other hand, the mass-transport principle (see [32, Section 8.1]) implies that, writing $$\mathbb {P}$$ and $$\mathbb {E}$$ for probabilities and expectations taken with respect to the joint law of $$\omega$$ and $$\{e(v) : v \in V \setminus \eta _0\}$$,

\begin{aligned} \mathbb {E}\deg _\zeta (o) = \mathbb {P}(o\notin \eta _0) + \mathbb {E}\sum _{v\in V} {1}\bigl (v\notin \eta _0,\, e^\rightarrow (v)^+\!=o\bigr ) = 2\mathbb {P}(o\notin \eta _0) \le 2, \end{aligned}

where $$e^\rightarrow (v)^+$$ denotes the other endpoint of $$e^\rightarrow (v)$$. We deduce that

\begin{aligned} \int _{\xi \in {\mathcal {S}}(\Gamma )} \deg _\xi (o) \hbox {d}\nu (\xi ) = \mathbb {E}\deg _\zeta (o)+ \mathbb {E}\deg _\omega (o) \le 2 + \int _{\omega \in {\mathcal {U}}(\Gamma )} \deg _\omega (o) \hbox {d}\mu (\omega ), \end{aligned}

and the claim follows by taking the infimum over $$\mu \in M(\Gamma ,{\mathcal {U}}(\Gamma ))$$. $$\square$$
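To unpack the mass-transport step (a standard computation spelled out here for convenience; the function F is our notation, not the paper's), set $$F(u,v) = \mathbb {P}(u\notin \eta _0,\, e^\rightarrow (u)^+ = v)$$, which is diagonally $$\Gamma$$-invariant. The mass-transport principle then gives

```latex
\sum_{v \in V} F(v,o) \;=\; \sum_{v \in V} F(o,v) \;=\; \mathbb{P}(o \notin \eta_0),
```

so the expected number of vertices v with $$e^\rightarrow (v)^+ = o$$ equals $$\mathbb {P}(o\notin \eta _0)$$, accounting for the second equality in the display above.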

### Remark 2.2

An arguably more canonical way to prove Proposition 2.1 is to take the union of $$\omega$$ with an independent copy of the wired uniform spanning forest ($$\mathsf {WUSF}$$) of the Cayley graph G. Indeed, it is clear that some components of $$\mathsf {WUSF}$$ must intersect the infinite component of $$\omega$$ a.s., and it follows by indistinguishability of trees in $$\mathsf {WUSF}$$ [20] that every tree intersects the infinite component of $$\omega$$ a.s., so that the union of $$\mathsf {WUSF}$$ with $$\omega$$ is a.s. connected. (It should also be possible to argue that this union is connected more directly, using Wilson’s algorithm [7, 41].) The result then follows since $$\mathsf {WUSF}$$ has expected degree 2 in any transitive graph [7, Theorem 6.4].

This alternative construction may be of interest for the following reason: It is well known [32, Question 10.12] that an affirmative answer to Question 1.1 would follow if one could construct for every $$\varepsilon >0$$ an invariant coupling $$(\mathsf {FUSF},\eta )$$ of the free uniform spanning forest of a Cayley graph of $$\Gamma$$ with a percolation process $$\eta$$ of density at most $$\varepsilon$$ such that $$\mathsf {FUSF}\cup \eta \in {\mathcal {S}}(\Gamma )$$ almost surely. Since Kazhdan groups have $$\beta _1=0$$, their free and wired uniform spanning forests always coincide [32, Section 10.2], so that proving Theorem 1.2 via this alternative proof of Proposition 2.1 can be seen as a realization of this general strategy.

### 2.2 A construction

We now construct an invariant measure $$\mu \in M(\Gamma ,{\mathcal {U}}(\Gamma ))$$ with arbitrarily small expected degree. We will work on an arbitrary Cayley graph of the Kazhdan group $$\Gamma$$, and the measure we construct will be concentrated on subgraphs of this Cayley graph. (Recall from the introduction that countable Kazhdan groups are always finitely generated.)

Let $$G=(V,E)$$ be a connected, locally finite graph. For each $$\omega \in \{0,1\}^V$$, the clusters of $$\omega$$ are defined to be the vertex sets of the connected components of the subgraph of G induced by the vertex set $$\{v\in V: \omega (v)=1\}$$ (that is, the subgraph of G with vertex set $$\{v\in V: \omega (v)=1\}$$ and containing every edge of G both of whose endpoints belong to this set). Fix $$p\in (0,1)$$, and let $$\mu _1$$ be the law of Bernoulli-p site percolation on G. For each $$i\ge 1$$, we recursively define $$\mu _{i+1}$$ to be the law of the random configuration $$\omega \in \{0,1\}^V$$ obtained as follows:

1. Let $$\omega _1,\omega _2\in \{0,1\}^V$$ be independent random variables each with law $$\mu _i$$.

2. Let $$\eta _1$$ and $$\eta _2$$ be obtained from $$\omega _1$$ and $$\omega _2$$ respectively by choosing to either delete or retain each cluster independently at random with retention probability

\begin{aligned} q(p) := \frac{1-\sqrt{1-p}}{p} \in \left( \frac{1}{2},1\right) . \end{aligned}

3. Let $$\omega$$ be the union of the configurations $$\eta _1$$ and $$\eta _2$$.

It follows by induction that if G is a Cayley graph of a finitely generated group $$\Gamma$$ then $$\mu _i \in M(\Gamma ,\Omega )$$ for every $$i\ge 1$$. More generally, for each measure $$\mu$$ on $$\{0,1\}^V$$ and $$q\in [0,1]$$ we write $$\mu ^q$$ for the q-thinned measure, which is the law of the random variable $$\eta$$ obtained by taking a random variable $$\omega$$ with law $$\mu$$ and choosing to either delete or retain each cluster of $$\omega$$ independently at random with retention probability q. (See [33, Section 6] for a more formal construction of this measure.)
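The choice of $$q(p)$$ is exactly what keeps the one-site marginal equal to p under one thinning-and-union step: a fixed site survives one q-thinning with probability qp, so it is open in the union of two independent thinned copies with probability $$1-(1-qp)^2$$, which equals p. A quick numerical check (ours, not from the paper):

```python
import math

def q_of(p):
    """Retention probability q(p) = (1 - sqrt(1 - p)) / p."""
    return (1.0 - math.sqrt(1.0 - p)) / p

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    q = q_of(p)
    assert 0.5 < q < 1.0  # q(p) lies in (1/2, 1)
    # union of two independent q-thinned copies has one-site marginal
    # 1 - (1 - q*p)^2, and the choice of q makes this exactly p again:
    assert abs(1.0 - (1.0 - q * p) ** 2 - p) < 1e-12
print("q(p) preserves the one-site marginal p")
```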

We write $$\delta _V$$ and $$\delta _\emptyset$$ for the probability measures on $$\{0,1\}^V$$ giving all their mass to the all-1 and all-0 configurations respectively.

### Proposition 2.3

Let $$G=(V,E)$$ be a connected, locally finite graph, let $$p\in (0,1)$$ and let $$(\mu _i)_{i\ge 1}$$ be as above. Then $$\mu _i(\{\omega : \omega (u) =1\})=p$$ for every $$i\ge 1$$ and $$u\in V$$, and $$\mu _i$$ $$\hbox {weak}^*$$-converges to the measure $$p \delta _V + (1-p) \delta _\emptyset$$ as $$i\rightarrow \infty$$.

### Proof

It suffices to prove that for every pair of adjacent vertices $$u,v \in V$$ we have that

\begin{aligned} \mu _i(\{\omega : \omega (u) =1\})=p \quad \hbox { for every }\ i\ge 1 \quad \text { and } \quad \lim _{i\rightarrow \infty }\mu _i\bigl (\{\omega : \omega (u)=\omega (v)\}\bigr ) = 1. \end{aligned}

For each $$u,v\in V$$ and $$i\ge 1$$ let $$p_i(u) = \mu _i(\{\omega : \omega (u) =1\})$$ and let $$\sigma _i(u,v) = \mu _i(\{\omega : \omega (u) = \omega (v)=1\})$$. Note that $$p_1(u)=p$$ for every $$u\in V$$, that $$\sigma _1(u,v) = p^2 >0$$ for every $$u,v\in V$$, and that $$\sigma _i(u,v) \le p_i(u)$$ for every $$u,v\in V$$ and $$i\ge 1$$. Write $$q=q(p)$$. For each $$i\ge 1$$ and $$u\in V$$, it follows by definition of $$\mu _{i+1}$$ that

\begin{aligned} p_{i+1}(u) =(1-(1-q)^2) \, p_i(u)^2 + 2q \, p_i(u) \, (1-p_i(u)) = \phi \big ( p_i(u) \big ), \end{aligned}
(2.1)

where $$\phi :\mathbb {R}\longrightarrow \mathbb {R}$$ is the polynomial

\begin{aligned} \phi (x) := (2q-q^2)x^2+2qx(1-x) = 2qx-q^2 x^2. \end{aligned}

It follows by elementary analysis that $$\phi$$ is strictly increasing and concave on $$(0,p)$$, with $$\phi (0)=0$$ and $$\phi (p)=p$$. Thus, we deduce by induction that $$p_i(u)=p$$ for every $$i\ge 1$$ and $$u\in V$$ as claimed. Similarly, for each $$i \ge 1$$ and adjacent $$u,v \in V$$ we have by definition of $$\mu _{i+1}$$ that

\begin{aligned} \sigma _{i+1}(u,v)&= (1-(1-q)^2) \, \mu _i\bigl (\omega (u) = \omega (v)=1\bigr )^2 \\&\quad + 2q \, \mu _i\bigl (\omega (u) = \omega (v) = 1\bigr ) \, \bigl (1-\mu _i(\omega (u) = \omega (v)=1)\bigr )\\&\quad +2q^2\mu _i\bigl (\omega (u) =1, \omega (v)=0\bigr ) \, \mu _i\bigl (\omega (u) =0, \omega (v)=1\bigr ) \\&= \phi (\sigma _i(u,v))+ 2q^2\mu _i\bigl (\omega (u) =1, \omega (v)=0\bigr ) \, \mu _i\bigl (\omega (u) =0, \omega (v)=1\bigr )\\&\ge \phi (\sigma _i(u,v)), \end{aligned}

where we have used the fact that if $$\omega (u)=\omega (v)=1$$ then u and v are in the same cluster of $$\omega$$. Since $$\phi$$ is strictly increasing and concave on $$(0,p)$$, with the only fixed points 0 and p, and since $$\sigma _1(u,v)>0$$, it follows that $$\sigma _i(u,v) \uparrow p$$ as $$i\rightarrow \infty$$. The claim now follows since

\begin{aligned} \mu _i(\omega (u) \ne \omega (v)) &= \mu _i(\omega (u) =1, \omega (v) =0) + \mu _i(\omega (u) =0, \omega (v) =1) \\ &= 2p\left( 1-\frac{\sigma _i(u,v)}{p}\right) , \end{aligned}

which tends to zero as $$i\rightarrow \infty$$. $$\square$$
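The fixed-point analysis of $$\phi$$ in the proof above is easy to check numerically. The following sketch (ours, not from the paper) verifies the fixed points, the monotonicity on $$(0,p)$$, and the convergence $$\sigma _i(u,v) \uparrow p$$ starting from $$\sigma _1(u,v)=p^2$$:

```python
import math

def q_of(p):
    """q(p) = (1 - sqrt(1 - p)) / p, as in the construction."""
    return (1.0 - math.sqrt(1.0 - p)) / p

def phi(x, q):
    """The recursion map phi(x) = 2qx - q^2 x^2 from (2.1)."""
    return 2.0 * q * x - q ** 2 * x ** 2

for p in (0.1, 0.5, 0.9):
    q = q_of(p)
    assert phi(0.0, q) == 0.0            # 0 is a fixed point
    assert abs(phi(p, q) - p) < 1e-12    # p is a fixed point
    # phi is strictly increasing on (0, p): check on a grid
    grid = [k * p / 100 for k in range(101)]
    assert all(phi(a, q) < phi(b, q) for a, b in zip(grid, grid[1:]))
    # iterates started from sigma_1 = p^2 increase up to p
    x = p * p
    for _ in range(1000):
        x_new = phi(x, q)
        assert x_new >= x - 1e-12        # monotone increase (up to rounding)
        x = x_new
    assert abs(x - p) < 1e-6
print("phi has fixed points 0 and p, and iterates converge to p")
```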

See Figs. 1 and 2 for simulations of the measures $$\mu _i$$ on $$\mathbb {Z}^2$$ and $$\mathbb {Z}^3$$.
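Simulations like these can be reproduced with a short script. The following is a minimal sketch (ours, not the authors' code) that samples from $$\mu _i$$ on an $$n\times n$$ torus; note that thinning retains whole clusters, so every cluster of a thinned configuration is a cluster of the original:

```python
import random
from collections import deque

def bernoulli_sites(n, p, rng):
    """Bernoulli-p site percolation on the n x n torus (mu_1)."""
    return [[1 if rng.random() < p else 0 for _ in range(n)] for _ in range(n)]

def clusters(omega):
    """Vertex sets of the connected components of open sites, periodic boundary."""
    n = len(omega)
    label = [[-1] * n for _ in range(n)]
    comps = []
    for i in range(n):
        for j in range(n):
            if omega[i][j] == 1 and label[i][j] == -1:
                cid, comp, queue = len(comps), [], deque([(i, j)])
                label[i][j] = cid
                while queue:
                    x, y = queue.popleft()
                    comp.append((x, y))
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = (x + dx) % n, (y + dy) % n
                        if omega[u][v] == 1 and label[u][v] == -1:
                            label[u][v] = cid
                            queue.append((u, v))
                comps.append(comp)
    return comps

def thin(omega, q, rng):
    """q-thinning: retain each cluster independently with probability q."""
    n = len(omega)
    eta = [[0] * n for _ in range(n)]
    for comp in clusters(omega):
        if rng.random() < q:
            for x, y in comp:
                eta[x][y] = 1
    return eta

def q_of(p):
    """Retention probability q(p) = (1 - sqrt(1 - p)) / p."""
    return (1.0 - (1.0 - p) ** 0.5) / p

def sample_mu(i, n, p, rng):
    """Sample from mu_i: union of two independent q-thinned mu_{i-1} samples."""
    if i == 1:
        return bernoulli_sites(n, p, rng)
    q = q_of(p)
    eta1 = thin(sample_mu(i - 1, n, p, rng), q, rng)
    eta2 = thin(sample_mu(i - 1, n, p, rng), q, rng)
    return [[max(a, b) for a, b in zip(r1, r2)] for r1, r2 in zip(eta1, eta2)]

if __name__ == "__main__":
    rng = random.Random(2024)
    sample = sample_mu(4, 60, 0.55, rng)
    density = sum(map(sum, sample)) / 60 ** 2
    print(f"empirical density at level 4: {density:.3f}")  # marginal is p = 0.55
```

On $$\mathbb {Z}^2$$ one sees the retained clusters coarsen as i grows while the density stays near p, in line with Proposition 2.3.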

### 2.3 Ergodicity and condensation

On Cayley graphs of infinite Kazhdan groups, Proposition 2.3 will be useful only if we also know something about the ergodicity of the measures $$\mu _i$$. To this end, we will apply some tools introduced by Lyons and Schramm [33] that give sufficient conditions for ergodicity of q-thinned processes. The first such lemma, which is proven in [33, Lemma 4.2] and is based on an argument of Burton and Keane [8], shows that every cluster of an invariant percolation process has an invariantly-defined frequency as measured by an independent random walk. Moreover, conditional on the percolation configuration, the frequency of each cluster is non-random and does not depend on the starting point of the random walk.

### Lemma 2.4

(Cluster frequencies) Let $$G=(V,E)$$ be a Cayley graph of an infinite, finitely generated group $$\Gamma$$. There exists a Borel measurable, $$\Gamma$$-invariant function $${\text {freq}}:\{0,1\}^V \rightarrow [0,1]$$ with the following property. Let $$\mu \in M(\Gamma ,\Omega )$$ be an invariant site percolation, and let $$\omega$$ be a random variable with law $$\mu$$. Let v be a vertex of G and let $$\mathbb {P}_v$$ be the law of simple random walk $$\{X_n\}_{n\ge 0}$$ on G started at v. Then

\begin{aligned} \lim _{N\rightarrow \infty } \frac{1}{N} \sum _{n=0}^{N-1} {1}_{\{X_n \in C\}} = {\text {freq}}(C) \qquad \hbox { for every cluster } C \hbox { of } \omega \end{aligned}
(2.2)

$$\mu \otimes \mathbb {P}_v$$-almost surely.

This notion of frequency is used in the next proposition, which is a slight variation on [33, Lemma 6.4]. We define $$\mathscr {F}\subseteq \{0,1\}^V$$ to be the event that there exists a cluster of positive frequency. Note that the $$\Gamma$$-invariance and Borel measurability of $${\text {freq}}$$ implies that $$\mathscr {F}$$ is $$\Gamma$$-invariant and Borel measurable also.

### Proposition 2.5

(Ergodicity of the q-thinning) Let $$G=(V,E)$$ be a Cayley graph of an infinite, finitely generated group $$\Gamma$$, and let $$\mu \in E(\Gamma ,\Omega )$$ be an ergodic invariant site percolation such that $$\mu (\mathscr {F})=0$$. Then the q-thinned measure $$\mu ^q$$ is also ergodic for every $$q\in [0,1]$$. Similarly, if we have k measures $$\nu _1,\ldots ,\nu _k \in E(\Gamma ,\Omega )$$ such that $$\nu _i(\mathscr {F})=0$$ for every $$1\le i \le k$$ and $$\nu _1\otimes \dots \otimes \nu _k$$ is ergodic, then $$\nu _1^q\otimes \dots \otimes \nu _k^q$$ is also ergodic for every $$q\in [0,1]$$.

### Proof

Let $$\omega$$ be a random variable with law $$\mu \in M(\Gamma ,\Omega )$$. We first show that if $$\mu (\mathscr {F})=0$$ then

\begin{aligned} \lim _{N\rightarrow \infty } \frac{1}{N} \sum _{n=0}^{N-1} \mathbb {P}_o\Big ( B(X_0,r) \longleftrightarrow B(X_n,r) \Big ) = 0\qquad \mu \text {-a.s.}, \end{aligned}
(2.3)

for every $$r\ge 0$$, where $$B(v,r)$$ is the ball of radius r around $$v\in V$$, and for $$U_1,U_2\subseteq V$$, we write $$\{U_1 \longleftrightarrow U_2\}$$ for the event that there exist $$x_1\in U_1$$ and $$x_2\in U_2$$ that are in the same cluster of $$\omega$$. An easy but important implication of (2.3) is that

\begin{aligned} \inf _{x\in V} \mu \left( B(o,r) \longleftrightarrow B(x,r) \right) = 0 \end{aligned}
(2.4)

for every $$r \ge 0$$ and every $$\mu \in M(\Gamma ,\Omega )$$ such that $$\mu (\mathscr {F})=0$$. (Note that the proof of [33, Lemma 6.4] established this fact under the additional assumption that $$\mu$$ is insertion tolerant.)

Condition on $$\omega$$, and denote the finitely many clusters that intersect $$B(o,r)$$ by $$\{C_i\}_{i=1}^m$$. Taking $$\mathbb {P}_o$$-expectations in (2.2) and using the dominated convergence theorem, Lemma 2.4 implies that

\begin{aligned}&\lim _{N\rightarrow \infty } \frac{1}{N} \sum _{n=0}^{N-1} \mathbb {P}_o\big ( B(X_0,r) \longleftrightarrow X_n \big ) \nonumber \\&\quad = \lim _{N\rightarrow \infty } \frac{1}{N} \sum _{i=1}^m \sum _{n=0}^{N-1} \mathbb {P}_o\big ( X_n \in C_i \big ) = 0 \qquad \mu -\text {a.s.} \end{aligned}
(2.5)

Now notice that

\begin{aligned} \sum _{i=0}^r \mathbb {P}_o\Big ( B(X_0,r) \longleftrightarrow X_{n+i} \;\Big |\;B(X_0,r) \longleftrightarrow B(X_n,r) \Big ) \ge \deg (o)^{-r} \end{aligned}

for every $$n,r\ge 0$$, and hence that

\begin{aligned}&\sum _{n=0}^{N-1} \mathbb {P}_o\big ( B(X_0,r) \longleftrightarrow B(X_n,r) \big ) \nonumber \\&\quad \le (r+1)\deg (o)^{r}\sum _{n=0}^{N-1+r} \mathbb {P}_o\big ( B(X_0,r) \longleftrightarrow X_n \big ) \end{aligned}
(2.6)

for every $$N\ge 1$$ and $$r\ge 0$$. Dividing by N and letting $$N\rightarrow \infty$$, this inequality and (2.5) imply (2.3).

The rest of the proof of the ergodicity of $$\mu ^q$$ is identical to the argument in [33, Lemma 6.4], which we recall here for the reader’s convenience. Suppose that $$\mu$$ is ergodic. Denote by $$\omega ^q$$ the q-thinned configuration obtained from $$\omega$$, let $$\mathbb {P}^q$$ denote the joint law of $$(\omega ,\omega ^q)$$, and let A be any invariant event for $$(\omega ,\omega ^q)$$. For every $$\varepsilon >0$$ there exists some $$r>0$$ and an event $$A_{\varepsilon ,r}$$ depending only on the restriction of $$(\omega ,\omega ^q)$$ to $$B(o,r)$$ such that $$\mathbb {P}^q\big (A \,\triangle \, A_{\varepsilon ,r}\big ) < \varepsilon$$. By (2.4) we may take x such that $$\mu \big ( B(o,r) \longleftrightarrow B(x,r) \big )<\varepsilon$$. Conditionally on $$D_x:=\{B(o,r) \,\,\, \not \!\!\!\longleftrightarrow B(x,r)\}$$ in $$\omega$$, the coin flips for the q-thinning of the clusters intersecting $$B(o,r)$$ and $$B(x,r)$$ are independent, hence

\begin{aligned} \Big | \mathbb {P}^q \big (A_{\varepsilon ,r} \cap \gamma _x A_{\varepsilon ,r} \,\big |\,\omega \big ) - \mathbb {P}^q \big (A_{\varepsilon ,r} \,\big |\,\omega \big ) \, \mathbb {P}^q \big (\gamma _x A_{\varepsilon ,r} \,\big |\,\omega \big ) \Big | \le 2 \cdot {\mathbf {1}}_{D_x^c}(\omega )\,, \end{aligned}

where $$\gamma _x$$ is translation by $$x \in \Gamma$$, and the right-hand side vanishes on $$D_x$$ by this conditional independence. Taking expectations with respect to $$\mu$$ and then letting $$\varepsilon \rightarrow 0$$, we get that

\begin{aligned} \mathbb {E}_\mu \Big | \mathbb {P}^q(A\mid \omega ) - \mathbb {P}^q(A\mid \omega )^2 \Big | = 0\,, \end{aligned}

and hence that $$\mathbb {P}^q (A\mid \omega ) \in \{0,1\}$$ $$\mu$$-almost surely. By the ergodicity of $$\mu$$, this implies that $$\mathbb {P}^q(A) \in \{0,1\}$$. It follows that $$\mathbb {P}^q$$ is ergodic and hence that $$\mu ^q$$ is ergodic also.

Similarly, if $$\nu _1\otimes \dots \otimes \nu _k$$ is ergodic and $$\nu _i(\mathscr {F})=0$$ for every $$1\le i \le k$$, then we have by (2.3) that if $$\mathbf {\omega }=(\omega _1,\ldots ,\omega _k)$$ is a random variable with law $$\mathbf {\nu }=\nu _1\otimes \dots \otimes \nu _k$$ then

\begin{aligned}&\inf _{x\in V} \mathbf {\nu }\left( B(o,r) \longleftrightarrow B(x,r) \hbox { in } \omega _i \hbox { for some } 1\le i \le k \right) \\&\quad \le \lim _{N\rightarrow \infty } \frac{1}{N} \sum _{n=0}^{N-1} \sum _{i=1}^k \nu _i \otimes \mathbb {P}_o\Big (B(X_0,r) \longleftrightarrow B(X_n,r) \Big ) =0. \end{aligned}

The ergodicity of $$\nu _1^q\otimes \dots \otimes \nu _k^q$$ then follows by a similar argument to that above. $$\square$$

Define $$i_{\mathrm {freq}}$$ to be the minimal $$i\ge 1$$ such that $$\mu _i(\mathscr {F})>0$$, letting $$i_{\mathrm {freq}}=\infty$$ if this never occurs. We want to prove, using induction and Proposition 2.5, that $$\mu _i$$ is ergodic for every $$1\le i\le i_{\mathrm {freq}}$$. However, it is not always true that the union of two independent ergodic percolation processes is ergodic.¹ To circumvent this problem, we instead prove a slightly stronger statement. Recall that a measure $$\mu \in M(\Gamma ,\Omega )$$ is weakly mixing if and only if the independent product $$\mu \otimes \mu \in M(\Gamma ,\Omega ^2)$$ is ergodic when $$\Gamma$$ acts diagonally on $$\Omega ^2$$, or equivalently, if and only if the k-wise independent product $$\mu ^{\otimes k} \in M(\Gamma ,\Omega ^k)$$ is ergodic for every $$k\ge 2$$ [40, Theorem 1.24]. This can be taken as the definition of weak mixing for the purposes of this paper.

### Proposition 2.6

Let G be a Cayley graph of an infinite, finitely generated group $$\Gamma$$, let $$p\in (0,1)$$, and let $$(\mu _i)_{i\ge 1}$$ be as above. Then $$\mu _i$$ is weakly mixing for every $$1\le i\le i_{\mathrm {freq}}$$.

### Proof

We will prove the claim by induction on i. For $$i=1$$, $$\mu _1$$ is simply the law of Bernoulli-p percolation, which is certainly weakly mixing. Now assume that $$i<i_{\mathrm {freq}}$$ and that $$\mu _i$$ is weakly mixing, so that $$\mu _i^{\otimes 4}$$ is ergodic. Applying Proposition 2.5 we obtain that the independent 4-wise product $$(\mu ^q_i)^{\otimes 4}$$ of the q-thinned percolations is again ergodic. Since $$\mu _{i+1}^{\otimes 2}$$ can be realized as a factor of $$(\mu _i^q)^{\otimes 4}$$ by taking the unions in the first and second halves of the 4 coordinates, and since factors of ergodic processes are ergodic, it follows that $$\mu _{i+1}^{\otimes 2}$$ is ergodic and hence that $$\mu _{i+1}$$ is weakly mixing. $$\square$$

Since $$\mathscr {F}$$ is an invariant event, Proposition 2.6 has the following immediate corollary.

### Corollary 2.7

Let G be a Cayley graph of an infinite, finitely generated group $$\Gamma$$, let $$p\in (0,1)$$, and let $$(\mu _i)_{i\ge 1}$$ be as above. If $$i_{\mathrm {freq}}<\infty$$ then $$\mu _{i_{\mathrm {freq}}}(\mathscr {F})=1$$.

### Remark 2.8

It is possible to prove by induction that the measures $$\mu _i$$ are both insertion tolerant and deletion tolerant for every $$i\ge 1$$. Thus, it follows from the indistinguishability theorem of Lyons and Schramm [33], which holds for all insertion tolerant invariant percolation processes, that if $$i_{\mathrm {freq}} < \infty$$ then $$\mu _{i_{\mathrm {freq}}}$$ is supported on configurations in which there is a unique infinite cluster; see [33, Section 4]. We will not require this result.

Next, we deduce the following from Proposition 2.6.

### Corollary 2.9

(Condensation) Let G be a Cayley graph of a countably infinite Kazhdan group, let $$p\in (0,1)$$ and let $$(\mu _i)_{i\ge 1}$$ be as above. Then $$i_{\mathrm {freq}}<\infty$$.

### Proof

Suppose for contradiction that $$i_{\mathrm {freq}}=\infty$$. Then it follows by Proposition 2.6 that $$\mu _i$$ is weakly mixing and hence ergodic for every $$i\ge 1$$. But $$\mu _i$$ $$\hbox {weak}^*$$-converges to the non-ergodic measure $$p\delta _V +(1-p)\delta _\emptyset$$ by Proposition 2.3, contradicting property (T) by Theorem 1.3. $$\square$$

### Proof of Theorem 1.2

Recall that every countable Kazhdan group is finitely generated [4, Theorem 1.3.1]. Let $$G=(V,E)$$ be a Cayley graph of $$\Gamma$$, let $$p\in (0,1)$$, and let $$(\mu _i)_{i\ge 1}$$ be as above. It follows from Corollaries 2.9 and 2.7 that $$1\le i_{\mathrm {freq}}<\infty$$ and that $$\mu _{i_{\mathrm {freq}}}$$ is supported on $$\mathscr {F}$$. Let $$\omega \in \{0,1\}^V$$ be sampled from $$\mu _{i_{\mathrm {freq}}}$$, so that $$\omega \in \mathscr {F}$$ almost surely. Fatou’s lemma implies that the total frequency of all components of $$\omega$$ is at most 1 almost surely, and consequently that $$\omega$$ has at most finitely many components of maximal frequency almost surely. Let $$\omega '$$ be obtained from $$\omega$$ by choosing one of the maximum-frequency components of $$\omega$$ uniformly at random, retaining this component, and deleting all other components of $$\omega$$, so that $$\omega '$$ has a unique infinite cluster almost surely. Let $$\eta \in \{0,1\}^{\Gamma \times \Gamma }$$ be defined by setting $$\eta (u,v)=1$$ if and only if u and v are adjacent in G and have $$\omega '(u)=\omega '(v)=1$$, and let $$\nu$$ be the law of $$\eta$$, so that $$\nu \in M(\Gamma ,{\mathcal {U}}(\Gamma ))$$. It follows by Propositions 2.1 and 2.3 that

\begin{aligned} {\text {cost}}(\Gamma ) &\le 1 + \frac{1}{2}\int _{{\mathcal {U}}(\Gamma )} \deg _\eta (o) \hbox {d}\nu (\eta ) \le 1 + \frac{\deg (o)}{2}\int _{\Omega } \omega (o) \hbox {d}\mu _{i_{\mathrm {freq}}}(\omega )\\ &= 1 + \frac{p \deg (o)}{2}. \end{aligned}

The claim now follows since $$p\in (0,1)$$ was arbitrary. $$\square$$

## 3 Relative property (T)

In this section we sketch an extension of our results to groups with relative property (T), a notion that was considered implicitly in the original work of Kazhdan [24] and first studied explicitly by Margulis [34]. If H is a subgroup of $$\Gamma$$, then the pair $$(\Gamma ,H)$$ is said to have relative property (T) if every unitary representation of $$\Gamma$$ on a Hilbert space that has almost-invariant vectors has a non-zero H-invariant vector; see [4, Definition 1.4.3]. For example, $$(\mathbb {Z}^2 \rtimes \mathrm {SL}_2(\mathbb {Z}),\mathbb {Z}^2)$$ has relative property (T) but $$\mathbb {Z}^2 \rtimes \mathrm {SL}_2(\mathbb {Z})$$ does not have property (T) itself [24]. Similar results with $$\mathbb {Z}$$ replaced by other rings have been proven by Kassabov [23] and Shalom [39]. See e.g. [9, 22] for further background.

The analogue of the Glasner-Weiss theorem for pairs $$(\Gamma ,H)$$ with relative property (T) is that any $$\hbox {weak}^*$$-limit of $$\Gamma$$-invariant H-ergodic probability measures on $$\Omega =\{0,1\}^\Gamma$$ is $$\Gamma$$-ergodic; this can be established using the same methods as those of [19]. Using this, our proof of Theorem 1.2 can be extended to the following situation:

### Theorem 3.1

Let H be an infinite normal subgroup of a countable group $$\Gamma$$, and assume that the pair $$(\Gamma ,H)$$ has relative property (T). Then $$\Gamma$$ has cost 1.

The fact that $$\Gamma$$ has $$\beta _1(\Gamma )=0$$ under the hypotheses of Theorem 3.1 was proven by Martin [35]. The assumption that H is infinite is clearly needed since every group has relative property (T) with respect to its one-element subgroup. It should however be possible to relax the condition of normality in various ways, for example to s-normality [37] or weak quasi-normality [10]. We do not pursue this here.

It is a theorem of Gaboriau [15, Theorem 3.4] that if $$\Gamma$$ is a countable group with an infinite, infinite-index, normal subgroup H with $${\text {cost}}(H)<\infty$$, then $$\Gamma$$ has cost 1. The condition $${\text {cost}}(H)<\infty$$ is very weak, and applies in particular whenever H is either finitely generated or amenable. Thus, most natural examples to which Theorem 3.1 applies are already treated either by this theorem or by Theorem 1.2 (in the case $$H=\Gamma$$). As such, the main interest of Theorem 3.1 is to demonstrate the flexibility of the proof of Theorem 1.2, and we give only a brief sketch of the proof.

### Sketch of proof

First assume that $$\Gamma$$ is finitely generated. We start with the same sequence of measures $$\{\mu _i\}_{i\ge 1}$$ on $$\Omega$$ as before, using a Cayley graph G of $$\Gamma$$ with a finite symmetric generating set S, with edges given by right multiplication by the generating elements. The left cosets gH then form a partition of the Cayley graph into isomorphic subgraphs. Moreover, if two cosets $$g_1H$$ and $$g_2H$$ are neighbours in the sense that $$g_1n_1s=g_2n_2$$ for some $$n_i\in H$$ and $$s\in S$$, then for every $$n\in H$$ we have that

\begin{aligned} g_1ns=g_1n n_1^{-1}g_1^{-1}g_2n_2=n' g_2 n_2=g_2 n'' \end{aligned}

for some $$n',n''\in H$$, because H is normal. Thus, neighbouring cosets are connected in G by infinitely many edges (because H is infinite).

We will have to measure cluster frequencies inside individual H-cosets, and will therefore use a random walk whose jump distribution generates H. Specifically, we enumerate the elements of H as $$\{h_1,h_2,\ldots \}$$, let $$(Z_i)_{i\ge 1}$$ be an i.i.d. sequence of H-valued random variables with $$\mathbb {P}(Z_i=h_k) = 2^{-k}$$, and write $$\mathbb {P}^{X_0}$$ for the law of the random walk $$(X_n)_{n\ge 0}$$ defined by $$X_{n+1} = X_n Z_{n+1}$$ for each $$n\ge 0$$, where $$X_0$$ is an arbitrary element of $$\Gamma$$. An analogue of Lemma 2.4 is that for every $$r\in \mathbb {N}$$, and every left H-coset gH, there exists an H-invariant cluster frequency function $${\text {freq}}_{gH,r}$$ such that if $$\mu \in M(\Gamma ,\Omega )$$ is a $$\Gamma$$-invariant percolation process and $$\omega$$ is a random variable with law $$\mu$$ then

\begin{aligned} \lim _{N\rightarrow \infty } \frac{1}{N} \sum _{n=0}^{N-1} {1}_{\big \{B(X_n,r) \cap C \not = \emptyset \big \}} = {\text {freq}}_{gH,r}(C) \qquad \hbox { for every cluster } C \hbox { of } \omega \end{aligned}

$$\mu \otimes \mathbb {P}^{X_0}$$ almost surely for every $$X_0 \in g H$$. The argument of Proposition 2.5 then implies that if, under an H-invariant, H-ergodic percolation measure $$\mu$$, all the cluster frequencies $${\text {freq}}_{gH,r}(C)$$ vanish almost surely for every $$r\in \mathbb {N}$$, then $$\mu ^q\otimes \dots \otimes \mu ^q$$ is H-ergodic. The reason we need the zero frequencies for all r-balls, instead of just $$r=0$$, is that (2.6) does not necessarily hold now: the random walk is confined to the H-coset, while percolation clusters are not.
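The walk and the ergodic average above admit a toy numerical sketch. Purely for illustration (none of the following choices are fixed by the text), take $$H = \mathbb {Z}$$ with the hypothetical enumeration $$0, 1, -1, 2, -2, \ldots$$, a single "cluster" $$C = 2\mathbb {Z}$$, and $$r = 0$$; the empirical frequency of visits then settles near 1/2:

```python
import random

def h_enumeration(k):
    # Hypothetical enumeration h_1, h_2, ... of H = Z: 0, 1, -1, 2, -2, ...
    if k == 1:
        return 0
    m = k // 2
    return m if k % 2 == 0 else -m

def sample_jump(rng):
    # P(Z = h_k) = 2^{-k}: k - 1 is the number of tails before the first head
    k = 1
    while rng.random() < 0.5:
        k += 1
    return h_enumeration(k)

def empirical_frequency(x0, n_steps, rng):
    # (1/N) sum_{n<N} 1{B(X_n, 0) meets C} for the toy cluster C = 2Z,
    # along the walk X_{n+1} = X_n + Z_{n+1} (H = Z written additively)
    x, hits = x0, 0
    for _ in range(n_steps):
        if x % 2 == 0:  # B(x, 0) = {x} meets C = 2Z iff x is even
            hits += 1
        x += sample_jump(rng)
    return hits / n_steps
```

In this toy case the walk's parity is a fast-mixing two-state chain, so the long-run frequency is 1/2; in the proof the relevant point is only that such almost-sure limits exist and are H-invariant.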

Now, the analogue of Corollary 2.9 is that if $$(\Gamma ,H)$$ has relative property (T), then there exists $$i_{\mathrm {freq}}<\infty$$ and $$r \in \mathbb {N}$$ such that, if $$\omega \in \{0,1\}^\Gamma$$ is a random variable with law $$\mu _{i_\mathrm {freq}}$$, then for every left H-coset gH there almost surely exists a cluster $$C_{gH}$$ with $${\text {freq}}_{gH,r}(C_{gH})>0$$. For each coset gH, let $$\eta _{gH}$$ be a cluster chosen uniformly at random from among those maximizing $${\text {freq}}_{gH,r}$$. Now we can apply sprinkling: for any $$\varepsilon >0$$, adding an independent Bernoulli$$(\varepsilon )$$ bond percolation will almost surely connect the infinite clusters $$\eta _{gH}$$ in neighbouring H-cosets, and by deleting all clusters of the resulting percolation configuration other than the unique cluster containing $$\bigcup \eta _{gH}$$, we obtain a $$\Gamma$$-invariant percolation process of average degree $$O(p+\varepsilon )$$ that has a unique infinite cluster. The fact that this sprinkling achieves the desired effect follows by a standard argument in invariant percolation (see e.g. the proof of [33, Theorem 6.12]), sketched as follows:

1. Let e be the identity element of $$\Gamma$$. For each $$\delta >0$$ there exists R such that the cluster $$\eta _{H}$$ intersects the ball $$B(e,R)$$ with probability at least $$1-\delta$$. Thus, for each $$u \in H$$ and $$s\in S$$ the clusters $$\eta _{sH}$$ and $$\eta _{H}$$ both intersect the ball $$B(u,R+1)$$ with probability at least $$1-2\delta$$. Thus, if $$u_1,u_2,\ldots$$ is an enumeration of H then the clusters $$\eta _{sH}$$ and $$\eta _{H}$$ both intersect the ball $$B(u_i,R+1)$$ for infinitely many i with probability at least $$1-2\delta$$ by Fatou’s lemma. On this event, it is immediate that the $$\varepsilon$$-sprinkling connects the clusters $$\eta _{sH}$$ and $$\eta _{H}$$ almost surely. We deduce that the $$\varepsilon$$-sprinkling connects the clusters $$\eta _{sH}$$ and $$\eta _{H}$$ with probability at least $$1-2\delta$$, and hence with probability 1 since $$\delta >0$$ was arbitrary.

2. Any two cosets have a finite chain of neighbouring coset pairs connecting them, hence sprinkling gives a unique infinite cluster that contains $$\bigcup \eta _{gH}$$.

Since p and $$\varepsilon$$ can be made arbitrarily small, Proposition 2.1 applies, and $$\Gamma$$ must have cost 1.
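The elementary estimate underlying the sprinkling step is that n independent Bernoulli($$\varepsilon$$) connection opportunities all fail with probability $$(1-\varepsilon )^n$$, which vanishes as $$n \rightarrow \infty$$; a minimal numeric sketch:

```python
def all_fail_probability(eps, n):
    # Probability that none of n independent Bernoulli(eps) sprinkled
    # edges opens: (1 - eps)^n, which tends to 0 as n -> infinity.
    return (1.0 - eps) ** n
```

With infinitely many balls $$B(u_i,R+1)$$ met by both clusters, the failure probability decays geometrically to zero, which is exactly the almost-sure connection claim in step 1.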

We can now remove the assumption that $$\Gamma$$ be finitely generated, as pointed out to us by Damien Gaboriau. First, the standard proof that Kazhdan groups are finitely generated [4, Theorem 1.3.1] adapts to relative property (T) to show that the subgroup H is contained in a finitely generated subgroup $$\Gamma '$$ of $$\Gamma$$ such that the pair $$(\Gamma ',H)$$ has relative property (T) [22, Theorems 2.2.1 and 2.2.3]. Our proof above shows that $$\Gamma '$$ has cost 1. Thus, for any $$\varepsilon >0$$, we can independently take a $$\Gamma '$$-invariant random graph spanning $$g\Gamma '$$ with expected degree at most $$2+\varepsilon$$ in each left coset $$g\Gamma '$$ of $$\Gamma$$. The resulting bond percolation $$\omega _\varepsilon$$ is $$\Gamma$$-invariant. (This is the probabilistic interpretation of lifting the $$\Gamma '$$-action to a $$\Gamma$$-action by co-induction, as defined in [16, Section 3.4] or [26, Section 10.(G)].) Let $$\{\gamma _i : i \ge 1\}$$ be an enumeration of $$\Gamma$$, and consider the random subset $$\eta _\varepsilon \subseteq \Gamma \times \Gamma$$ in which each $$(g,g\gamma _i)$$ is included independently at random with probability $$\varepsilon 2^{-i}$$. Let $${\bar{\eta }}_\varepsilon = \eta _\varepsilon \cup \{(g_1,g_2) : (g_2,g_1) \in \eta _\varepsilon \}$$ be obtained from $$\eta _\varepsilon$$ by symmetrization, so that $${\bar{\eta }}_\varepsilon$$ is a $$\Gamma$$-invariant random graph on $$\Gamma$$ with expected degree at most $$2\varepsilon$$.
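The expected-degree bound for $${\bar{\eta }}_\varepsilon$$ is a geometric-series computation: each vertex g has expected number of outgoing pairs $$\sum _i \varepsilon 2^{-i} < \varepsilon$$, and symmetrization at most doubles this. A sketch in exact rational arithmetic (the truncation length is an artifact of the example):

```python
from fractions import Fraction

def expected_out_pairs(eps, n_terms):
    # E[#{i <= n_terms : (g, g*gamma_i) included}] = sum_i eps * 2^{-i},
    # a partial geometric sum equal to eps * (1 - 2^{-n_terms}) < eps
    return sum(eps * Fraction(1, 2 ** i) for i in range(1, n_terms + 1))

def symmetrized_degree_bound(eps, n_terms):
    # Symmetrization adds at most the same expectation again, so the
    # expected degree of bar-eta_eps is at most 2 * eps.
    return 2 * expected_out_pairs(eps, n_terms)
```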

Consider the independent union of $$\omega _\varepsilon$$ and $${\bar{\eta }}_\varepsilon$$, which has expected degree at most $$2+3\varepsilon$$. Since H is an infinite normal subgroup of $$\Gamma$$ and each left H-coset gH is contained in a single connected component of $$\omega _\varepsilon$$, a similar argument to above shows that $${\bar{\eta }}_\varepsilon$$ almost surely connects each pair of components of $$\omega _\varepsilon$$, so that the union of $$\omega _\varepsilon$$ and $${\bar{\eta }}_\varepsilon$$ is connected almost surely. Since $$\varepsilon$$ was arbitrary, $$\Gamma$$ has cost 1. $$\square$$

## 4 Closing remarks

### Remark 4.1

It would be interesting to investigate the behaviour of the processes we construct in Sect. 2.2 on other classes of Cayley graphs. Simulations suggest, perhaps surprisingly, that the process has very different behaviours on $$\mathbb {Z}^2$$ and $$\mathbb {Z}^3$$: It seems that in two dimensions, when $$p>0$$ is small, $$\mu _i$$ is supported on configurations with no infinite clusters for every $$i\ge 1$$, while in three dimensions there is a unique infinite cluster after finitely many iterations. See Figs. 1 and 2. Understanding the reason for this disparity may lead to proofs of cost 1 for other classes of groups.
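The iterative process of Sect. 2.2 is not reproduced here, but the basic building block of such simulations — sampling a Bernoulli bond configuration on a finite box of $$\mathbb {Z}^2$$ and extracting its clusters with union-find — can be sketched as follows (the encoding is our own illustration):

```python
import random

def percolate_box(n, p, rng):
    """One Bernoulli(p) bond percolation sample on an n x n box of Z^2,
    with cluster structure tracked by union-find; returns cluster sizes."""
    parent = list(range(n * n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):
        parent[find(x)] = find(y)

    for i in range(n):
        for j in range(n):
            v = i * n + j
            if i + 1 < n and rng.random() < p:  # vertical edge open
                union(v, (i + 1) * n + j)
            if j + 1 < n and rng.random() < p:  # horizontal edge open
                union(v, i * n + j + 1)

    sizes = {}
    for v in range(n * n):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return sizes
```

Tracking how the large-cluster statistics evolve under repeated thinning-type iterations of this kind is the sort of experiment suggesting the $$\mathbb {Z}^2$$ versus $$\mathbb {Z}^3$$ disparity.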

### Remark 4.2

Instead of relying on Proposition 2.5 and working with cluster frequencies directly, one could instead write down a proof of the insertion tolerance of our measures $$\mu _i$$ (which is true though not completely immediate), then use [33, Theorem 4.1 and Lemma 6.4] of Lyons and Schramm almost as a black box. See also Remark 2.8.

### Remark 4.3

Reflecting on the proof of Theorem 1.2 may suggest that we do not use the full power of property (T), but rather the apparently weaker property that any $$\hbox {weak}^*$$ limit of weakly-mixing measures in $$M(\Gamma ,\Omega )$$ is ergodic. However, it is a result of Kechris [26, Theorem 12.8] that this property is equivalent to property (T); see also [28].

### Remark 4.4

Our proof strategy seems to break down if one wanted to prove that every infinite Kazhdan group has fixed price 1, or equivalently that $${\text {cost}}^*(\Gamma )=1$$ as defined in (1.2).

Section 2.1, the reduction part, continues to work in the FIID setting: Indeed, if one can construct a FIID process in $$M(\Gamma ,{\mathcal {U}}(\Gamma ))$$ with expected degree at most $$\varepsilon$$, then either proof of Proposition 2.1 will yield a process in $$F_{\mathrm {IID}}(\Gamma ,{\mathcal {S}}(\Gamma ))$$ with expected degree at most $$2+\varepsilon$$. (The fact that the WUSF is a FIID can be deduced from the ‘stack of arrows’ implementation of Wilson’s algorithm and its interpretation in terms of cycle-popping [7, 41].)
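For concreteness, the finite analogue of the 'stack of arrows' construction referenced above is Wilson's algorithm, which samples a uniform spanning tree of a finite connected graph by loop-erased random walks; a minimal sketch (the graph encoding is our own illustration, not from the text):

```python
import random

def wilsons_algorithm(adj, root, rng):
    """Sample a uniform spanning tree of a finite connected graph,
    given as an adjacency dict, via Wilson's algorithm."""
    in_tree = {root}
    parent = {}
    for start in adj:
        if start in in_tree:
            continue
        # Random walk from `start` until hitting the tree; recording only
        # the last exit from each vertex performs the loop erasure.
        u = start
        path_next = {}
        while u not in in_tree:
            v = rng.choice(adj[u])
            path_next[u] = v
            u = v
        # Retrace the loop-erased path and graft it onto the tree.
        u = start
        while u not in in_tree:
            in_tree.add(u)
            parent[u] = path_next[u]
            u = path_next[u]
    return parent
```

The wired uniform spanning forest arises as a limit of such samples over exhaustions, which is the route by which the cycle-popping interpretation yields the FIID property.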

On the other hand, it seems unlikely that the thinning procedure in the construction of Sect. 2.2 can be carried out using FIID processes. Indeed, as explained by Klaus Schmidt in the proof of Theorem 2.4 of [38], it was implicitly proved by Losert and Rindler [30] that the Markov operator for any generating set of a nonamenable group $$\Gamma$$ acting on $$L^2([0,1]^\Gamma ,\mathrm {Leb}^{\otimes \Gamma })$$ has a spectral gap, and hence that the Bernoulli shift is strongly ergodic. See [25, Section 3] and [3, Theorem 3.1] for related results. This spectral gap implies that the agreement probability for some pair of neighbours is bounded away from 1 in any FIID site percolation of fixed density $$p \in (0,1)$$, and this bound is clearly inherited by $$\hbox {weak}^*$$ limits. (More generally, it is a theorem of Abért and Weiss [2, Theorem 4] that any $$\hbox {weak}^*$$ limit of factors of a strongly ergodic process is ergodic.) Thus, by Proposition 2.3, on any nonamenable Cayley graph there exists $$i_{\mathrm {fiid}}<\infty$$ such that $$\mu _i$$ is not FIID for $$i > i_{\mathrm {fiid}}$$. There seems to be no reason to expect that $$i_{\mathrm {fiid}} =i_{\mathrm {freq}}$$ in the Kazhdan case, which would be needed to prove $${\text {cost}}^*(\Gamma )=1$$ via this strategy.

### Remark 4.5

It is perhaps better to think of the proof of Theorem 1.2 as a proof of the contrapositive of that theorem, i.e., as a proof that every countable group with cost $$>1$$ does not have property (T). Indeed, if $$\Gamma$$ is finitely generated with $${\text {cost}}(\Gamma )>1$$, then running our iterations with $$p>0$$ small enough we can never arrive at a unique infinite cluster, and hence we obtain an explicit sequence $$\mu _i \in E(\Gamma ,\Omega )$$ converging to the non-ergodic measure $$p \delta _\Gamma + (1-p)\delta _\emptyset$$.