Threshold for the expected measure of random polytopes

Let $\mu$ be a log-concave probability measure on ${\mathbb R}^n$ and for any $N>n$ consider the random polytope $K_N={\rm conv}\{X_1,\ldots ,X_N\}$, where $X_1,X_2,\ldots $ are independent random points in ${\mathbb R}^n$ distributed according to $\mu $. We study the question of whether there exists a threshold for the expected measure of $K_N$. Our approach is based on the Cramer transform $\Lambda_{\mu}^{\ast }$ of $\mu $. We examine the existence of moments of all orders for $\Lambda_{\mu}^{\ast }$ and establish, under some conditions, a sharp threshold for the expectation ${\mathbb E}_{\mu^N}[\mu (K_N)]$ of the measure of $K_N$: it is close to $0$ if $\ln N\ll {\mathbb E}_{\mu }(\Lambda_{\mu}^{\ast })$ and close to $1$ if $\ln N\gg {\mathbb E}_{\mu }(\Lambda_{\mu}^{\ast })$. The main condition is that the parameter $\beta(\mu)={\rm Var}_{\mu }(\Lambda_{\mu}^{\ast })/({\mathbb E}_{\mu }(\Lambda_{\mu }^{\ast }))^2$ should be small.


Introduction
We study the question of how to obtain a threshold for the expected measure of a random polytope defined as the convex hull of independent random points with a log-concave distribution. The general formulation of the problem is the following. Given a log-concave probability measure $\mu$ on ${\mathbb R}^n$, let $X_1,X_2,\ldots$ be independent random points in ${\mathbb R}^n$ distributed according to $\mu$ and for any $N>n$ define the random polytope
$$K_N={\rm conv}\{X_1,\ldots ,X_N\}.$$
Then, consider the expectation ${\mathbb E}_{\mu^N}[\mu(K_N)]$ of the measure of $K_N$, where $\mu^N=\mu\otimes\cdots\otimes\mu$ ($N$ times). This is an affinely invariant quantity, so we may assume that $\mu$ is centered, i.e. the barycenter of $\mu$ is at the origin.
Given $\delta\in (0,1)$ we say that $\mu$ satisfies a "$\delta$-upper threshold" with constant $\varrho_1$ if (1.1) holds true, and that $\mu$ satisfies a "$\delta$-lower threshold" with constant $\varrho_2$ if (1.2) holds true. Then, we define $\varrho_1(\mu,\delta)=\sup\{\varrho_1:(1.1)\ \hbox{holds true}\}$ and $\varrho_2(\mu,\delta)=\inf\{\varrho_2:(1.2)\ \hbox{holds true}\}$. Our main goal is to obtain upper bounds for the difference $\varrho(\mu,\delta):=\varrho_2(\mu,\delta)-\varrho_1(\mu,\delta)$ for any fixed $\delta\in \big(0,\frac{1}{2}\big)$. One may also consider a sequence $\{\mu_n\}_{n=1}^{\infty}$ of log-concave probability measures $\mu_n$ on ${\mathbb R}^n$. Then, we say that $\{\mu_n\}_{n=1}^{\infty}$ exhibits a "sharp threshold" if there exists a sequence $\{\delta_n\}_{n=1}^{\infty}$ of positive reals such that $\delta_n\to 0$ and $\varrho(\mu_n,\delta_n)\to 0$ as $n\to\infty$. This terminology may be used to describe a variety of results that have been obtained for specific sequences of measures (in most cases, product measures or rotationally invariant measures). In Section 2 we provide a non-exhaustive list of contributions to this topic; starting with the classical work [14] of Dyer, Füredi and McDiarmid, which concerns the uniform measure on the cube, most of them establish a sharp threshold.
Our aim is to describe a general approach to the problem, working with an arbitrary log-concave probability measure $\mu$ on ${\mathbb R}^n$. Our approach is based on the Cramer transform of $\mu$. Recall that the logarithmic Laplace transform of $\mu$ is defined by
$$\Lambda_{\mu}(\xi)=\ln\Big(\int_{{\mathbb R}^n}e^{\langle\xi,z\rangle}\,d\mu(z)\Big),\qquad \xi\in {\mathbb R}^n,$$
and the Cramer transform of $\mu$ is the Legendre transform of $\Lambda_{\mu}$, defined by
$$\Lambda_{\mu}^{*}(x)=\sup_{\xi\in {\mathbb R}^n}\big\{\langle x,\xi\rangle-\Lambda_{\mu}(\xi)\big\}.$$
For every $t>0$ we set $B_t(\mu):=\{x\in {\mathbb R}^n:\Lambda_{\mu}^{*}(x)\leqslant t\}$ and for any $x\in {\mathbb R}^n$ we denote by $\mathcal{H}(x)$ the set of all half-spaces $H$ of ${\mathbb R}^n$ containing $x$. Then we consider the function $\varphi_{\mu}$, called Tukey's half-space depth, defined by
$$\varphi_{\mu}(x)=\inf\{\mu(H):H\in\mathcal{H}(x)\}.$$
We refer the reader to the article of Nagy, Schütt and Werner [23] for an extensive and comprehensive survey on Tukey's half-space depth, with an emphasis on its connections with convex geometry, and many references. From the definition of $\Lambda_{\mu}^{*}$ one can easily check that for every $x\in {\mathbb R}^n$ we have $\varphi_{\mu}(x)\leqslant\exp(-\Lambda_{\mu}^{*}(x))$ (see Lemma 3.1 in Section 3 below). In particular, for any $t>0$ and for all $x\notin B_t(\mu)$ we have that $\varphi_{\mu}(x)\leqslant\exp(-t)$. A main idea, which appears in all the previous works on this topic, is to show that $\varphi_{\mu}$ is almost constant on the boundary $\partial(B_t(\mu))$ of $B_t(\mu)$. Our first main result shows that this is true, in general, if $\mu=\mu_K$ is the uniform measure on a centered convex body of volume $1$ in ${\mathbb R}^n$.

Theorem 1.1. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Then, for every $t>0$, the depth $\varphi_{\mu_K}$ is bounded from below on $B_t(\mu_K)$ by a quantity matching the upper bound $e^{-t}$ up to a factor $e^{-5\sqrt{n}}$. This implies that $\omega_{\mu_K}(x)-5\sqrt{n}\leqslant\Lambda_{\mu_K}^{*}(x)\leqslant\omega_{\mu_K}(x)$ for every $x\in {\mathbb R}^n$, where $\omega_{\mu_K}(x)=\ln(1/\varphi_{\mu_K}(x))$.

Theorem 1.1 may be viewed as a version of Cramér's theorem (see [13]) for random vectors uniformly distributed in convex bodies. We present the proof in Section 3. It exploits techniques from the theory of large deviations and a theorem of Nguyen [25] (proved independently by Wang [31]; see also [16]), which is exactly the ingredient that forces us to consider only uniform measures on convex bodies. It seems harder to prove, if true, an analogous estimate for any centered log-concave probability measure $\mu$ on ${\mathbb R}^n$; this is a basic question that our work leaves open.
The second step in our approach is to consider, for any centered log-concave probability measure $\mu$ on ${\mathbb R}^n$, the parameter
$$\beta(\mu)=\frac{{\rm Var}_{\mu}(\Lambda_{\mu}^{*})}{\big({\mathbb E}_{\mu}(\Lambda_{\mu}^{*})\big)^{2}}.$$
Roughly speaking, the plan is the following: provided that $\varphi_{\mu}$ is "almost constant" on $\partial(B_t(\mu))$ for all $t>0$ and that $\beta(\mu)=o_n(1)$, one can establish a "sharp threshold" for the expected measure of $K_N$, located at $\ln N\approx {\mathbb E}_{\mu}(\Lambda_{\mu}^{*})$. We make these ideas more precise in Section 5, where we also illustrate them with a number of examples. Note that it is not clear in advance that $\Lambda_{\mu}^{*}$ has bounded second or higher order moments, which is necessary for $\beta(\mu)$ to be well-defined. We study this question in Section 4, where we obtain an affirmative answer in the case of the uniform measure on a convex body. In fact we cover the more general case of $\kappa$-concave probability measures, $\kappa\in (0,1/n]$, which are supported on a centered convex body.

Theorem 1.2. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Let $\kappa\in (0,1/n]$ and let $\mu$ be a centered $\kappa$-concave probability measure with ${\rm supp}(\mu)=K$. Then, for all $p\geqslant 1$ we have that ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^{p}<\infty$.

The method of proof of Theorem 1.2 in fact gives reasonable upper bounds for $\|\Lambda_{\mu}^{*}\|_{L^p(\mu)}$. In particular, if we assume that $\mu=\mu_K$ is the uniform measure on a centered convex body, then we obtain a sharp two-sided estimate in the most interesting cases $p=1$ and $p=2$: this is Theorem 1.3, in which $L_{\mu_K}$ is the isotropic constant of the uniform measure $\mu_K$ on $K$ and $c_1,c_2>0$ are absolute constants.

The left-hand side inequality of Theorem 1.3 follows easily from one of the main results in [9]. Both the lower and the upper bound are of optimal order with respect to the dimension. This can be seen e.g. from the examples of the uniform measure on the cube and the Euclidean ball (see Section 5), respectively.
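Nothing in the paper prescribes an algorithm, but the parameter $\beta(\mu)$ can be approximated numerically in the simplest product case. The sketch below (function names and the discretization are our own choices, not the paper's) treats the uniform measure on the unit-volume cube $[-1/2,1/2]^n$: since $\Lambda^{*}$ is additive over coordinates for a product measure, $\beta$ reduces to one-dimensional integrals and decays like $1/n$.

```python
import math

def log_laplace(xi):
    # log-Laplace transform of the uniform measure on [-1/2, 1/2]:
    # Lambda(xi) = ln(sinh(xi/2) / (xi/2)), extended by Lambda(0) = 0.
    if abs(xi) < 1e-8:
        return xi * xi / 24.0  # second-order Taylor term at the origin
    return math.log(math.sinh(xi / 2.0) / (xi / 2.0))

def cramer(x, xi_max=240.0, steps=1200):
    # Legendre transform sup_xi (x*xi - Lambda(xi)) over a symmetric grid;
    # the objective is concave in xi, so a grid maximum is adequate here.
    return max(x * xi - log_laplace(xi)
               for xi in (xi_max * k / steps for k in range(-steps, steps + 1)))

def beta_cube(n, grid=120):
    # beta(mu) = Var(Lambda*) / (E Lambda*)^2 for the uniform measure on
    # [-1/2, 1/2]^n, using additivity of Lambda* over coordinates.
    xs = [-0.5 + (j + 0.5) / grid for j in range(grid)]   # midpoint rule
    vals = [cramer(x) for x in xs]
    e1 = sum(vals) / grid                                 # 1-dim mean
    v1 = sum((v - e1) ** 2 for v in vals) / grid          # 1-dim variance
    return (n * v1) / (n * e1) ** 2                       # = v1 / (n * e1^2)
```

With these conventions `beta_cube(n)` scales like $1/n$, consistent with the hope expressed below that $\beta(\mu)$ becomes small as the dimension increases.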
Besides Theorem 1.2, we show in Section 4 that $\Lambda_{\mu}^{*}$ has finite moments of all orders in the following cases:

(i) If $\mu$ is a centered probability measure on ${\mathbb R}$ which is absolutely continuous with respect to the Lebesgue measure, or a product of such measures.

(ii) If $\mu$ is a centered log-concave probability measure on ${\mathbb R}^n$ and there exists a function $g$ with $\lim_{t\to\infty}g(t)/\ln(t+1)=+\infty$ such that $Z_t^{+}(\mu)\supseteq g(t)Z_2^{+}(\mu)$ for all $t\geqslant 2$, where $\{Z_t^{+}(\mu)\}_{t\geqslant 1}$ is the family of one-sided $L_t$-centroid bodies of $\mu$.
Again, it seems harder to prove, if true, an analogous result for any centered log-concave probability measure µ on R n ; this is a second basic question that our work leaves open.
In Section 5 we describe the approach to the main problem and show how one can use the previous results to obtain bounds for $\varrho(\mu,\delta)$. We also clarify the role of the parameter $\beta(\mu)$. One would hope that $\beta(\mu)$ becomes small as the dimension increases, e.g. $\beta(\mu)\leqslant c/\sqrt{n}$. If so, then the next general result provides satisfactory lower bounds for $\varrho_1(\mu,\delta)$.
Theorem 1.4. Let $\mu$ be a centered log-concave probability measure on ${\mathbb R}^n$. Assume that $\beta(\mu)<1/8$ and $8\beta(\mu)<\delta<1$. Then the corresponding lower bound for $\varrho_1(\mu,\delta)$ holds.

We are able to give satisfactory upper bounds for $\varrho_2(\mu,\delta)$ in the case where $\mu=\mu_K$ is the uniform measure on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$.

Theorem 1.5. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$.

Combining these two results we see that, provided that $\beta(\mu_K)$ is small compared to a fixed $\delta\in (0,1)$, we have a threshold for $\ln N$ of the order of ${\mathbb E}_{\mu_K}(\Lambda_{\mu_K}^{*})$. The above discussion leaves open a third basic question: to estimate the parameter $\beta(\mu)$ for an arbitrary centered log-concave probability measure. We illustrate the method that we develop in this work with a number of examples. We consider first the standard examples of the uniform measure on the unit cube and the Gaussian measure. As a direct consequence of our results, in both cases we obtain a bound $\varrho(\mu,\delta)\leqslant c(\delta)/\sqrt{n}$ for the threshold, where $c(\delta)>0$ is a constant depending on $\delta$. Finally, we examine the case of the uniform measure on the Euclidean ball $D_n$ of volume $1$ in ${\mathbb R}^n$ and obtain the following sharp threshold.
Theorem 1.6. Let $D_n$ be the centered Euclidean ball of volume $1$ in ${\mathbb R}^n$. Then, the sequence $\mu_n:=\mu_{D_n}$ exhibits a sharp threshold with $\varrho(\mu_n,\delta)\leqslant c/\sqrt{\delta n}$, and e.g. in the case where $n$ is even we compute ${\mathbb E}_{\mu_n}(\Lambda_{\mu_n}^{*})$ explicitly.

Notation, background information and related literature

In this section we introduce notation and terminology that we use throughout this work, and provide background information on convex bodies and log-concave probability measures. We write $\langle\cdot,\cdot\rangle$ for the standard inner product in ${\mathbb R}^n$ and denote the Euclidean norm by $|\cdot|$. In what follows, $B_2^n$ is the Euclidean unit ball, $S^{n-1}$ is the unit sphere, and $\sigma$ is the rotationally invariant probability measure on $S^{n-1}$. Lebesgue measure in ${\mathbb R}^n$ is also denoted by $|\cdot|$. The letters $c,c',c_j,c_j'$ etc. denote absolute positive constants whose value may change from line to line. Whenever we write $a\approx b$, we mean that there exist absolute constants $c_1,c_2>0$ such that $c_1a\leqslant b\leqslant c_2a$.
We refer to Schneider's book [29] for basic facts from the Brunn-Minkowski theory and to the book [2] for basic facts from asymptotic convex geometry.We also refer to [10] for more information on isotropic convex bodies and log-concave probability measures.

Log-concave probability measures
A convex body in ${\mathbb R}^n$ is a compact convex set $K\subset {\mathbb R}^n$ with non-empty interior. We say that $K$ is centrally symmetric if $-K=K$ and that $K$ is centered if the barycenter ${\rm bar}(K)=\frac{1}{|K|}\int_K x\,dx$ of $K$ is at the origin. The Minkowski functional $\|\cdot\|_K$ of a convex body $K$ in ${\mathbb R}^n$ with $0\in{\rm int}(K)$ is defined for all $x\in {\mathbb R}^n$ by $\|x\|_K=\inf\{s\geqslant 0:x\in sK\}$, and the support function of $K$ is defined by $h_K(y)=\max\{\langle x,y\rangle:x\in K\}$. A function $f:{\mathbb R}^n\to [0,\infty)$ is called log-concave if its support is convex and $\ln f$ is concave on it; in particular, an integrable log-concave $f$ has finite moments of all orders. It is known (see [6]) that if a probability measure $\mu$ is log-concave and $\mu(H)<1$ for every hyperplane $H$ in ${\mathbb R}^n$, then $\mu$ has a log-concave density $f_{\mu}$. We say that $\mu$ is even if $\mu(-B)=\mu(B)$ for every Borel subset $B$ of ${\mathbb R}^n$, and that $\mu$ is centered if ${\rm bar}(\mu):=\int_{{\mathbb R}^n}\langle x,\xi\rangle f_{\mu}(x)\,dx=0$ for all $\xi\in S^{n-1}$. We shall use the fact that if $\mu$ is a centered log-concave probability measure on ${\mathbb R}^k$ then
$$\|f_{\mu}\|_{\infty}\leqslant e^{k}f_{\mu}(0).\qquad (2.1)$$
This is a result of Fradelizi from [15]. Note that if $K$ is a convex body in ${\mathbb R}^n$ then the Brunn-Minkowski inequality implies that the indicator function $1_K$ of $K$ is the density of a log-concave measure, the Lebesgue measure on $K$.
Let $\kappa\in [-\infty,1/n]$. A measure $\mu$ on ${\mathbb R}^n$ is called $\kappa$-concave if
$$\mu((1-\lambda)A+\lambda B)\geqslant\big((1-\lambda)\mu(A)^{\kappa}+\lambda\mu(B)^{\kappa}\big)^{1/\kappa}$$
for all compact subsets $A,B$ of ${\mathbb R}^n$ with $\mu(A)\mu(B)>0$ and all $\lambda\in (0,1)$, where the limiting cases are defined appropriately (for $\kappa=0$ the inequality reads $\mu((1-\lambda)A+\lambda B)\geqslant\mu(A)^{1-\lambda}\mu(B)^{\lambda}$, i.e. $\mu$ is log-concave). Similarly, a function $f:{\mathbb R}^n\to [0,\infty)$ is called $\gamma$-concave if
$$f((1-\lambda)x+\lambda y)\geqslant\big((1-\lambda)f(x)^{\gamma}+\lambda f(y)^{\gamma}\big)^{1/\gamma}$$
for all $x,y\in {\mathbb R}^n$ with $f(x)f(y)>0$ and all $\lambda\in (0,1)$. Again, we define the cases $\gamma=0,+\infty$ appropriately. Borell [7] studied the relation between $\kappa$-concave probability measures and $\gamma$-concave functions and showed that if $\mu$ is a measure on ${\mathbb R}^n$ and the affine subspace $F$ spanned by the support ${\rm supp}(\mu)$ of $\mu$ has dimension $\dim(F)=n$, then for every $-\infty\leqslant\kappa<1/n$ we have that $\mu$ is $\kappa$-concave if and only if it has a non-negative density $\psi\in L^1_{\rm loc}({\mathbb R}^n,dx)$ and $\psi$ is $\gamma$-concave, where $\gamma=\frac{\kappa}{1-\kappa n}\in [-1/n,+\infty)$.

Let $\mu$ and $\nu$ be two log-concave probability measures on ${\mathbb R}^n$. Let $T:{\mathbb R}^n\to {\mathbb R}^n$ be a measurable function which is defined $\nu$-almost everywhere and satisfies $\mu(B)=\nu(T^{-1}(B))$ for every Borel subset $B$ of ${\mathbb R}^n$. We then say that $T$ pushes forward $\nu$ to $\mu$ and write $T_{*}\nu=\mu$. It is easy to see that $T_{*}\nu=\mu$ if and only if for every bounded Borel measurable function $g:{\mathbb R}^n\to {\mathbb R}$ we have $\int g\,d\mu=\int g\circ T\,d\nu$.

If $\mu$ is a log-concave measure on ${\mathbb R}^n$ with density $f_{\mu}$, we define the isotropic constant of $\mu$ by
$$L_{\mu}:=\Big(\sup_{x\in{\mathbb R}^n}f_{\mu}(x)\Big)^{1/n}\,[\det{\rm Cov}(\mu)]^{1/2n},$$
where ${\rm Cov}(\mu)$ is the covariance matrix of $\mu$ with entries
$${\rm Cov}(\mu)_{ij}:=\int x_ix_j f_{\mu}(x)\,dx-\int x_i f_{\mu}(x)\,dx\int x_j f_{\mu}(x)\,dx.$$
We say that a log-concave probability measure $\mu$ on ${\mathbb R}^n$ is isotropic if it is centered and ${\rm Cov}(\mu)=I_n$, where $I_n$ is the identity $n\times n$ matrix. In this case, $L_{\mu}=\|f_{\mu}\|_{\infty}^{1/n}$. For every $\mu$ there exists an affine transformation $T$ such that $T_{*}\mu$ is isotropic. The hyperplane conjecture asks if there exists an absolute constant $C>0$ such that
$$L_n:=\max\{L_{\mu}:\mu\ \hbox{is an isotropic log-concave probability measure on}\ {\mathbb R}^n\}\leqslant C$$
for all $n\geqslant 1$.
Bourgain [8] established the upper bound $L_n\leqslant cn^{1/4}\ln n$; later, Klartag, in [21], improved this estimate to $L_n\leqslant cn^{1/4}$. In a breakthrough work, Chen [12] proved that for any $\varepsilon>0$ there exists $n_0(\varepsilon)\in {\mathbb N}$ such that $L_n\leqslant n^{\varepsilon}$ for every $n\geqslant n_0(\varepsilon)$. Very recently, Klartag and Lehec [22] showed that the hyperplane conjecture and the stronger Kannan-Lovász-Simonovits isoperimetric conjecture hold true up to a factor that is polylogarithmic in the dimension; more precisely, they achieved the bound $L_n\leqslant c(\ln n)^4$, where $c>0$ is an absolute constant.

Known results
Several variants of the threshold problem have been studied, starting with the work of Dyer, Füredi and McDiarmid, who established in [14] a sharp threshold for the expected volume of random polytopes with vertices uniformly distributed in the discrete cube $\{-1,1\}^n$. A similar result holds true for the expected volume of random polytopes with vertices uniformly distributed in the cube $B_{\infty}^n$; the corresponding value of the constant $\kappa$ is $\kappa=2\pi/e^{\gamma+1/2}$, where $\gamma$ is Euler's constant. In the terminology of the introduction, this last result is equivalent to a sharp threshold for any fixed value of $\delta\in\big(0,\frac{1}{2}\big)$. Further sharp thresholds for the volume of various classes of random polytopes have been given. In [18] a threshold for ${\mathbb E}_{\mu^N}|K_N|/(2\alpha)^n$ was established for the case where the $X_i$ have independent identically distributed coordinates supported on a bounded interval $[-\alpha,\alpha]$, under some mild additional assumptions. The articles [27] and [3], [4] address the same question for a number of cases where the $X_i$ have rotationally invariant densities. Exponential in the dimension upper and lower thresholds are obtained in [17] for the case where the $X_i$ are uniformly distributed in a simplex.
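None of the works cited above comes with code; as a toy illustration of the quantity being thresholded, the following sketch (entirely our own construction) Monte-Carlo-estimates ${\mathbb E}_{\mu^N}[\mu(K_N)]$ for the uniform measure on the unit square, where $\mu(K_N)$ is simply the area of the hull. Watching the estimate climb toward $1$ as $N$ grows is a low-dimensional shadow of the threshold phenomenon.

```python
import random

def hull_area(points):
    # Andrew's monotone-chain convex hull followed by the shoelace formula.
    pts = sorted(set(points))
    if len(pts) < 3:
        return 0.0
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    hull = lower[:-1] + upper[:-1]  # counter-clockwise vertex list
    return abs(sum(hull[i][0] * hull[(i + 1) % len(hull)][1]
                   - hull[(i + 1) % len(hull)][0] * hull[i][1]
                   for i in range(len(hull)))) / 2.0

def mean_hull_measure(n_points, trials=60, seed=0):
    # Monte Carlo estimate of E_{mu^N}[mu(K_N)] for mu uniform on [0,1]^2,
    # where the measure of the hull is just its area.
    rng = random.Random(seed)
    return sum(hull_area([(rng.random(), rng.random())
                          for _ in range(n_points)])
               for _ in range(trials)) / trials
```

For instance, `mean_hull_measure(10)` is noticeably below `mean_hull_measure(200)`; in high dimension this gradual growth sharpens into the thresholds discussed above.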
Upper and lower thresholds were obtained recently by Chakraborti, Tkocz and Vritsiou in [11] for some general families of distributions. If $\mu$ is an even log-concave probability measure supported on a convex body $K$ in ${\mathbb R}^n$ and if $X_1,X_2,\ldots$ are independent random points distributed according to $\mu$, then an upper threshold holds, where $c_1,c_2>0$ are absolute constants. A lower threshold is also established in [11] for the case where $\mu$ is an even $\kappa$-concave measure on ${\mathbb R}^n$ with $0<\kappa<1/n$, supported on a convex body $K$ in ${\mathbb R}^n$: if $X_1,X_2,\ldots$ are independent random points in ${\mathbb R}^n$ distributed according to $\mu$ and $K_N={\rm conv}\{X_1,\ldots,X_N\}$ as before, then for any $M\geqslant C$ and any $N\geqslant\exp\big(\frac{1}{\kappa}(\log n+2\log M)\big)$ a corresponding lower bound for ${\mathbb E}_{\mu^N}[\mu(K_N)]$ holds, where $C>0$ is an absolute constant.
Analogues of these results in the setting of the present work were obtained in [9] for $0$-concave, i.e. log-concave, probability measures: an upper threshold, with an absolute constant $c>0$, in which the supremum is taken over all log-concave probability measures $\mu$ on ${\mathbb R}^n$, and, for any $\delta\in (0,1)$, as $n\to\infty$, a lower threshold in which the infimum is taken over all log-concave probability measures $\mu$ on ${\mathbb R}^n$ and $C>0$ is an absolute constant.

It should be noted that an exponential in the dimension lower threshold is not possible in full generality. For example, in the case where the $X_i$ are uniformly distributed in the Euclidean ball one needs $N\geqslant\exp(cn\ln n)$ points so that the volume of a random $K_N$ will be significantly large. Thus, apart from the constants depending on $\delta$, the lower threshold above is sharp. However, it provides a weak threshold in the sense that it estimates the expectation ${\mathbb E}_{\mu^N}[\mu((1+\delta)K_N)]$ (for an arbitrarily small but positive value of $\delta$), while we would like to have a similar result for ${\mathbb E}_{\mu^N}[\mu(K_N)]$. It is shown in [9] that a threshold for ${\mathbb E}_{\mu^N}[\mu(K_N)]$ itself holds, with an absolute constant $C>0$, where the infimum is over all log-concave probability measures $\mu$ on ${\mathbb R}^n$ and $u(n)$ is any function with $u(n)\to\infty$ as $n\to\infty$.
Estimates for the half-space depth

Let $\mu$ be a centered log-concave probability measure on ${\mathbb R}^n$ with density $f:=f_{\mu}$. Recall that the logarithmic Laplace transform of $\mu$ on ${\mathbb R}^n$ is defined by
$$\Lambda_{\mu}(\xi)=\ln\Big(\int_{{\mathbb R}^n}e^{\langle\xi,z\rangle}\,d\mu(z)\Big).$$
It is easily checked by means of Hölder's inequality that $\Lambda_{\mu}$ is convex, and $\Lambda_{\mu}(0)=0$. Since ${\rm bar}(\mu)=0$, Jensen's inequality shows that $\Lambda_{\mu}(\xi)\geqslant 0$ for all $\xi$. Therefore, $\Lambda_{\mu}$ is a non-negative function. One can check that the set $\{\Lambda_{\mu}<\infty\}$ is convex. The function $\Lambda_{\mu}^{*}$ is the Legendre transform of $\Lambda_{\mu}$: recall that given a convex function $g:{\mathbb R}^n\to (-\infty,\infty]$, the Legendre transform $\mathcal{L}(g)$ of $g$ is defined by
$$\mathcal{L}(g)(x)=\sup_{y\in {\mathbb R}^n}\{\langle x,y\rangle-g(y)\}.$$

The function $\Lambda_{\mu}^{*}$ is called the Cramer transform of $\mu$. For every $t>0$ we define the convex set $B_t(\mu):=\{x\in {\mathbb R}^n:\Lambda_{\mu}^{*}(x)\leqslant t\}$. For any $x\in {\mathbb R}^n$ we denote by $\mathcal{H}(x)$ the set of all half-spaces $H$ of ${\mathbb R}^n$ containing $x$. Then we define
$$\varphi_{\mu}(x)=\inf\{\mu(H):H\in\mathcal{H}(x)\}.$$
The function $\varphi_{\mu}$ is called Tukey's half-space depth. Our aim is to give upper and lower bounds for $\varphi_{\mu}(x)$ when $x\in\partial(B_t(\mu))$, $t>0$.
Lemma 3.1. For every $x\in {\mathbb R}^n$ we have $\varphi_{\mu}(x)\leqslant\exp(-\Lambda_{\mu}^{*}(x))$.

Proof. Let $x\in {\mathbb R}^n$. We start with the observation that for any $\xi\in {\mathbb R}^n$ the half-space $H_{\xi}=\{z:\langle z,\xi\rangle\geqslant\langle x,\xi\rangle\}$ belongs to $\mathcal{H}(x)$. By the exponential Markov inequality,
$$\varphi_{\mu}(x)\leqslant\mu(H_{\xi})\leqslant e^{-\langle x,\xi\rangle}\int_{{\mathbb R}^n}e^{\langle z,\xi\rangle}\,d\mu(z)=\exp\big(-(\langle x,\xi\rangle-\Lambda_{\mu}(\xi))\big),$$
and taking the infimum over all $\xi\in {\mathbb R}^n$ we see that $\varphi_{\mu}(x)\leqslant\exp(-\Lambda_{\mu}^{*}(x))$, as claimed.
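Lemma 3.1 is easy to sanity-check numerically in dimension one, where the half-space depth of the uniform measure on $[-1/2,1/2]$ is exactly $\varphi(x)=\min\{x+1/2,\,1/2-x\}$. The discretization below is our own; note that a grid maximum only under-estimates $\Lambda^{*}$, which makes $e^{-\Lambda^{*}}$ larger, so the test of $\varphi\leqslant e^{-\Lambda^{*}}$ is conservative.

```python
import math

def log_laplace(xi):
    # log-Laplace transform of the uniform measure on [-1/2, 1/2]
    if abs(xi) < 1e-8:
        return xi * xi / 24.0  # Taylor expansion near the origin
    return math.log(math.sinh(xi / 2.0) / (xi / 2.0))

def cramer(x, xi_max=240.0, steps=1200):
    # grid approximation (a lower bound) of sup_xi (x*xi - Lambda(xi))
    return max(x * xi - log_laplace(xi)
               for xi in (xi_max * k / steps for k in range(-steps, steps + 1)))

def depth(x):
    # exact Tukey half-space depth of the uniform measure on [-1/2, 1/2]
    return min(x + 0.5, 0.5 - x)

# Lemma 3.1 asserts phi(x) <= exp(-Lambda*(x)); collect any violations
# on a grid strictly inside the support.
violations = [x for x in (i / 100.0 - 0.45 for i in range(91))
              if depth(x) > math.exp(-cramer(x)) + 1e-12]
```

An empty `violations` list is exactly the Chernoff-type inequality of the lemma, specialized to this measure.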
Next, we would like to obtain a lower bound for $\varphi_{\mu}(x)$ when $x\in B_t(\mu)$. In the case where $\mu=\mu_K$ is the uniform measure on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$, our estimate is the following.

Theorem 3.2. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Then, for every $t>0$, the depth $\varphi_{\mu_K}$ is bounded from below on $B_t(\mu_K)$ by a quantity matching the upper bound of Lemma 3.1 up to a factor $e^{-5\sqrt{n}}$.

The first part of the argument works for any centered log-concave probability measure $\mu$ with density $f$ on ${\mathbb R}^n$. For every $\xi\in {\mathbb R}^n$ we define the probability measure $\mu_{\xi}$ with density
$$f_{\xi}(z)=\frac{e^{\langle\xi,z\rangle}f(z)}{\int_{{\mathbb R}^n}e^{\langle\xi,w\rangle}f(w)\,dw}.$$
In the next lemma (see [10, Proposition 7.2.1]) we recall some basic facts for $\mu_{\xi}$.
Next, we set $\sigma_{\xi}^2:={\rm Var}_{\mu_{\xi}}(\langle\cdot\,,\xi\rangle)$. Let $t>0$. Since $B_t(\mu)$ is convex, in order to give a lower bound for $\inf\{\varphi_{\mu}(x):x\in B_t(\mu)\}$ it suffices to give a lower bound for $\mu(H)$, where $H$ is any closed half-space whose bounding hyperplane supports $B_t(\mu)$. In that case, $H=\{z:\langle z,\xi\rangle\geqslant\langle x,\xi\rangle\}$ for some $x\in\partial(B_t(\mu))$, with $\xi=\nabla\Lambda_{\mu}^{*}(x)$, or equivalently $x=\nabla\Lambda_{\mu}(\xi)$ (see e.g. Theorem 23.5 and Corollary 23.5.1 in [28]). The measure $\mu(H)$ is then estimated by passing to the tilted measure $\mu_{\xi}$: Markov's inequality controls the tail in terms of $\sigma_{\xi}$, and, since $x$ is the barycenter of $\mu_{\xi}$, Grünbaum's lemma (see [10, Lemma 2.2.6]) implies that $\mu_{\xi}(H)\geqslant 1/e$. Therefore, combining these estimates yields the desired lower bound for $\mu(H)$ in terms of $t$ and $\sigma_{\xi}$. We would now like an upper bound for $\sup_{\xi}\sigma_{\xi}$. We can obtain one when $\mu=\mu_K$ is the uniform measure on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$, using a theorem of Nguyen [25] (proved independently by Wang [31]; see also [16]).
Theorem 3.4. Let $\nu$ be a log-concave probability measure on ${\mathbb R}^n$ with density $g=\exp(-p)$, where $p$ is a convex function. Then, ${\rm Var}_{\nu}(p)\leqslant n$.

Using the fact that the density of $\mu_{\xi}$ on $K$ is proportional to $e^{\langle\xi,z\rangle}$, from Theorem 3.4 we get that $\sigma_{\xi}^2\leqslant n$. Then, combining (3.1), (3.2) and (3.3), for any bounding hyperplane $H$ of $B_t(\mu)$ we obtain the lower bound of Theorem 3.2, and taking into account Lemma 3.1 we have the next two-sided estimate.
Corollary 3.5. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Then, for every $x\in{\rm int}(K)$ we have that
$$\exp\big(-\Lambda_{\mu_K}^{*}(x)-5\sqrt{n}\big)\leqslant\varphi_{\mu_K}(x)\leqslant\exp\big(-\Lambda_{\mu_K}^{*}(x)\big).\qquad (3.5)$$
Note. A basic question that arises from the results of this section is whether an analogue of (3.5) holds true for any centered log-concave probability measure $\mu$ on ${\mathbb R}^n$. This would allow us to apply the next steps of the procedure that our approach suggests to all log-concave probability measures.

Moments of the Cramer transform
As explained in the introduction, we would like to know for which centered log-concave probability measures $\mu$ on ${\mathbb R}^n$ the function $\Lambda_{\mu}^{*}$ has finite moments of all orders. Our first result provides an affirmative answer in the case where $\mu=\mu_K$ is the uniform measure on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$. In fact, the next theorem covers a more general case.

Theorem 4.1. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Let $\kappa\in (0,1/n]$ and let $\mu$ be a centered $\kappa$-concave probability measure with ${\rm supp}(\mu)=K$. Then, for all $p\geqslant 1$ we have that ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^{p}<\infty$.

The proof of Theorem 4.1 is based on the next lemma, which is proved in [11, Lemma 7] in the symmetric case.
Lemma 4.2. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Let $\kappa\in (0,1/n]$ and let $\mu$ be a centered $\kappa$-concave probability measure with ${\rm supp}(\mu)=K$. Then, for every $x\in K$, the depth $\varphi_{\mu}(x)$ admits a lower bound in terms of $1-\|x\|_K$, where $\|x\|_K$ is the Minkowski functional of $K$.
Proof of Theorem 4.1. From Lemma 3.1 we know that $\varphi_{\mu}(x)\leqslant\exp(-\Lambda_{\mu}^{*}(x))$, or equivalently, $\Lambda_{\mu}^{*}(x)\leqslant\ln(1/\varphi_{\mu}(x))$ for all $x\in K$. From Lemma 4.2 we have a lower bound for $\varphi_{\mu}(x)$ in terms of $1-\|x\|_K$. Recall that the cone probability measure $\nu_K$ on the boundary $\partial(K)$ of a convex body $K$ with $0\in{\rm int}(K)$ is defined by
$$\nu_K(B)=\frac{|\{rx:x\in B,\,r\in [0,1]\}|}{|K|}$$
for all Borel subsets $B$ of $\partial(K)$. We shall use the identity
$$\int_{{\mathbb R}^n}g(x)\,dx=n|K|\int_0^{\infty}r^{n-1}\int_{\partial(K)}g(rx)\,d\nu_K(x)\,dr,$$
which holds for every integrable function $g:{\mathbb R}^n\to {\mathbb R}$ (see [24, Proposition 1]). Let $f$ denote the density of $\mu$ on $K$. Writing the moments of $\Lambda_{\mu}^{*}$ in these polar coordinates and using the above bounds, we conclude that they are all finite, and the proof is complete.
In the case of the uniform measure $\mu=\mu_K$ on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$, the lower bound of Lemma 4.2 takes a two-sided form with absolute constants $c_1,c_2>0$, and this gives a corresponding estimate for the moments of $\Lambda_{\mu_K}^{*}$ for all $p\geqslant 1$. However, essentially repeating the argument that we used for Theorem 4.1 we may obtain sharp estimates in the most interesting cases $p=1$ and $p=2$. We need the next lemma.
Proof. We consider the beta integral and differentiate it with respect to $y$. Then, the desired integrals can be expressed in terms of the digamma function $\psi(y)=\frac{\Gamma'(y)}{\Gamma(y)}$. Recall (see e.g. [1]) that $\psi(n+1)=-\gamma+\sum_{k=1}^{n}\frac{1}{k}$, where $\gamma$ is Euler's constant, and hence $\ln n<\psi(n+1)<\ln(n+1)$ for all $n\geqslant 1$. Therefore, the claimed estimate follows.
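The digamma facts used here are easy to check numerically. The sketch below (helper names are ours) approximates $\psi(y)=\Gamma'(y)/\Gamma(y)$ by a central difference of `math.lgamma` and compares it with the classical identity $\psi(n+1)=-\gamma+\sum_{k=1}^{n}1/k$, which in turn gives $\ln n<\psi(n+1)<\ln(n+1)$.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler's constant

def digamma(y, h=1e-5):
    # psi(y) = d/dy ln Gamma(y), approximated by a central difference
    return (math.lgamma(y + h) - math.lgamma(y - h)) / (2.0 * h)

def psi_via_harmonic(n):
    # classical identity: psi(n + 1) = -gamma + H_n
    return -EULER_GAMMA + sum(1.0 / k for k in range(1, n + 1))
```

The agreement of the two computations, and the bracketing of $\psi(n+1)$ between $\ln n$ and $\ln(n+1)$, are exactly the facts invoked in the proof above.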
Theorem 4.4. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$, $n\geqslant 2$. Let $\kappa\in (0,1/n]$ and let $\mu$ be a centered $\kappa$-concave probability measure with ${\rm supp}(\mu)=K$. Then $\|\Lambda_{\mu}^{*}\|_{L^2(\mu)}$ admits an upper bound in which $c>0$ is an absolute constant and $f$ is the density of $\mu$.

Proof. Following the proof of Theorem 4.1, if $f$ is the density of $\mu$ on $K$ and $\nu_K$ is the cone measure of $K$, then, using the elementary inequality $\ln^2(ab)\leqslant 2(\ln^2a+\ln^2b)$ for $a,b>0$, we may write the second moment of $\Lambda_{\mu}^{*}$ in polar coordinates and bound it as stated, where $c_1>0$ is an absolute constant. This completes the proof.
Our next result concerns the one-dimensional case. Let $\mu$ be a centered probability measure on ${\mathbb R}$ which is absolutely continuous with respect to the Lebesgue measure, and consider a random variable $X$, on some probability space $(\Omega,\mathcal{F},P)$, with distribution $\mu$, i.e., $\mu(B):=P(X\in B)$, $B\in\mathcal{B}({\mathbb R})$. We define $\alpha_{+}:=\sup\{t:\mu([t,\infty))>0\}$ and $\alpha_{-}:=\sup\{t:\mu((-\infty,-t])>0\}$; thus, $-\alpha_{-},\alpha_{+}$ are the endpoints of the support of $\mu$. Note that we may have $\alpha_{\pm}=+\infty$. In fact, since $tx-\Lambda_{\mu}(t)\leqslant 0$ for $t<0$ when $x\in [0,\alpha_{+})$, we have that $\Lambda_{\mu}^{*}(x)=\sup_{t\geqslant 0}\{tx-\Lambda_{\mu}(t)\}$ for such $x$.

Proposition 4.5. Let $\mu$ be as above and set $I_{\mu}={\rm supp}(\mu)$. Then
$$\int_{I_{\mu}}e^{\Lambda_{\mu}^{*}(x)/2}\,d\mu(x)\leqslant 4.$$
In particular, for all $p\geqslant 1$ we have that ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^p<\infty$.

Proof. From Lemma 3.1 we have $\Lambda_{\mu}^{*}(x)\leqslant\ln(1/\varphi_{\mu}(x))$, and in dimension one $\varphi_{\mu}(x)=\min\{F(x),1-F(x^{-})\}$, where $F$ is the distribution function of $\mu$. It follows that
$$\int_{I_{\mu}}e^{\Lambda_{\mu}^{*}(x)/2}\,d\mu(x)\leqslant\int_{I_{\mu}}F(x)^{-1/2}\,d\mu(x)+\int_{I_{\mu}}(1-F(x))^{-1/2}\,d\mu(x).\qquad (4.2)$$
Write $f$ for the density of $\mu$ with respect to the Lebesgue measure. Then, $(1-F)'(x)=-f(x)$, which implies that
$$\int_{I_{\mu}}(1-F(x))^{-1/2}f(x)\,dx=\Big[-2(1-F(x))^{1/2}\Big]_{-\alpha_{-}}^{\alpha_{+}}=2.$$
In the same way, since $F'(x)=f(x)$, we obtain the same upper bound for the first summand in (4.2), and the result follows.
Proposition 4.5 can be extended to products. Let $\mu_i$, $1\leqslant i\leqslant n$, be centered probability measures on ${\mathbb R}$, all of them absolutely continuous with respect to the Lebesgue measure, and let $\mu=\mu_1\otimes\cdots\otimes\mu_n$. Since the logarithmic Laplace transform of a product measure splits into a sum, we have $\Lambda_{\mu}^{*}(x)=\sum_{i=1}^{n}\Lambda_{\mu_i}^{*}(x_i)$ for all $x=(x_1,\ldots,x_n)\in I_{\mu}$, which implies that $\Lambda_{\mu}^{*}$ has finite moments of all orders; in particular, for all $p\geqslant 1$ we have that ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^{p}<\infty$.

We close this section with one more case where we can establish that $\Lambda_{\mu}^{*}$ has finite moments of all orders. We consider an arbitrary centered log-concave probability measure on ${\mathbb R}^n$, but we have to impose some conditions on the growth of its one-sided $L_t$-centroid bodies $Z_t^{+}(\mu)$. Recall that for every $t\geqslant 1$, the one-sided $L_t$-centroid body $Z_t^{+}(\mu)$ of $\mu$ is the convex body with support function
$$h_{Z_t^{+}(\mu)}(y)=\Big(2\int_{{\mathbb R}^n}\langle x,y\rangle_{+}^{t}\,d\mu(x)\Big)^{1/t}.$$
For a proof of these claims see [19]. The condition we need is that the family of the one-sided $L_t$-centroid bodies grows at some mild rate as $t\to\infty$ (note that the assumption in the next proposition can be satisfied only for log-concave probability measures $\mu$ with support ${\rm supp}(\mu)={\mathbb R}^n$).
Proposition 4.6. Let $\mu$ be a centered log-concave probability measure on ${\mathbb R}^n$. Assume that there exists an increasing function $g$ with $\lim_{t\to\infty}g(t)/\ln(t+1)=+\infty$ such that $Z_t^{+}(\mu)\supseteq g(t)Z_2^{+}(\mu)$ for all $t\geqslant 2$. Then ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^{p}<\infty$ for every $p\geqslant 1$.
Proof. We use the following fact, proved in [9, Lemma 4.3]: if $t\geqslant 1$, then every $x\in\frac{1}{2}Z_t^{+}(\mu)$ satisfies a lower bound for $\varphi_{\mu}(x)$ which is exponential in $t$. Since $\lim_{t\to\infty}g(t)=+\infty$, there exists $t_0\geqslant c_1$ such that $\mu\big(\frac{g(t_0/c_1)}{2}Z_2^{+}(\mu)\big)\geqslant 2/3$. From Borell's lemma [10, Lemma 2.4.5] we know that, for all $t\geqslant t_0$, the measure of the complement of $\frac{g(t/c_1)}{2}Z_2^{+}(\mu)$ decays exponentially in $g(t/c_1)$, where $c_2>0$ is an absolute constant. Since $\lim_{t\to\infty}g(t)/\ln(t+1)=+\infty$, there exists $t_p\geqslant t_0$ beyond which the resulting tail bound is summable against $t^{p-1}$. Assume that $p>2$. Then, from the previous observations we get ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})^{p}<\infty$. This proves the result for $p>2$, and then from Hölder's inequality it is clear that the assertion of the proposition is also true for all $p\geqslant 1$.
Note. It is not hard to construct examples of log-concave probability measures, even on the real line, with full support for which the assumption of Proposition 4.6 is not satisfied. Consider for example a measure $\mu$ on ${\mathbb R}$ with density $f(x)=c\cdot\exp(-p(x))$, where $p$ is an even convex function rapidly increasing to infinity.
However, this does not exclude the possibility that for every centered log-concave probability measure $\mu$ on ${\mathbb R}^n$ the function $\Lambda_{\mu}^{*}$ has finite second- or higher-order moments.

Threshold for the measure: the approach and examples
For any log-concave probability measure $\mu$ on ${\mathbb R}^n$ we define the parameter
$$\beta(\mu):=\frac{{\rm Var}_{\mu}(\Lambda_{\mu}^{*})}{\big({\mathbb E}_{\mu}(\Lambda_{\mu}^{*})\big)^{2}}.\qquad (5.1)$$
One of the main results in [9] states an upper threshold for every log-concave probability measure $\mu$ on ${\mathbb R}^n$, where $c>0$ is an absolute constant. In fact, the proof of this estimate starts with Lemma 3.1 and follows from a stronger pointwise estimate, valid for $n\geqslant n_0$, involving the isotropic constant $L_{\mu}$ of $\mu$, where $c>0$, $n_0\in {\mathbb N}$ are absolute constants. Then, Jensen's inequality implies a lower bound for ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})$, which we will need.
Lemma 5.1. Let $\mu$ be a log-concave probability measure on ${\mathbb R}^n$, $n\geqslant n_0$. Then ${\mathbb E}_{\mu}(\Lambda_{\mu}^{*})\geqslant c\,n/L_{\mu}^{2}$, where $L_{\mu}$ is the isotropic constant of $\mu$ and $c>0$, $n_0\in {\mathbb N}$ are absolute constants.
We will also need a number of observations in the case $\mu=\mu_K$, where $K$ is a centered convex body of volume $1$ in ${\mathbb R}^n$. The next lemma provides a lower bound for ${\rm Var}(\Lambda_{\mu_K}^{*})$.

Lemma 5.2. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Then ${\rm Var}(\Lambda_{\mu_K}^{*})$ admits a lower bound in which $c>0$ is an absolute constant.
Proof. Borell has proved in [5, Theorem 1] that if $T$ is a convex body in ${\mathbb R}^n$ and $f$ is a non-negative, bounded and convex function on $T$, not identically zero and with $\min(f)=0$, then a suitable normalization of the function $p\mapsto\big(\int_T f^p\,dx\big)^{1/p}$ is monotone in $p$. Consider a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$. Applying Borell's theorem to the function $\Lambda_{\mu_K}^{*}$ on $rK$, $r\in (0,1)$, for the triple of exponents $p=0,1,2$, and finally letting $r\to 1^{-}$, we obtain a comparison between the first and second moments of $\Lambda_{\mu_K}^{*}$. Then, taking into account Lemma 5.1 we obtain the result.
Recall the definition of $\omega_{\mu_K}=\ln(1/\varphi_{\mu_K})$ in (3.4) and consider the parameter $\tau(\mu_K)$ defined in terms of $\omega_{\mu_K}$. The next lemma shows that we can estimate $\beta(\mu_K)$ if we can compute $\tau(\mu_K)$.

Lemma 5.3. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$. Then $\beta(\mu_K)$ can be estimated in terms of $\tau(\mu_K)$.

Proof. From Corollary 3.5 we know that if $K$ is a centered convex body of volume $1$ in ${\mathbb R}^n$ then $\omega_{\mu_K}(x)-5\sqrt{n}\leqslant\Lambda_{\mu_K}^{*}(x)\leqslant\omega_{\mu_K}(x)$ for every $x\in{\rm int}(K)$. Using Lemma 5.1, together with the definitions of $\beta(\mu_K)$ and $\tau(\mu_K)$, we may pass from $\omega_{\mu_K}$ to $\Lambda_{\mu_K}^{*}$ at the cost of absolute constants, and combining the above we obtain the assertion.

Recall that $B_t(\mu)=\{x:\Lambda_{\mu}^{*}(x)\leqslant t\}$, where $\Lambda_{\mu}^{*}$ is the Cramer transform of $\mu$. A version of the next lemma appeared originally in [14].
Lemma 5.4. Let $t>0$. For every $N>n$, the quantity ${\mathbb E}_{\mu^N}[\mu(K_N)]$ is bounded above in terms of $\mu(B_t(\mu))$, $N$ and $\sup\{\varphi_{\mu}(x):x\notin B_t(\mu)\}\leqslant e^{-t}$.

We use the lemma in the following way. Let $m:={\mathbb E}_{\mu}(\Lambda_{\mu}^{*})$. Then, for all $\varepsilon\in (0,1)$, from Chebyshev's inequality we have that
$$\mu\big(\{x:|\Lambda_{\mu}^{*}(x)-m|\geqslant\varepsilon m\}\big)\leqslant\frac{{\rm Var}_{\mu}(\Lambda_{\mu}^{*})}{(\varepsilon m)^{2}}=\frac{\beta(\mu)}{\varepsilon^{2}}.$$
Equivalently, $\mu\big(B_{(1-\varepsilon)m}(\mu)\big)\leqslant\beta(\mu)/\varepsilon^{2}$, since $B_{(1-\varepsilon)m}(\mu)\subseteq\{x:|\Lambda_{\mu}^{*}(x)-m|\geqslant\varepsilon m\}$. We distinguish two cases according to the size of $N$; in each of them, applying Lemma 5.4 with $t=(1-\varepsilon)m$ and choosing $\varepsilon$ appropriately in terms of $\beta(\mu)$ and $\delta$ (note that $\varepsilon<1$ must hold), we conclude that ${\mathbb E}_{\mu^N}[\mu(K_N)]$ is small for all $N$ up to the exponential scale $e^{(1-\varepsilon)m}$. We summarize the above in the next theorem.
Theorem 5.5. Let $\mu$ be a log-concave probability measure on ${\mathbb R}^n$.

(i) Let $\beta(\mu)<1/8$ and $8\beta(\mu)<\delta<1$. Then the corresponding lower bound for $\varrho_1(\mu,\delta)$ holds.

Remark 5.6. Paouris and Valettas have proved in [26, Theorem 5.6] that if $\mu$ is a log-concave probability measure on ${\mathbb R}^n$ and $p$ is a convex function on ${\mathbb R}^n$, then a deviation inequality (5.4) holds for all $t>0$, where $M(p)$ is a median of $p$ with respect to $\mu$; it may be used in place of Chebyshev's inequality, provided that $t\geqslant\beta(\mu)M\ln(2/\delta)$. Now, we restrict our attention to the case where $\mu=\mu_K$ is the uniform measure on a centered convex body $K$ of volume $1$ in ${\mathbb R}^n$; then Lemma 5.2 provides the required variance bound. A natural question is to examine how close ${\mathbb E}(\Lambda_{\mu}^{*})$ and $M(\Lambda_{\mu}^{*})$ are; this would allow us to compare (5.6) with Theorem 5.5, at least in the case of the uniform measure on a convex body. From [10, Lemma 2.4.10] we know a comparison between medians and expectations valid for any Borel probability measure $\mu$ on ${\mathbb R}^n$. Therefore, if we assume that $\beta(\mu_K)\leqslant\eta$ for some small enough $\eta\in (0,1)$, we obtain a variant of Theorem 5.5 with a much better dependence on $\delta$.
Theorem 5.7. Let $K$ be a centered convex body of volume $1$ in ${\mathbb R}^n$ and let $\delta\in (0,1)$ and $\eta\in (0,1/9)$. If $\beta(\mu_K)\leqslant\eta$, then the corresponding threshold bounds hold with improved dependence on $\delta$.

For the proof of the lower threshold we need a basic fact that plays a main role in the proof of all the lower thresholds that have been obtained so far. It is stated in the form below in [11, Lemma 3]. For a proof see [14] or [18, Lemma 4.1].
Lemma 5.8. For every Borel subset $A$ of ${\mathbb R}^n$, the expectation ${\mathbb E}_{\mu^N}[\mu(K_N)]$ is bounded below in terms of $\mu(A)$, $N$ and $\inf\{\varphi_{\mu}(x):x\in A\}$.

In order to apply Lemma 5.8 we note that if $m:={\mathbb E}_{\mu}(\Lambda_{\mu}^{*})$ then, as before, for all $\varepsilon\in (0,1)$, from Chebyshev's inequality we have that
$$\mu\big(\{x:\Lambda_{\mu}^{*}(x)\geqslant (1+\varepsilon)m\}\big)\leqslant\frac{\beta(\mu)}{\varepsilon^{2}}.$$
If $\beta(\mu)<1/2$ and $2\beta(\mu)<\delta<1$ then, choosing $\varepsilon=\sqrt{2\beta(\mu)/\delta}$, we have that $\mu\big(B_{(1+\varepsilon)m}(\mu)\big)\geqslant 1-\delta/2$. Therefore, the desired lower bound will hold for all $N\geqslant N_0:=\exp((1+2\varepsilon)m)$, provided that the error term coming from Lemma 5.8 is small. Recall that in the case of the uniform measure on a centered convex body of volume $1$, Theorem 3.2 gives a lower bound of the order $e^{-t-5\sqrt{n}}$ for $\inf\{\varphi_{\mu_K}(x):x\in B_t(\mu_K)\}$. We require that $n$ and $m$ are large enough so that $1/2^{n}<\delta/2$ and the factor coming from $5\sqrt{n}$ is absorbed. Setting $x:=N/n$ we see that this last requirement is equivalent to
$$e^{(1+3\varepsilon/2)m}<\frac{x-1}{10\ln(2ex)}.$$
One can now check that if $N\geqslant\exp((1+2\varepsilon)m)$ then all the restrictions are satisfied if we assume that $n/L_{\mu_K}^{2}\geqslant c_2\ln(2/\delta)\,\delta/\beta(\mu_K)$. In this way we obtain Theorem 5.9, our upper bound for $\varrho_2(\mu_K,\delta)$ in this setting.

An estimate analogous to the one in Theorem 5.5 (ii) is also possible, but we shall not go through the details. From the discussion in this section it is clear that our approach is able to provide good bounds for the threshold $\varrho(\mu,\delta)$ if the parameter $\beta(\mu)$ is small, especially if $\beta(\mu)=o_n(1)$ as the dimension increases. We illustrate this with a number of examples. For the uniform measure $\mu_{C_n}$ on the unit-volume cube one has $\beta(\mu_{C_n})\to 0$ as $n\to\infty$. Then, Theorem 5.5 and Theorem 5.9 show that for any $\delta\in (0,1)$ there exists $n_0(\delta)$ such that, for any $n\geqslant n_0$, $\varrho_1(\mu_{C_n},\delta)$ and $\varrho_2(\mu_{C_n},\delta)$ satisfy the corresponding bounds. Combining the above we get the bound $\varrho(\mu_{C_n},\delta)\leqslant c(\delta)/\sqrt{n}$ stated in the introduction, where $c(\delta)>0$ is a constant depending only on $\delta$.
We close this article with the example of the uniform measure on the Euclidean ball. It was proved in [3] that if $\varepsilon\in (0,1)$ and $K_N={\rm conv}\{x_1,\ldots,x_N\}$, where $x_1,\ldots,x_N$ are random points independently and uniformly chosen from the Euclidean ball of volume $1$, then the expected volume of $K_N$ exhibits a threshold. We shall obtain a similar conclusion with the approach of this work; the estimate below is in fact stronger, since it sharpens the width of the threshold from $O(1)$ to $O(1/\sqrt{n})$.

Theorem 5.12. Let $D_n$ be the centered Euclidean ball of volume $1$ in ${\mathbb R}^n$. Then, the sequence $\mu_n:=\mu_{D_n}$ exhibits a sharp threshold with $\varrho(\mu_n,\delta)\leqslant c/\sqrt{\delta n}$, and e.g. if $n$ is even then ${\mathbb E}_{\mu_n}(\Lambda_{\mu_n}^{*})$ can be computed explicitly.

Proof. Note that if $K$ is a centered convex body in ${\mathbb R}^n$ and $r>0$ then $\Lambda_{\mu_{rK}}^{*}(x)=\Lambda_{\mu_K}^{*}(x/r)$ for all $x\in {\mathbb R}^n$, where $\mu_{rK}$ is the uniform measure on $rK$. Using also the fact that $\binom{N}{n}\leqslant e^{-1}\big(\frac{eN}{n}\big)^{n}$, we see that (5.7) will be satisfied if we also have
$$2\Big(\frac{eN}{n}\Big)^{n}\exp\Big(-\frac{N-n}{10}\,e^{-(1+3\varepsilon/2)m}\Big)<1.$$