Abstract
Let \(X_{1},X_{2},\ldots \) be independent random variables with \({E}X_{k}=0\) and \(\sigma _{k}^{\,2}:={E}X_{k}^2<\infty \) \((k\ge 1)\). Set \(S_k=X_1+\cdots +X_k\) and assume that \(s_{k}^{\,2}:={E}S_k^2\rightarrow \infty \). We prove that under the Kolmogorov condition
we have
for any almost everywhere continuous function \(f: {\mathbb R} \rightarrow {\mathbb R}\) satisfying \(|f(x)|\le e^{\gamma x^2}\), \(\gamma <1/2\). We also show that if the o in (1) is replaced by O, then relation (2) becomes generally false. Finally, in the case when (1) is not assumed, we give an optimal condition for (2) in terms of the remainder term in the Wiener approximation of the partial sum process \(\{S_n, \, n\ge 1\}\) by a Wiener process.
1 Introduction
Let \(X_1,X_2,\ldots \) be i.i.d. random variables with \({E}X_1=0\), \({E}X_1^2=1\) and put \(S_n=X_1+\cdots +X_n\). In its simplest form, the almost sure central limit theorem states that for any \(x \in \mathbb R\)
where I denotes the indicator function and \(\Phi (x)\) is the standard normal distribution function. Relation (3) was proved by Brosamler [7] and Schatte [17] under slightly stronger moment assumptions and by Lacey and Philipp [11] assuming only finite variance.
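In its classical form, (3) is the logarithmic-average statement \(\lim _{N\rightarrow \infty }\frac{1}{\log N}\sum _{n=1}^{N}\frac{1}{n}\, I\{S_n\le x\sqrt{n}\}=\Phi (x)\) a.s. The following minimal simulation sketch (not part of the paper; it uses Rademacher increments and averages a few independent paths, since the \(\log N\) normalization makes single-path convergence very slow) illustrates the phenomenon:

```python
import numpy as np
from math import erf, log, sqrt

def Phi(x):
    """Standard normal distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def log_average(S, x):
    """(1/log N) * sum_{n<=N} (1/n) * 1{S_n <= x*sqrt(n)}, the ASCLT average."""
    n = np.arange(1, len(S) + 1)
    return float(np.sum((S <= x * np.sqrt(n)) / n) / log(len(S)))

rng = np.random.default_rng(2024)
N, paths, xs = 200_000, 20, (-1.0, 0.0, 1.0)
est = {x: 0.0 for x in xs}
for _ in range(paths):
    # Rademacher walk: increments +-1, so EX = 0, EX^2 = 1
    S = np.cumsum(rng.choice((-1.0, 1.0), size=N))
    for x in xs:
        est[x] += log_average(S, x) / paths
for x in xs:
    print(f"x = {x:+.0f}: log-average {est[x]:.3f}  vs  Phi(x) = {Phi(x):.3f}")
```

Even at \(N=2\cdot 10^5\) the normalization \(\log N\approx 12.2\) leaves visible fluctuations on a single path; the averaging over paths is only a visualization aid and is not part of the theorem.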
In past decades, a wide theory of a.s. limit theorems of the type (3) has been developed, and several extensions and improvements of (3) were proved. In particular, it turned out that under mild moment assumptions (see (5)), any weak limit theorem
for partial sums \(S_n\) of independent random variables \(X_1, X_2, \ldots \) has an almost sure version
where the weights \(d_n\) are determined by the norming sequence \((b_n)\) and \(D_n=\sum _{k=1}^n d_k\). For \((b_n)\) growing with polynomial speed we have \(d_n=1/n\) (see Berkes and Dehling [6]); assuming only \(b_n\uparrow \infty \), (4) holds with \(d_k= (b_{k+1}-b_k)/b_k\) (Ibragimov and Lifshits [9], Lifshits [12]). The last paper also proves the optimality of the moment assumption
assumed in [6, 12], showing the surprising role of loglog moments in ASCLT theory.
In particular, if \(X_1, X_2, \ldots \) are independent random variables with \({E}X_n=0\), \(\sigma _n^2={E}X_n^2<\infty \), \(s_n^2={E}S_n^2\) and \((X_n)\) satisfies the Lindeberg condition
then we have
See Atlagh [1], Atlagh and Weber [2] and Major [15]. Formally, the weights in (6) are different from those in (3), but letting \(\{S(t), \, t\ge 0\}\) denote the partial sum process defined by
relation (6) implies
and thus, we have again an a.s. limit theorem involving logarithmic averages. For critical weight sequences in ASCLT theory, see Hörmann [8]. For an extension to nonlinear limit theorems of the form
see Berkes and Csáki [4].
A generalization of (3) in another direction was given by Schatte [18], who proved that if \(X_1, X_2, \ldots \) are i.i.d. random variables with \({E}X_1=0\), \({E}X_1^2=1\), \({E}|X_1|^3<\infty \) and \(f: {\mathbb R}\rightarrow {\mathbb R}\) is a function almost everywhere continuous with respect to the Lebesgue measure satisfying
then letting \(S_n=\sum _{k=1}^n X_k\) we have
Berkes, Csáki and Horváth [5] showed that \(\gamma <1/4\) in (8) can be replaced by \(\gamma <1/2\) and that the assumption \({E}|X_1|^3<\infty \) can be dropped. Ibragimov and Lifshits [10] proved that the finiteness of the integral in (9), together with the assumption that \(f(x) e^{-Hx^2}\) is nonincreasing for some \(H>0\), is sufficient for (9); on the other hand, the finiteness of the integral alone is not, even if f is continuous and \(P(X_1=1)=P(X_1=-1)=1/2\). The purpose of the present paper is to study the problem for independent, not necessarily identically distributed random variables with finite variances and to give an optimal condition for the ASCLT. More precisely, we will prove the following theorem.
Theorem 1
Let \(X_{1},X_{2},\ldots \) be independent random variables with \({E}X_{n}=0\) and \({E}X_{n}^2=:\sigma _{n}^{\,2}<\infty \). Set \(S_n=X_1+\cdots +X_n\) and \(s_{n}^{\,2}:={E}S_n^2\). Assume that \(s_{n}^{\,2}\rightarrow \infty \) and
Then, we have
for any almost everywhere continuous function \(f:\mathbb {R}\rightarrow \mathbb {R}\) satisfying the growth condition (12), i.e., \(|f(x)|\le e^{\gamma x^2}\) for some \(\gamma <1/2\).
If (10) only holds with O instead of o, then (11) is generally false.
In terms of the bounds for the random variables \(X_n\), Theorem 1 provides an optimal condition for the ASCLT (11). Note that condition (10) is Kolmogorov’s classical condition for the LIL
(see, e.g., [13], p. 272) which, in view of the results of Marcinkiewicz and Zygmund [16] and Weiss [20], is also optimal. This indicates a strong connection between the LIL and the ASCLT, a connection which will become clear from the proof of Theorem 1. As we will see, however, the LIL (13) itself does not imply (11), and neither does the strong approximation
where W is a Wiener process defined on the same probability space as the sequence \((X_n)\) and \(\{S(t), t\ge 0\}\) is the partial sum process defined by (7). The approximation (14) was proved by Strassen [19] in the case of i.i.d. sequences \((X_n)\) with mean 0 and variance 1, and he showed that it implies a large class of refinements of the LIL. Major [14] proved that (14) also holds under the Kolmogorov condition (10). As we will see, in the case when (10) does not hold, the validity of (11) is closely connected with the order of Wiener approximability of the partial sums \(S_n\). Namely, we will prove the following results.
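For the reader's convenience, we recall the classical statements invoked here in their standard textbook forms (cf. [13], p. 272, and [19]); the paper's numbered displays (10), (13) and (14) are of this kind, possibly up to inessential details of formulation:

```latex
% Kolmogorov's condition (cf. (10)): a deterministic bound on the summands,
%   |X_n| \le \epsilon_n\, s_n (\log\log s_n^2)^{-1/2} \ \text{a.s.},
%   \quad \epsilon_n \to 0.
% Under it, Kolmogorov's LIL (cf. (13)) holds:
\limsup_{n\to\infty} \frac{S_n}{\bigl(2 s_n^2 \log\log s_n^2\bigr)^{1/2}} = 1
   \quad \text{a.s.},
% and, by Strassen [19] and Major [14], the strong approximation (cf. (14)):
S(t) - W(t) = o\bigl((t \log\log t)^{1/2}\bigr) \quad \text{a.s.}
```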
Theorem 2
Let \(X_{1},X_{2},\ldots \) be independent, zero mean random variables and define \(\sigma _n^2\) and \(s_n^2\) as in Theorem 1. Assume that
and there exists a Wiener process W such that
Then, for any a.e. continuous function \(f: {\mathbb R} \rightarrow {\mathbb R}\) satisfying (12) we have (11). On the other hand, replacing (16) with
with any fixed function \(\psi (n)\rightarrow \infty \), (11) becomes generally false.
As the proof will show, the first half of Theorem 2 remains valid if \(f(x) = e^{\max \{x^2/2 - cx, 0\}}\), \(c > 0\).
Theorem 3
Under the setup of Theorem 2, assume that (15) holds and there exists a Wiener process W such that
Then (11) holds for any measurable function \(f:{\mathbb R}\rightarrow {\mathbb R}\) such that the integral on the right side of (11) is finite and \(f(x)=e^{x^2/2} g(x)\), where \(\log g(x)\) is uniformly continuous on \(\mathbb R\).
2 Proof of the theorems
Proof of Theorem 1
By an argument in Schatte [18], it suffices to prove the theorem in the case \(f(x)=e^{\gamma x^2}\), \(0<\gamma <1/2\). We first note that by the Kolmogorov exponential bounds (see, e.g., Loève [13], p. 266) the assumptions made on the sequence \((X_n)\) in Theorem 1 imply that given any \(\epsilon >0\), \(K>0\), we have for \(n\ge n_0(\epsilon , K)\)
For \(1\le k\le l \), we put
For a fixed \(\delta >0\), we set
and define the truncated variable
Lemma 1
There exist a constant \(\eta >0\) and an integer \(k_0\) such that
and
Proof
We follow [10]. Define, similarly to (21),
and put
Clearly \(\xi _l - \xi _{k, l} = \frac{s_k}{s_l} \xi _k\) and thus using the mean value theorem and \(f'(x) = 2\gamma x f(x)\), we get
Thus, the integral of |X| on the set \(\{|\xi _k| \le \left( s_l/s_k\right) ^{1/2}\}\) is at most
On the other hand, \(|X|\le 2 f(M_l)\) by (25), and thus the integral of |X| on the set \(\{|\xi _k| > (s_l/s_k)^{1/2}\}\), whose probability is at most \(s_k/s_l\) by the Chebyshev inequality, is at most \((s_k/s_l) 2f(M_l)\). Therefore,
On the other hand, the independence of \(\hat{\xi }_k\) and \(\hat{\xi }_{k,l}\) implies that
and thus
Now choosing \(\delta \) sufficiently small, we have \((2+\delta )\gamma <1\) and thus \(M_l f(M_l)^2 \le (\log s_l)^{2-\eta }\) for some constant \(\eta >0\) and \(l\ge k\ge k_0\). Together with (26), this yields (22).
Now we prove (23). Let \(F_k^+\) denote the distribution function of \(|\xi _k|\). Integration by parts and using \(f(x)=e^{\gamma x^2}\) yields
Applying (19) with \(K=3\), we get
for a suitably small \(\eta \in (0,1)\) and large enough k, provided \(\epsilon ,\delta >0\) are chosen so small that \((4\gamma -1+\epsilon )(2+\delta )<2\). Also, for k large enough
as before, and thus (23) is proved.
Using Lemma 1, we can complete the proof of Theorem 1 by following the standard path of proving ASCLTs. Put
Thus, we have for \(k_0\le k\le l\)
where the second relation follows from (23) and the Cauchy–Schwarz inequality. Let now \(d_k=\log \left( s_{k+1}^2/s_k^2\right) \) and \(D_n=\sum _{k=1}^n d_k\).
Note that (10) implies that \(s_{k+1}^2/s_k^2\rightarrow 1\). Hence, \(d_k \sim s_{k+1}^2/s_k^2-1 = \sigma _{k+1}^2/s_k^2\), and thus, in Theorem 1 it suffices to prove relation (11) with the weight \(\sigma _{k+1}^2/s_k^2\) replaced by the weight \(d_k\) defined in (29). Clearly
Let \(0<\delta < \eta /2\). By the first relation of (28), the contribution of those terms on the right-hand side of (30) where
is at most
On the other hand, since \(\frac{s_{k+1}}{s_k}\rightarrow 1\) we have that \(L = \sup \limits _{k \ge 1} \frac{s_{k + 1}}{ s_k}<\infty \). Hence, the inequality \(s_l / s_k \le \exp (2D_n^\delta )\) implies
and thus by the second relation of (28) the contribution of terms on the right-hand side of (30) where \(s_l / s_k \le \exp (2D_n^{\delta })\) is at most
for \(n\ge n_0\), since \(\delta <\eta /2\). Thus, using (31) and (32), we get for sufficiently large n
and thus, introducing the notation
we get
Since \(s_{\ell +1}/s_{\ell }\rightarrow 1\), we can select a sequence \((n_\ell )\) such that \( s_{n_\ell }^2 \sim e^{\ell ^{4/\eta }}, \) and then the Beppo Levi theorem implies that almost surely \(\sum _{\ell =1}^\infty T_{n_\ell }^2 <\infty \) and consequently \( T_{n_\ell } \rightarrow 0 \) as \(\ell \rightarrow \infty \). Hence
along the subsequence \((n_\ell )\).
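The choice of the exponent \(4/\eta \) can be motivated as follows (a sketch, assuming the preceding estimate gives \({E}T_n^2=O(D_n^{-\eta /2})\), which is what the exponent suggests): with \(D_{n_\ell }\sim \log s_{n_\ell }^2\sim \ell ^{4/\eta }\),

```latex
\sum_{\ell=1}^{\infty} E\, T_{n_\ell}^2
  = O\!\left( \sum_{\ell=1}^{\infty} D_{n_\ell}^{-\eta/2} \right)
  = O\!\left( \sum_{\ell=1}^{\infty} \ell^{-2} \right) < \infty,
```

so \({E}\sum _\ell T_{n_\ell }^2<\infty \), and the Beppo Levi (monotone convergence) theorem gives \(\sum _\ell T_{n_\ell }^2<\infty \) almost surely.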
Let \( m = \int _\mathbb {R} f(x)\, d\Phi (x). \) We now show that \({E}f(\hat{\xi }_k)\rightarrow m\) as \(k\rightarrow \infty \) (relation (34)).
Let \(F_k\) denote the distribution function of \(\xi _k=S_k/s_k\) and recall that we may fix \(f(x)=\exp (\gamma x^2)\). Clearly,
and by (19) and its analogue for the lower tail we obtain that for large enough k
which is o(1) if \(2\gamma <1-\epsilon \). On the other hand, we have that
and consequently,
using integration by parts. By the CLT (note that the Kolmogorov condition (10) implies the Lindeberg condition), we have
for fixed x when \(k\rightarrow \infty \), and further for \(|x|\le M_k\) we have
whose integral in \((-\infty ,\infty )\) is finite when \(\epsilon \) is small enough. Thus, the integral in (35) tends to zero by the dominated convergence theorem. The estimates in (37) also show that the first term in (35) tends to 0, and thus, (34) is proved.
We already proved (33) along the sequence \((n_\ell )\); thus, by (34) the relation
is also valid along \((n_\ell )\) when \(\ell \rightarrow \infty \). By the law of the iterated logarithm for \((X_n)\) (valid by the Kolmogorov condition (10)), we have \(P\left( \hat{\xi }_k \ne \xi _k \ \text {i.o.} \right) = 0,\) and thus,
along \((n_\ell )\). But since \(f > 0\) and, by \(s_{n_\ell }^2 \sim e^{\ell ^{4/\eta }}\), we have \(\log s_{n_{\ell + 1}}^2 / \log s_{n_\ell }^2 \rightarrow 1\), relation (38) holds along the whole sequence of integers, i.e.,
Thus, the first half of Theorem 1 is proved.
To prove the second half of Theorem 1, we use an example due to Weiss [20], showing the sharpness of the basic assumption in Kolmogorov’s LIL. Let \(X_1,X_2,\ldots \) be independent random variables with
and
for a fixed \(\alpha >0\). Then simple calculations show (see [20], p. 123) that, using the notations of Theorem 1, we have
and
as \(k\rightarrow \infty \). Thus, (10) holds with o replaced by O. Also, in [20, Theorem 1] it is shown that if \(\alpha \) is sufficiently large, then there is a \(\delta >0\) such that almost surely
for infinitely many n. Now let \(f(x)=\exp (\gamma x^2)\) where \(\frac{1}{2+\delta }<\gamma <1/2\). If we take a fixed n which is large enough and satisfies (41), then using (39) and (40) we get that
This completes the proof of Theorem 1. \(\square \)
Proof of Theorem 3
Let, for the purposes of this proof, [t] denote the function which equals \(s_n^2\) for \(s_n^2 \le t < s_{n + 1}^2\) (\(n = 1,2,\dots \)). Then,
By \(s_{n + 1} / s_n \rightarrow 1\) we have \([t] / t \rightarrow 1\) as \(t \rightarrow \infty \), and (18) together with the LIL for the Wiener process implies \(|S(t)| = O((t \log \log t)^{1/2})\) a.s. Thus, by (15) and the mean value theorem we get for \(s_n^2 \le t < s_{n + 1}^2\)
By our assumption, \(f(x)= e^{x^2/2} g(x)\), where \(\log g(x)\) is uniformly continuous on \(\mathbb R\). Thus, for \(n\rightarrow \infty \) we have, uniformly for \(s_n^2 \le t < s_{n + 1}^2\),
Thus,
and consequently, the integral on the right-hand side of (42) is
Now an argument similar to (43) yields, using (18) and the fact that
by the LIL,
and thus, the integral in (45) is
By the ASCLT for W(t), we have
for any fixed \(A>0\). (This follows from the ergodic theorem by using the substitution \(t=e^u\) in the integral on the left-hand side and using the ergodicity of the Ornstein–Uhlenbeck process \(e^{-u/2}W(e^u)\).) Thus, the integral in (46), divided by \(\log s_n^2\), converges a.s. to \(\int _\mathbb {R}f(x) d \Phi (x)\), completing the proof of Theorem 3. \(\square \)
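The substitution argument in the parenthetical remark can be spelled out: putting \(t=e^u\), so that \(dt/t=du\),

```latex
\frac{1}{\log T}\int_{1}^{T} f\!\left(\frac{W(t)}{\sqrt{t}}\right)\frac{dt}{t}
  \;=\; \frac{1}{\log T}\int_{0}^{\log T} f\bigl(e^{-u/2}W(e^{u})\bigr)\,du ,
```

and \(U(u)=e^{-u/2}W(e^{u})\) is a stationary ergodic Ornstein–Uhlenbeck process with \(U(u)\sim N(0,1)\), so the ergodic theorem yields a.s. convergence of the right-hand side to \({E}f(U(0))=\int _{\mathbb R} f\, d\Phi \) whenever this integral is finite.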
Proof of Theorem 2
As in the proof of the first half of Theorem 1, we may assume \(f(x)=e^{\gamma x^2}\), \(0\le \gamma <1/2\). Using (42) and an argument similar to (43) (with \(g = 1\) and the constant 1/2 in the second line replaced by \(\gamma \)), we get
Thus, for the proof of the first half of Theorem 2 it remains to show
Fix \(0< \varepsilon < 1\). By (16), there exists an a.s. finite random variable \(T = T(\varepsilon )\) such that
Let
Clearly, for \(t \ge T\) we have
and for \(t \ge T\), \(\bigl |W(t) / \sqrt{t}\bigr | \ge \varepsilon \) we have
As \(f \ge 1\) everywhere, it follows that for \(t \ge T\)
Now
and by \(\gamma < 1/2\) it follows that
Hence, applying the ASCLT (47) for \(f_i^{(\varepsilon )}\bigl (W(t) / \sqrt{t}\bigr )\), \(i = 1,2\), it follows from (49) that both the liminf and limsup of
as \(n \rightarrow \infty \), are between \(\int _\mathbb {R} f_1^{(\varepsilon )}(x) d \Phi (x)\) and \(\int _\mathbb {R} f_2^{(\varepsilon )}(x) d \Phi (x)\). Obviously, we have \(\lim \limits _{\varepsilon \rightarrow 0} f_1^{(\varepsilon )}(x) = \lim \limits _{\varepsilon \rightarrow 0} f_2^{(\varepsilon )}(x) = f(x)\) for every x, and thus, (50), (51) and the dominated convergence theorem imply
Thus, (48) is proved and the proof of the first half of Theorem 2 is completed.
As we noted after Theorem 2, the first half of the result remains valid for \(f(x) = e^{\max \{x^2/2 - cx,0\}}\), \(c > 0\). Relation (43) is easily modified in this case, and instead of \(0< \varepsilon < 1\), we can use \(0< \varepsilon < A\) and then \(e^{\gamma (|x| + 1)^2}\) in (50), (51) can be replaced by \(f(|x| + A)\). Then, \(\int _\mathbb {R} f(|x| + A)d\Phi (x) < \infty \) for sufficiently small A.
For the proof of the second half of Theorem 2, we need a lemma.
Lemma 2
Let \(\psi :\ {\mathbb R}^+ \rightarrow {\mathbb R}^+\) be a function with \(\lim \limits _{x \rightarrow \infty } \psi (x) = \infty \). Then, there exists a sequence \((X_n)\) of independent r.v.’s with mean zero and finite variances such that setting \(S_n = X_1 + \dots + X_n\) we have
for some Wiener process W, but for \(f(x) = |x|^p\), \(p>2\) and any \(\delta > 0\) we have
Proof
We use a construction similar to that used in Lifshits [12]. Since there exists a nondecreasing slowly varying function \(\psi _1:\ {\mathbb R}^+ \rightarrow {\mathbb R}^+\) with \(\psi _1 \le \psi \), \(\psi _1(x) = o(\log \log x)^{1/2}\) and \(\lim \limits _{x \rightarrow \infty } \psi _1(x) = \infty \), we can assume without loss of generality that \(\psi \) is nondecreasing, slowly varying, tends to \(+\infty \) and \(\psi (x) = o(\log \log x)^{1/2}\). Let \(2< \alpha < p\), \(n_k = 2^k\) and let \(Y_1, Y_2, \dots \) be independent random variables such that
and \(Y_n \equiv 0\) if \(n \ne n_1, n_2, \dots \). Let \(S_n^* = Y_1 + \dots + Y_n\). Clearly \(EY_{n_k}=0\),
and thus
since
by the slow variation of \(\psi \). Thus,
Also, for \(k\ge k_0\) we have
Since \(S_n^*\) is a sum of independent, symmetric random variables, we have \(P(S_{n_{k-1}}^*\ge 0)\ge 1/2\) and thus
Moreover, noting that for \(n_k \le n < n_{k + 1}\) the partial sum \(S_n^*\) does not change, (56) and the latter bound imply that for sufficiently large n
Let now W be a Wiener process independent of \(\{Y_k, k \ge 1\}\) and set \(S_n = W(n) + S_n^*\). Clearly, \(\{ S_n, n \ge 1\}\) is the partial sum sequence of a sequence \((X_n)\) of independent random variables with mean zero and finite variances and (58)–(60) show that (52)–(54) hold. To prove (55) with \(f(x) = x^p\), let us observe that by (54) and \(\psi (x) = o(\log \log x)^{1/2}\), the set of integration in (55) contains for \(n \ge n_0\) the set \(E_n = \{0 \le W(n) \le \sqrt{n}\}\), whose probability converges to \(\Phi (1) - \Phi (0) > 1/4\) as \(n \rightarrow \infty \). Thus, by (60) and the independence of W(n) and \(S_n^*\) the probability of the set
exceeds \(\frac{1}{16} \psi (n)^{-\alpha }\) for \(n \ge n_0\) and on \(E_n^*\) we have
Thus,
proving (55). This completes the proof of Lemma 2.
To prove the second half of Theorem 2, we will follow the argument in the proof of Theorem 1. Let \(\delta >0\), \(\psi (n)=o(\log \log n)^{1/2}\) and let \((X_n)\) be the sequence provided by Lemma 2. Let \(f(x)=e^{\gamma x^2}\) \((0\le \gamma <1/2)\), put \(s_k=\sqrt{k}\) and define \(M_k\), \(\xi _k\), \(\xi _{k, l}\), \(\hat{\xi }_k\), \(\hat{\xi }_{k, l}\) as in the proof of Theorem 1. We claim that Lemma 1 remains valid in the present case. The proof of (22) requires only trivial changes, since the argument there does not use the Kolmogorov condition (10). To prove (23), we set
and estimate the integral of \(f(\hat{\xi }_k)^2\) separately on \(B_k\) and \(C_k\). We first note that \(B_k\) is identical to the set of integration in (55), and by relation (54), on the set \(B_k\) we have for sufficiently large k
and consequently, we have on \(B_k\) by (54) and (61),
i.e.,
whence
Thus, we have to estimate the integral of \(\exp (2\gamma W(k)^2/k)\) on the set \(B_k\), where, according to the previous relations, (61) holds; introducing the standard normal random variable \(\zeta = W(k)/\sqrt{k}\), we get
Thus, finally we get for sufficiently small \(\delta \), using \(\gamma <1/2\),
for some constant \(\eta >0\).
Next we estimate the integral of \(f(\hat{\xi }_k)^2\) on the set \(C_k\). Let us observe that in view of (54) and \(\psi (n)=o(\log \log n)^{1/2}\), on the set \(C_k\) we have for sufficiently small \(\delta \) and sufficiently large k,
since
for sufficiently small \(\delta \) by using the Taylor expansion of \(\sqrt{1+\delta /2}\). Thus, using the standard normal variable \(\zeta = W(k)/\sqrt{k}\) again, we have
On the other hand, on \(C_k\) we have
Thus, for sufficiently small \(\delta \) we get, using \(\gamma <1/2\),
for some \(\eta '>0\). Thus, (23) is proved.
With Lemma 1 established in the present case, we can now follow the proof of Theorem 1 to get, selecting a subsequence \((n_k)\) such that \( n_k \sim e^{k^{4/\eta }},\) the relation
along the subsequence \((n_k)\), where \(d_k\) and \(D_n\) are defined by (29) with \(s_k= \sqrt{k}\). However, the argument starting with (29) shows that if \(d_k\) and \(D_n\) in (29) are replaced by the analogous quantities \(d_k^*\sim d_k\), \(D_n^*\sim D_n\) defined with the true standard deviations \(s_k^*\), relation (33) remains valid, and thus (62) holds with the weights based on the true variances of the \(X_k\). On the other hand, on the set \(B_k=\left\{ |S_k| \le \left( (2+\delta ) k \log \log k\right) ^{1/2}\right\} \) we have \(\hat{\xi }_k=\xi _k\), and thus, (55) shows that with \(f(x)=|x|^p\), and a fortiori for \(f(x)=\exp (\gamma x^2)\), \(0<\gamma <1/2\), we have
Thus, \(E f(\hat{\xi }_k)\rightarrow \infty \) and consequently
Thus, relation (62) yields
along the subsequence \((n_k)\). But
by the law of the iterated logarithm (implied by (54)), and thus, (63) yields
completing the proof of the second half of Theorem 2. \(\square \)
Data Availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
References
Atlagh, M.: Théorème central limite presque sûr et loi du logarithme itéré pour des sommes de variables aléatoires indépendantes (French) [Almost sure central limit theorem and associated law of the iterated logarithm for sums of independent random variables]. CR Acad. Sci. Paris Sér. I Math. 316(9), 929–933 (1993)
Atlagh, M., Weber, M.: Un théorème central limite presque sûr relatif à des sous-suites (French) [An almost sure central limit theorem relative to subsequences]. CR Acad. Sci. Paris Sér. I Math. 315(2), 203–206 (1992)
Berkes, I.: On the almost sure central limit theorem and domains of attraction. Probab. Theory Rel. Fields 102, 1–18 (1995)
Berkes, I., Csáki, E.: A universal result in almost sure central limit theory. Stoch. Proc. Appl. 94, 105–134 (2001)
Berkes, I., Csáki, E., Horváth, L.: Almost sure central limit theorems under minimal conditions. Stat. Prob. Letters 37, 67–76 (1998)
Berkes, I., Dehling, H.: Some limit theorems in log density. Ann. Probab. 21(3), 1640–1670 (1993)
Brosamler, G.: An almost everywhere central limit theorem. Math. Proc. Camb. Phil. Soc. 104, 561–574 (1988)
Hörmann, S.: Critical behavior in almost sure central limit theory. J. Theoret. Probab. 20, 613–636 (2007)
Ibragimov, I., Lifshits, M.: On almost sure limit theorems. Theory Prob. Appl. 44, 254–272 (1998)
Ibragimov, I., Lifshits, M.: On the convergence of generalized moments in almost sure central limit theorem. Stat. Probab. Letters 40, 343–351 (1998)
Lacey, M., Philipp, W.: A note on the almost everywhere central limit theorem. Stat. Prob. Letters 9, 201–205 (1990)
Lifshits, M.: A limit theorem of "almost sure" type for sums of random vectors. (Russian) Zap. Nauchn. Sem. S.-Petersburg. Otdel. Mat. Inst. Steklov. (POMI) 260 (1999), Veroyatn. i Stat. 3, 186–201, 320; translation in J. Math. Sci. (New York) 109 (6) 2166–2178 (2002)
Loève, M.: Probability Theory I, 4th edn. Springer (1977)
Major, P.: A note on Kolmogorov’s law of iterated logarithm. Studia Sci. Math. Hungar. 12(1–2), 161–167 (1977)
Major, P.: Almost sure functional limit theorems. II. The case of independent random variables. Studia Sci Math Hungar 36, 231–273 (2000)
Marcinkiewicz, J., Zygmund, A.: Remarque sur la loi du logarithme itéré. Fund. Math. 29, 215–222 (1937)
Schatte, P.: On strong versions of the central limit theorem. Math. Nachr. 137, 249–256 (1988)
Schatte, P.: On the central limit theorem with almost sure convergence. Prob. Math. Stat. 11, 237–246 (1991)
Strassen, V.: An invariance principle for the law of the iterated logarithm. Z. Wahrsch. verw. Geb. 3, 211–226 (1964)
Weiss, M.: On the law of the iterated logarithm. J. Math. Mech. 8, 121–132 (1959)
Funding
Open access funding provided by ELKH Alfréd Rényi Institute of Mathematics.
Research supported in part by the Austrian Science Fund (FWF) P 35520-N. Research supported by NKFIH Grant K 125569.
Berkes, I., Hörmann, S. Some Optimal Conditions for the ASCLT. J Theor Probab 37, 209–227 (2024). https://doi.org/10.1007/s10959-023-01245-w