Abstract
The law of maturity is the belief that less-observed events are becoming mature and, therefore, more likely to occur in the future. Previous studies have shown that the assumption of infinite exchangeability contradicts the law of maturity. In particular, it has been shown that infinite exchangeability contradicts probabilistic descriptions of the law of maturity such as the gambler’s belief and the belief in maturity. We show that the weaker assumption of finite exchangeability is compatible with both the gambler’s belief and belief in maturity. We provide sufficient conditions under which these beliefs hold under finite exchangeability. These conditions are illustrated with commonly used parametric models.
References
Bernardo, J., & Smith, A. (1994). Bayesian theory. Wiley Series in Probability and Mathematical Statistics. New York: Wiley.
Brooks, R. J., James, W. H., & Gray, E. (1991). Modelling sub-binomial variation in the frequency of sex combinations in litters of pigs. Biometrics, 47, 403–417.
de Finetti, B. (1931). Funzione caratteristica di un fenomeno aleatorio. Atti della R. Accademia Nazionale dei Lincei, 6, 251–299.
Diniz, C. A., Tutia, M. H., Leite, J. G., et al. (2010). Bayesian analysis of a correlated binomial model. Brazilian Journal of Probability and Statistics, 24(1), 68–77.
Iglesias, P., Loschi, R., Pereira, C., & Wechsler, S. (2009). A note on extendibility and predictivistic inference in finite populations. Brazilian Journal of Probability and Statistics, 23(2), 216–226.
Kadane, J. B. (2014). Sums of possibly associated Bernoulli variables: The Conway–Maxwell–binomial distribution. arXiv:1404.1856.
Kalra, A., & Shi, M. (2010). Consumer value-maximizing sweepstakes and contests. Journal of Marketing Research, 47(2), 287–300.
Lee, J., & Lio, Y. (1999). A note on Bayesian estimation and prediction for the beta-binomial model. Journal of Statistical Computation and Simulation, 63(1), 73–91.
Lindley, D., & Phillips, L. (1976). Inference for a Bernoulli process (a Bayesian view). The American Statistician, 30(3), 112–119.
Mendel, M. (1994). Operational parameters in Bayesian models. Test, 3(2), 195–206.
Militana, E., Wolfson, E., & Cleaveland, J. (2010). An effect of intertrial duration on the gambler’s fallacy choice bias. Behavioural Processes, 84(1), 455–459.
O’Neill, B., & Puza, B. (2005). In defence of the reverse gambler’s belief. Mathematical Scientist, 30(1), 13–16.
Oppenheimer, D., & Monin, B. (2009). The retrospective gambler’s fallacy: Unlikely events, constructing the past, and multiple universes. Judgment and Decision Making, 4(5), 326–334.
Rabin, M., & Vayanos, D. (2010). The gambler’s and hot-hand fallacies: Theory and applications. Review of Economic Studies, 77(2), 730–778.
Rodrigues, F., & Wechsler, S. (1993). A discrete Bayes explanation of a failure-rate paradox. IEEE Transactions on Reliability, 42(1), 132–133.
Shmueli, G., Minka, T. P., Kadane, J. B., Borle, S., & Boatwright, P. (2005). A useful distribution for fitting discrete data: revival of the Conway–Maxwell–Poisson distribution. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(1), 127–142.
Acknowledgments
Partially supported by CNPq and FAPESP (2003/10105-2). The authors thank Dani Gamerman, Jay Kadane, Carlos Pereira, Teddy Seidenfeld, Julio Stern and Robert Winkler for insightful remarks.
Appendix: Proofs
Proof (Proposition 1)
Consider that the coordinates of \(\mathtt X _{N}\) are jointly independent. Since \(\mathtt X _{N}\) is finitely exchangeable, the coordinates of \(\mathtt X _{N}\) are identically distributed. Thus, since \(\gamma = \sum _{i=1}^{N}{X_{i}}\) and \(X_{i}\) are i.i.d., conclude that \(\gamma \sim \text {Binomial}(N,P(X_{1}=1))\). Hence, under the assumption of independence, there exists \(\pi \in [0,1]\) such that \(\gamma \sim \text {Binomial}(N,\pi )\).
Also observe that, since \(\mathtt X _{N}\) is exchangeable, the distribution of \(\mathtt X _{N}\) is completely specified by the distribution of \(\gamma \). Hence, there exists a unique distribution on \(\mathtt X _{N}\) for each distribution on \(\gamma \). Conclude from the last paragraph that, if \(\gamma \sim \text {Binomial}(N,\pi )\), then the coordinates of \(\mathtt X _{N}\) are independent.
The proof of Proposition 1 follows from the implications proved in the two previous paragraphs. \(\square \)
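The equivalence in Proposition 1 admits a direct numerical check. The following sketch (an illustration, not part of the proof; the names and the values \(N=4\), \(\pi =0.3\) are arbitrary) constructs the exchangeable joint distribution \(P(\mathtt X _{N}=\mathtt x ) = P(\gamma =s)/{N \atopwithdelims ()s}\), where \(s\) is the number of successes in \(\mathtt x \), from \(\gamma \sim \text {Binomial}(N,\pi )\), and verifies that it factorizes into independent Bernoulli\((\pi )\) marginals.

```python
from math import comb
from itertools import product

N, p = 4, 0.3  # arbitrary example values
# gamma ~ Binomial(N, p)
pg = [comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)]

def joint(x):
    # exchangeable joint determined by gamma: P(X_N = x) = P(gamma = s)/C(N, s)
    s = sum(x)
    return pg[s] / comb(N, s)

# the joint factorizes into independent Bernoulli(p) marginals
for x in product([0, 1], repeat=N):
    indep = 1.0
    for xi in x:
        indep *= p if xi == 1 else 1 - p
    assert abs(joint(x) - indep) < 1e-12
```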
Proof (Proposition 2)
In order to prove Proposition 2 we first show that the statement that \(\mathtt X _{N}\) models indifferent belief is equivalent to the joint independence of the coordinates of \(\mathtt X _{N}\). Observe that, by definition, the statement that \(\mathtt X _{N}\) models indifferent belief implies that the coordinates of \(\mathtt X _{N}\) are jointly independent. Also, since \(\mathtt X _{N}\) is finitely exchangeable, \(P(X_{i} = 1) = P(X_{j} = 1)\). Hence, joint independence of the coordinates of \(\mathtt X _{N}\) implies that \(\mathtt X _{N}\) models indifferent belief.
The proof of Proposition 2 follows from the equivalence that is proved in the previous paragraph and the direct application of Proposition 1. \(\square \)
Lemma 1
Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed. If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then, for every \(2 \le m \le N\),
$$\begin{aligned} P(M=m \mid M \ge m) > 1/2. \end{aligned}$$
Similarly, if \(\gamma \) is looser than the Binomial\((N,1/2)\), then, for every \(2 \le m \le N\),
$$\begin{aligned} P(M=m \mid M \ge m) < 1/2. \end{aligned}$$
Proof (Lemma 1)
Let \(t(k) = P(\gamma = k)/{N \atopwithdelims ()k}\). Observe that
$$\begin{aligned} P(M=m \mid M \ge m) = \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} (t(k)+t(k-1))}. \end{aligned}$$
Hence,
$$\begin{aligned} P(M=m \mid M \ge m) = \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}}{N-m \atopwithdelims ()k-1} t(k) + {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}}{N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) + {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} (t(k)+t(k-1))}. \end{aligned}$$
Consider the case in which \(\gamma \) is tighter than the Binomial\((N,1/2)\). In order to prove the lemma, it is sufficient to show the following: (1) the first sum in the numerator divided by the first sum in the denominator is greater than 1/2, and (2) if \(m < N/2+1\), then the second sum in the numerator divided by the second sum in the denominator is greater than 1/2.

(1)
If \(k < (N+1)/2\), since \(\gamma \) is tighter than the Binomial\((N,1/2)\), then conclude that \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, for every \(1 \le k \le \min (m-1,N-m+1)\), \(\frac{{N-m \atopwithdelims ()k-1}t(k)}{{N-m \atopwithdelims ()k-1}(t(k)+t(k-1))} > 1/2\).

(2)
Since \(m < N/2+1\),
$$\begin{aligned}&\frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) } \nonumber \\&\quad = \frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }}\left[ {N-m \atopwithdelims ()k-1} t(k)+{N-m \atopwithdelims ()N-k}t(N-k+1)\right] }{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }} \left[ {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))+{N-m \atopwithdelims ()N-k}(t(N-k+1)+t(N-k))\right] } \end{aligned}$$(1)
Notice that \({N-m \atopwithdelims ()k-1} = {N-m \atopwithdelims ()N-m-k+1}\). Also, since \(m \le N/2\) and \(k < (N+1)/2\), \({N-m \atopwithdelims ()N-m-k+1} > {N-m \atopwithdelims ()N-k}\). Since \(\gamma \) is symmetric, \(\frac{t(k)+t(N-k+1)}{t(k)+t(k-1)+t(N-k+1)+t(N-k)} = 1/2\). Also, since \(k < N/2+1\) and \(\gamma \) is tighter than the Binomial\((N,1/2)\), conclude that \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, in Eq. 1, the ratio between each term in the numerator and the corresponding term in the denominator is greater than 1/2.
When \(\gamma \) is looser than the Binomial\((N,1/2)\), \(\frac{t(k)}{t(k)+t(k-1)} < 1/2\). Hence, all the inequalities are reversed. \(\square \)
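Lemma 1 can also be observed numerically. In the sketch below (an illustration, not part of the proof), a symmetric \(\gamma \) that is tighter than the Binomial\((N,1/2)\) is built by taking \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\) strictly increasing up to \(N/2\); the particular weights, the value \(N=6\), and the function name `hazard` are arbitrary choices for the example.

```python
from math import comb
from itertools import product

N = 6
# an arbitrary gamma, symmetric around N/2 and "tighter" than Binomial(N, 1/2):
# t(k) = P(gamma=k)/C(N,k) strictly increases up to N/2
w = [1.0 / (1 + abs(k - N / 2)) for k in range(N + 1)]
Z = sum(comb(N, k) * w[k] for k in range(N + 1))
t = [wk / Z for wk in w]                    # t(k) = P(gamma=k)/C(N,k)

def hazard(m):
    # P(M = m | M >= m) = P(X_m = 1 | X_1 = 0, ..., X_{m-1} = 0)
    num = den = 0.0
    for x in product([0, 1], repeat=N):     # exchangeable joint: P(x) = t(sum(x))
        if any(x[:m - 1]):
            continue
        den += t[sum(x)]
        if x[m - 1] == 1:
            num += t[sum(x)]
    return num / den

assert abs(hazard(1) - 0.5) < 1e-9          # no failures yet: exactly 1/2 by symmetry
for m in range(2, N + 1):
    assert hazard(m) > 0.5                  # after failures, a success is favored
```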
Lemma 2
If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then the gambler’s belief holds. If \(\gamma \) is looser than the Binomial\((N,1/2)\), then the reverse gambler’s belief holds.
Proof (Lemma 2)
Without loss of generality, assume that the number of \(0\)’s in \(\mathtt x _{n}\) is larger than the number of \(1\)’s. Since the model is exchangeable, for any permutation \(\pi \) of \(\{1,\ldots ,n\}\), \(P(\gamma =\gamma _{0}\mid \mathtt X _{i}=\mathtt x _{i}) = P(\gamma =\gamma _{0}\mid \mathtt X _{i}=\mathtt x _{\pi (i)})\). Consider a permutation \(\pi \) and \(\mathtt y = \mathtt x _{\pi }\) such that, for some \(a\), \(\mathtt y _{1}^{a}\) has an equal number of \(0\)’s and \(1\)’s, and \(\mathtt y _{a+1}^{n}\) only has \(0\)’s. Let \(\gamma ^{*} = \sum _{i=a+1}^{N}{X_{i}}\).
That is, \(P(X_{n+1}=1\mid \mathtt x )\) is equal to \(P(X_{n+1}=1\mid \mathtt y _{a+1}^{n})\) using \(P(\gamma ^{*}=i\mid \mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\). Observe that
The last equality follows since \(\mathtt y _{1}^{a}\) has the same number of \(1\)’s and \(0\)’s. Hence, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(\gamma ^{*}\mid \mathtt y _{1}^{a}\) is tighter (looser) than the Binomial\((N-a,1/2)\). Using \(P(\gamma ^{*}=i\mid \mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\), conclude from Lemma 1 that, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(P(X_{n+1}=1\mid \mathtt y _{a+1}^{n}) > (<)\; 1/2\). \(\square \)
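Lemma 2 can be checked over arbitrary histories, not only runs of failures. The sketch below (an illustration, not part of the proof; the weights, \(N=6\), and the name `p_next_success` are arbitrary) uses a symmetric \(\gamma \) tighter than the Binomial\((N,1/2)\) and confirms that every history with more \(0\)’s than \(1\)’s makes the next success more likely than \(1/2\).

```python
from math import comb
from itertools import product

N = 6
# symmetric gamma, "tighter" than Binomial(N, 1/2):
# t(k) = P(gamma=k)/C(N,k) strictly increases up to N/2
w = [1.0 / (1 + abs(k - N / 2)) for k in range(N + 1)]
Z = sum(comb(N, k) * w[k] for k in range(N + 1))
t = [wk / Z for wk in w]                 # t(k) = P(gamma=k)/C(N,k)

def p_next_success(xs):
    # P(X_{n+1} = 1 | X_1 = x_1, ..., X_n = x_n) under the exchangeable joint
    n = len(xs)
    num = den = 0.0
    for x in product([0, 1], repeat=N):
        if x[:n] != xs:
            continue
        den += t[sum(x)]
        if x[n] == 1:
            num += t[sum(x)]
    return num / den

# gambler's belief: any history with more failures than successes
# makes the next success more likely than 1/2
for n in range(1, N):
    for xs in product([0, 1], repeat=n):
        if sum(xs) < n / 2:
            assert p_next_success(xs) > 0.5
```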
Lemma 3
Assume the distribution of \(\gamma \) is symmetric around \(N/2\). If the (reverse) gambler’s belief holds, then \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\).
Proof (Lemma 3)
Let \(\mathtt x _{N-1} \in \{0,1\}^{N-1}\) and \(s_{N-1}-1 = \sum _{i=1}^{N-1}{x_{i}}\).
If the gambler’s belief holds, then \(P(X_{N}=1\mid \mathtt X _{N-1}=\mathtt x _{N-1}) > \frac{1}{2}\), for every \(s_{N-1} \le \frac{N}{2}\). Hence, for every \(s_{N-1} \le \frac{N}{2}\), \(\frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} > \frac{N-s_{N-1}+1}{s_{N-1}}\). Since the distribution of \(\gamma \) is symmetric around \(N/2\), conclude that \(\gamma \) is tighter than the Binomial\((N,1/2)\). Similarly, if the reverse gambler’s belief holds, then \(\frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} < \frac{N-s_{N-1}+1}{s_{N-1}}\), and \(\gamma \) is looser than the Binomial\((N,1/2)\). \(\square \)
Proof (Theorem 1)
Follows from Lemmas 2 and 3. \(\square \)
Proof (Proposition 3)
Since the distribution of de Finetti’s parameter is symmetric around \(1/2\), the distribution of \(\gamma \) is symmetric with respect to \(N/2\). Hence, it remains to show that \(P(\gamma =i)/P(\gamma =i-1) < (N-i+1)/i\), for \(1 \le i \le N/2\). First, observe that, for every \(0 \le \pi \le 1\) such that \(\pi \ne 0.5\) and \(n \ge 0\), it follows that \((\pi ^{n+1}-(1-\pi )^{n+1})(\pi -(1-\pi )) > 0\). Hence, developing this expression, \(\pi (1-\pi )^{n+1} + (1-\pi )\pi ^{n+1} < \pi ^{n+2} + (1-\pi )^{n+2}\). Hence, for \(i \le N/2\),
Next, since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (de Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(i \in \mathbb {N}\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Thus,
The last inequality follows from Eq. 2. \(\square \)
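The conclusion of Proposition 3 can be observed on a concrete extendible model. In the sketch below (an illustration, not part of the proof; the choices \(N=6\), \(Q=\text{Beta}(2,2)\), and the function names are arbitrary), \(\gamma \) is a symmetric beta-binomial mixture, \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\) decreases up to \(N/2\), so \(\gamma \) is looser than the Binomial\((N,1/2)\) and the reverse gambler’s belief holds.

```python
from math import comb, exp, lgamma
from itertools import product

N, a = 6, 2.0
# extendible model: pi ~ Beta(a, a), gamma | pi ~ Binomial(N, pi)
def t(k):
    # t(k) = P(gamma = k)/C(N, k) = B(k + a, N - k + a)/B(a, a)
    return exp(lgamma(k + a) + lgamma(N - k + a) - lgamma(N + 2 * a)
               + lgamma(2 * a) - 2 * lgamma(a))

# gamma is "looser" than Binomial(N, 1/2): t decreases up to N/2
for k in range(1, N // 2 + 1):
    assert t(k) < t(k - 1)

def hazard(m):
    # P(X_m = 1 | X_1 = 0, ..., X_{m-1} = 0) under the exchangeable joint
    num = den = 0.0
    for x in product([0, 1], repeat=N):
        if any(x[:m - 1]):
            continue
        den += t(sum(x))
        if x[m - 1] == 1:
            num += t(sum(x))
    return num / den

# reverse gambler's belief: a run of failures makes success seem less likely
for m in range(2, N + 1):
    assert hazard(m) < 0.5
```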
Proof (Theorem 2)
Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed and \(r(m) = P(M=m\mid M \ge m)\). In order to verify belief in maturity, one must show that \(r(m)\) is increasing in \(m\). Let \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\). Using the same development as in the proof of Lemma 1,
Observe that, in Eq. 3, the first sum in the numerator divided by the first sum in the denominator is equal to \(r(m+1)\). Therefore, to obtain \(r(m+1) > r(m)\), it is sufficient to prove the following:
which is equivalent to
If \(\gamma \) is 2nd-order tighter than the Binomial, then, for every \(1 \le k \le N-1\),
Hence, for every \(1 \le k \le N-1\),
Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then Eq. 5 holds and, therefore, Eq. 4 also holds. Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then belief in maturity holds. If \(\gamma \) is 2nd-order looser than the Binomial, then the proof follows by reversing the inequality in Eq. 5. \(\square \)
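An extreme illustration of belief in maturity (not part of the proof): take \(\gamma \) degenerate at \(N/2\), an extreme of tightness. The hazard rate \(r(m)\) of the first success is then strictly increasing wherever it is defined; the value \(N=6\) and the function name `r` are arbitrary choices for the example.

```python
from math import comb
from itertools import product

N = 6
# an extreme "tight" gamma: all mass at N/2 (exactly N/2 successes for sure)
t = [0.0] * (N + 1)
t[N // 2] = 1.0 / comb(N, N // 2)                # t(k) = P(gamma=k)/C(N,k)

def r(m):
    # r(m) = P(M = m | M >= m), the hazard of the first success
    num = den = 0.0
    for x in product([0, 1], repeat=N):
        if any(x[:m - 1]):
            continue
        den += t[sum(x)]
        if x[m - 1] == 1:
            num += t[sum(x)]
    return num / den

# belief in maturity: the longer the wait, the more likely the success
hazards = [r(m) for m in range(1, N // 2 + 2)]   # m = 1, ..., 4; P(M >= 5) = 0 here
assert all(b > a for a, b in zip(hazards, hazards[1:]))
```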
Proof (Proposition 5)
Since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (de Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(i \in \mathbb {N}\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Hence,
The last line follows from the Cauchy–Schwarz inequality. \(\square \)
Cite this article
Bonassi, F.V., Stern, R.B., Peixoto, C.M. et al. Exchangeability and the law of maturity. Theory Decis 78, 603–615 (2015). https://doi.org/10.1007/s11238-014-9441-4
Keywords
 Law of maturity
 Exchangeability
 Gambler’s fallacy
 Belief in maturity
 Bayesian statistics
 0–1 Process