Appendix: Proofs
Proof (Proposition 1)
Suppose that the coordinates of \(\mathtt X _{N}\) are jointly independent. Since \(\mathtt X _{N}\) is finitely exchangeable, the coordinates of \(\mathtt X _{N}\) are identically distributed. Thus, since \(\gamma = \sum _{i=1}^{N}{X_{i}}\) and the \(X_{i}\) are i.i.d., conclude that \(\gamma \sim \text {Binomial}(N,P(X_{1}=1))\). Hence, under the assumption of independence, there exists \(\pi \in [0,1]\) such that \(\gamma \sim \text {Binomial}(N,\pi )\).
Also observe that, since \(\mathtt X _{N}\) is exchangeable, the distribution of \(\mathtt X _{N}\) is completely specified by the distribution of \(\gamma \). Hence, there exists a unique distribution on \(\mathtt X _{N}\) for each distribution on \(\gamma \). Conclude from the last paragraph that, if \(\gamma \sim \text {Binomial}(N,\pi )\), then the coordinates of \(\mathtt X _{N}\) are independent.
The proof of Proposition 1 follows from the implications proved in the two previous paragraphs. \(\square \)
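As a numerical sanity check of Proposition 1 (an illustration, not part of the proof), the sketch below verifies that when \(\gamma \sim \text {Binomial}(N,\pi )\), the exchangeable joint law, which assigns probability \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\) to every outcome with \(k\) ones, coincides with the i.i.d. product law; \(N=6\) and \(\pi =0.3\) are arbitrary choices.

```python
# Numerical sanity check of Proposition 1 (illustration, not a proof).
# An exchangeable law on {0,1}^N is determined by t(k) = P(gamma=k)/C(N,k),
# the probability of any single outcome containing k ones.  If
# gamma ~ Binomial(N, pi), then t(k) = pi^k (1-pi)^(N-k), which is the
# i.i.d. product law, so the coordinates are jointly independent.
# N = 6 and pi = 0.3 are arbitrary illustrative choices.
from itertools import product
from math import comb, isclose

N, pi = 6, 0.3
p_gamma = [comb(N, k) * pi**k * (1 - pi)**(N - k) for k in range(N + 1)]
t = [p_gamma[k] / comb(N, k) for k in range(N + 1)]

for x in product([0, 1], repeat=N):
    k = sum(x)
    product_law = pi**k * (1 - pi)**(N - k)   # product of the marginals
    assert isclose(t[k], product_law)
print("Binomial gamma gives the i.i.d. joint law on all of {0,1}^N")
```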
Proof (Proposition 2)
In order to prove Proposition 2, we first show that the statement that \(\mathtt X _{N}\) models indifferent belief is equivalent to the joint independence of the coordinates of \(\mathtt X _{N}\). Observe that, by definition, the statement that \(\mathtt X _{N}\) models indifferent belief implies that the coordinates of \(\mathtt X _{N}\) are jointly independent. Conversely, since \(\mathtt X _{N}\) is finitely exchangeable, \(P(X_{i} = 1) = P(X_{j} = 1)\) for all \(i,j\). Hence, joint independence of the coordinates of \(\mathtt X _{N}\) implies that \(\mathtt X _{N}\) models indifferent belief.
The proof of Proposition 2 follows from the equivalence that is proved in the previous paragraph and the direct application of Proposition 1. \(\square \)
Lemma 1
Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed. If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then
$$\begin{aligned} P(M=m|M \ge m)&> 1/2,\quad \text{ for }\quad m=2,\ldots , N \end{aligned}$$
Similarly, if \(\gamma \) is looser than the Binomial\((N,1/2)\), then
$$\begin{aligned} P(M=m|M \ge m)&< 1/2, \quad \text{ for } \quad m=2,\ldots , N \end{aligned}$$
Proof (Lemma 1)
Let \(t(k) = P(\gamma = k)/{N \atopwithdelims ()k}\). Observe that
$$\begin{aligned} P(M=m)={\displaystyle \sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1}t(k). \end{aligned}$$
Hence,
$$\begin{aligned}&P(M=m|M \ge m) = \frac{\sum _{k=1}^{N-m+1}{{N-m \atopwithdelims ()k-1}t(k)}}{t(0)+ \sum _{k=1}^{N-m+1}{\sum _{i=m}^{N-k+1}{{N-i \atopwithdelims ()k-1} t(k)}}}\\&=\frac{{\sum _{k=1}^{N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{t(0)+ {\sum _{k=1}^{N-m+1}} {N-m+1 \atopwithdelims ()k} t(k) } = \frac{{\sum _{k=1}^{N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{k=1}^{N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))}\\&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}}{N-m \atopwithdelims ()k-1} t(k) + \left[ {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)\right] \mathbf {I}_{(m < N/2 + 1)} }{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) + \left[ {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))\right] \mathbf {I}_{(m < N/2 + 1)}}. \end{aligned}$$
Consider the case in which \(\gamma \) is tighter than the Binomial\((N,1/2)\). In order to prove the lemma, it is sufficient to show the following: (1) the first sum in the numerator divided by the first sum in the denominator is greater than 1/2, and (2) if \(m < N/2+1\), then the second sum in the numerator divided by the second sum in the denominator is greater than 1/2.
(1) If \(k < (N+1)/2\), since \(\gamma \) is tighter than the Binomial\((N,1/2)\), conclude that \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, for every \(1 \le k \le \min (m-1,N-m+1)\), \(\frac{{N-m \atopwithdelims ()k-1}t(k)}{{N-m \atopwithdelims ()k-1}(t(k)+t(k-1))} > 1/2\).
(2) Since \(m < N/2+1\),
$$\begin{aligned}&\frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) } \nonumber \\&\quad = \frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }}\left[ {N-m \atopwithdelims ()k-1} t(k)+{N-m \atopwithdelims ()N-k}t(N-k+1)\right] }{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }} \left[ {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))+{N-m \atopwithdelims ()N-k}(t(N-k+1)+t(N-k))\right] } \end{aligned}$$
(1)
Notice that \({N-m \atopwithdelims ()k-1} = {N-m \atopwithdelims ()N-m-k+1}\). Also, since \(m \le N/2\) and \(k < (N+1)/2\), \({N-m \atopwithdelims ()N-k-m+1} > {N-m \atopwithdelims ()N-k}\). Since \(\gamma \) is symmetric, \(\frac{t(k)+t(N-k+1)}{t(k)+t(k-1)+t(N-k+1)+t(N-k)} = 1/2\). Also, since \(k < N/2+1\) and \(\gamma \) is tighter than the Binomial\((N,1/2)\), conclude that \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, in Eq. 1, the ratio between each term in the numerator and each term in the denominator is greater than 1/2.
When \(\gamma \) is looser than the Binomial\((N,1/2)\), \(\frac{t(k)}{t(k)+t(k-1)} < 1/2\). Hence, all the inequalities are reversed. \(\square \)
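Lemma 1 can be checked numerically by brute-force enumeration (an illustration, not part of the proof). The sketch below builds an arbitrary symmetric \(\gamma \) that is tighter than the Binomial\((N,1/2)\), taking \(t(k) \propto 2^{\min (k,N-k)}\) so that \(t(k)\) strictly increases up to \(N/2\), and confirms \(P(M=m|M \ge m) > 1/2\) for \(m=2,\ldots ,N\); \(N=8\) is an arbitrary choice.

```python
# Numerical check of Lemma 1 (illustration only; N and the choice of
# gamma are arbitrary).  t(k) = P(gamma=k)/C(N,k) is symmetric in
# k <-> N-k and strictly increasing up to N/2, i.e. gamma is "tighter"
# than Binomial(N, 1/2).  We verify P(M=m | M >= m) > 1/2 by
# enumerating all of {0,1}^N.
from itertools import product
from math import comb

N = 8
t_raw = [2.0 ** min(k, N - k) for k in range(N + 1)]
Z = sum(comb(N, k) * t_raw[k] for k in range(N + 1))
t = [v / Z for v in t_raw]            # now sum_k C(N,k) t(k) = 1

# p_eq[m] = P(M = m); p_ge[m] = P(M >= m), where the all-zeros outcome
# (no success at all) counts toward {M >= m} for every m, as in the
# t(0) term of the denominator above.
p_eq = [0.0] * (N + 2)
p_ge = [0.0] * (N + 2)
for x in product([0, 1], repeat=N):
    p = t[sum(x)]
    m = next((i + 1 for i, xi in enumerate(x) if xi == 1), None)
    if m is not None:
        p_eq[m] += p
        for j in range(1, m + 1):
            p_ge[j] += p
    else:
        for j in range(1, N + 1):
            p_ge[j] += p

ratios = [p_eq[m] / p_ge[m] for m in range(2, N + 1)]
assert all(v > 0.5 for v in ratios)
print("hazard rates for m = 2..N:", [round(v, 3) for v in ratios])
```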
Lemma 2
If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then the gambler’s belief holds. If \(\gamma \) is looser than the Binomial\((N,1/2)\), then the reverse gambler’s belief holds.
Proof (Lemma 2)
Without loss of generality, assume that the number of \(0\)’s in \(\mathtt x _{n}\) is larger than the number of \(1\)’s. Since the model is exchangeable, for any permutation \(\pi \) of \(\{1,\ldots ,n\}\), \(P(\gamma =\gamma _{0}|\mathtt X _{i}=\mathtt x _{i}) = P(\gamma =\gamma _{0}|\mathtt X _{i}=\mathtt x _{\pi (i)})\). Consider a permutation \(\pi \) and \(\mathtt y = \mathtt x _{\pi }\) such that, for some \(a\), \(\mathtt y _{1}^{a}\) has an equal number of \(0\)’s and \(1\)’s, and \(\mathtt y _{a+1}^{n}\) only has \(0\)’s. Let \(\gamma ^{*} = \sum _{i=a+1}^{N}{X_{i}}\).
$$\begin{aligned} P(X_{n+1}=1|\mathtt x )&= P(X_{n+1}=1|\mathtt y ) \\&= \sum _{i=0}^{N}{P(X_{n+1}=1|\mathtt y _{a+1}^{n},\gamma ^{*}=i)P(\gamma ^{*}=i|\mathtt y _{1}^{a},\mathtt y _{a+1}^{n})}. \end{aligned}$$
That is, \(P(X_{n+1}=1|\mathtt x )\) is equal to \(P(X_{n+1}=1|\mathtt y _{a+1}^{n})\) using \(P(\gamma ^{*}=i|\mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\). Observe that
$$\begin{aligned} P(\gamma ^{*}=i|y_{1}^{a})&\propto P(\gamma ^{*}=i,y_{1}^{a}) \\&= P(\gamma =i+a/2) {i+a/2 \atopwithdelims ()a/2}{N-i-a/2 \atopwithdelims ()a/2} \end{aligned}$$
The last equality follows since \(y_{1}^{a}\) has the same number of \(1\)’s and \(0\)’s. Hence, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(\gamma ^{*}|y_{1}^{a}\) is tighter (looser) than the Binomial\((N-a,1/2)\). Using \(P(\gamma ^{*}=i|\mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\), conclude from Lemma 1 that, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(P(X_{n+1}=1|\mathtt y _{a+1}^{n}) > (<) 1/2\). \(\square \)
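Lemma 2 can also be illustrated numerically (this is not part of the proof). With the same kind of symmetric, tighter-than-Binomial\((N,1/2)\) choice of \(\gamma \) as in the check of Lemma 1, every observed history containing more \(0\)’s than \(1\)’s makes the next success more likely than \(1/2\), i.e., the gambler’s belief holds; \(N=8\) is an arbitrary choice.

```python
# Numerical check of Lemma 2 (illustration only).  gamma is symmetric
# and tighter than Binomial(N, 1/2): t(k) = P(gamma=k)/C(N,k)
# strictly increases up to N/2.  For every history with more 0's than
# 1's, P(X_{n+1} = 1 | history) > 1/2.
from itertools import product
from math import comb

N = 8
t_raw = [2.0 ** min(k, N - k) for k in range(N + 1)]
Z = sum(comb(N, k) * t_raw[k] for k in range(N + 1))
t = [v / Z for v in t_raw]

def pred(prefix):
    """P(X_{n+1} = 1 | X_1..X_n = prefix), by summing the joint law."""
    n = len(prefix)
    num = den = 0.0
    for rest in product([0, 1], repeat=N - n):
        p = t[sum(prefix) + sum(rest)]
        den += p
        if rest[0] == 1:          # rest[0] is X_{n+1}
            num += p
    return num / den

for n in range(1, N):             # histories strictly shorter than N
    for prefix in product([0, 1], repeat=n):
        if prefix.count(0) > prefix.count(1):
            assert pred(prefix) > 0.5
print("gambler's belief verified whenever 0's outnumber 1's")
```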
Lemma 3
Assume the distribution of \(\gamma \) is symmetric on \(N/2\). If the (reverse) gambler’s belief holds, then \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\).
Proof (Lemma 3)
Let \(\mathtt x _{N-1} \in \{0,1\}^{N-1}\) and \(s_{N-1}-1 = \sum _{i=1}^{N-1}{x_{i}}\).
$$\begin{aligned}&P(X_{N}=1|\mathtt X _{N-1}=\mathtt x _{N-1}) \\&\quad \quad = \frac{P(X_{N}=1,\mathtt X _{N-1}=\mathtt x _{N-1})}{P(X_{N}=0,\mathtt X _{N-1}=\mathtt x _{N-1})+P(X_{N}=1,\mathtt X _{N-1}=\mathtt x _{N-1})} \\&\quad \quad = \frac{P(\gamma =s_{N-1})/{N \atopwithdelims ()s_{N-1}}}{P(\gamma =s_{N-1}-1)/{N \atopwithdelims ()s_{N-1}-1} +P(\gamma =s_{N-1})/{N \atopwithdelims ()s_{N-1}}} \\&\quad \quad = \frac{1}{1 + \frac{P(\gamma =s_{N-1}-1)}{P(\gamma =s_{N-1})} \frac{N-s_{N-1}+1}{s_{N-1}}}. \end{aligned}$$
If the gambler’s belief holds, then \(P(X_{N}=1|\mathtt X _{N-1}=\mathtt x _{N-1}) > \frac{1}{2}\), for every \(s_{N-1} \le \frac{N}{2}\). Hence, for every \(s_{N-1} \le \frac{N}{2}, \frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} > \frac{N-s_{N-1}+1}{s_{N-1}}\). Since the distribution of \(\gamma \) is symmetric around \(N/2\), conclude that \(\gamma \) is tighter than the Binomial\((N,1/2)\). Similarly, if the reverse gambler’s belief holds, then \(\frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} < \frac{N-s_{N-1}+1}{s_{N-1}}\), and \(\gamma \) is looser than the Binomial\((N,1/2)\). \(\square \)
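The closed-form predictive probability derived above can be checked against direct computation from the joint law (an illustration, not part of the proof); the positive weights defining \(\gamma \) below are arbitrary.

```python
# Check of the predictive formula used in Lemma 3 (illustration only).
# For an exchangeable X_N and s = 1 + (number of 1's in x_{N-1}), the
# derivation gives
#   P(X_N = 1 | x_{N-1}) = 1 / (1 + [P(g=s-1)/P(g=s)] * (N-s+1)/s),
# which we compare with t(s)/(t(s-1)+t(s)) computed from the joint law.
# N = 7 and the weights are arbitrary.
from math import comb, isclose

N = 7
p_raw = [1.0, 3.0, 4.0, 9.0, 4.0, 3.0, 1.0, 0.5]  # arbitrary positive weights
Z = sum(p_raw)
p_gamma = [v / Z for v in p_raw]                  # P(gamma = k)
t = [p_gamma[k] / comb(N, k) for k in range(N + 1)]

for ones in range(0, N):          # number of 1's among X_1..X_{N-1}
    s = ones + 1
    direct = t[s] / (t[s - 1] + t[s])             # from the joint law
    closed = 1.0 / (1.0 + (p_gamma[s - 1] / p_gamma[s]) * (N - s + 1) / s)
    assert isclose(direct, closed)
print("closed-form predictive probability matches the joint law")
```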
Proof (Theorem 1)
Follows from Lemmas 2 and 3. \(\square \)
Proof (Proposition 3)
Since the distribution of de Finetti’s parameter is exchangeable, the distribution of \(\gamma \) is symmetric with respect to \(N/2\). Hence, it remains to show that \(P(\gamma =i)/P(\gamma =i-1) < (N-i+1)/i\), for \(1 \le i \le N/2\). First, observe that, for every \(0 \le \pi \le 1\) such that \(\pi \ne 0.5\) and \(n \ge 0\), it follows that \((\pi ^{n+1}-(1-\pi )^{n+1})(\pi -(1-\pi )) > 0\). Hence, developing this expression, \(\pi (1-\pi )^{n+1} + (1-\pi )\pi ^{n+1} < \pi ^{n+2} + (1-\pi )^{n+2}\). Hence, for \(i \le N/2\),
$$\begin{aligned} \frac{\pi ^{i}(1-\pi )^{N-i} + (1-\pi )^{i}\pi ^{N-i}}{\pi ^{i-1}(1-\pi )^{N-i+1} + (1-\pi )^{i-1}\pi ^{N-i+1}} < 1 \end{aligned}$$
(2)
Next, since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(0 \le i \le N\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Thus,
$$\begin{aligned} \frac{P(\gamma =i)}{P(\gamma =i-1)}&= \frac{\int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}}{\int _{0}^{1}{{N \atopwithdelims ()i-1} \pi ^{i-1}(1-\pi )^{N-i+1}Q(d\pi )}}\\&= \frac{\int _{0}^{0.5}{{N \atopwithdelims ()i} (\pi ^{i}(1-\pi )^{N-i}+(1-\pi )^{i}\pi ^{N-i})Q(d\pi )}}{\int _{0}^{0.5}{{N \atopwithdelims ()i-1} (\pi ^{i-1}(1-\pi )^{N-i+1}+(1-\pi )^{i-1}\pi ^{N-i+1}) Q(d\pi )}}\\&< \frac{{N \atopwithdelims ()i}}{{N \atopwithdelims ()i-1}} = \frac{N-i+1}{i}. \end{aligned}$$
The last inequality follows from Eq. 2. \(\square \)
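Proposition 3 can be illustrated with a concrete de Finetti mixture (this is not part of the proof). With a Beta\((2,2)\) mixing distribution, an arbitrary symmetric choice, \(\gamma \) is beta-binomial, and the ratio \(P(\gamma =i)/P(\gamma =i-1)\) indeed stays strictly below \((N-i+1)/i\) for \(i \le N/2\); \(N=10\) is an arbitrary choice.

```python
# Numerical illustration of Proposition 3 (not part of the proof).
# gamma is a mixture of Binomial(N, pi) laws with a Beta(2, 2) mixing
# distribution (arbitrary symmetric choice), i.e. beta-binomial.  Then
# P(gamma=i)/P(gamma=i-1) < (N-i+1)/i for 1 <= i <= N/2.
from math import comb, gamma as G

N, a = 10, 2            # Beta(a, a) is symmetric about 1/2

def p_gamma(i):
    # Beta-binomial pmf: C(N,i) * B(i+a, N-i+a) / B(a, a),
    # written with Gamma functions.
    return comb(N, i) * (G(i + a) * G(N - i + a) / G(N + 2 * a)) * (G(2 * a) / G(a) ** 2)

assert abs(sum(p_gamma(i) for i in range(N + 1)) - 1) < 1e-9
for i in range(1, N // 2 + 1):
    assert p_gamma(i) / p_gamma(i - 1) < (N - i + 1) / i
print("beta-binomial gamma satisfies the ratio bound of Proposition 3")
```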
Proof (Theorem 2)
Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed and \(r(m) = P(M=m|M \ge m)\). In order to verify belief in maturity, one must show that \(r(m)\) increases in \(m\). Let \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\). Using the same development as in the proof of Lemma 1,
$$\begin{aligned} r(m)&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m+1}} {N-m+1 \atopwithdelims ()k} t(k)} \nonumber \\&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)+{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m}} {N-m \atopwithdelims ()k} t(k)+{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} t(k)}. \end{aligned}$$
(3)
Observe that, in Eq. 3, the first sum in the numerator divided by the first sum in the denominator is equal to \(r(m+1)\). Therefore, to obtain \(r(m+1) > r(m)\), it is sufficient to prove the following:
$$\begin{aligned} \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m}} {N-m \atopwithdelims ()k} t(k)} > \frac{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} t(k)}, \end{aligned}$$
which is equivalent to
$$\begin{aligned} \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}} {N-m-1 \atopwithdelims ()k-1} (t(k)+t(k-1))} > \frac{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}} {N-m-1 \atopwithdelims ()k-2} (t(k)+t(k-1))}. \end{aligned}$$
(4)
If \(\gamma \) is 2nd-order tighter than the Binomial, then, for every \(1 \le k \le N-1\),
$$\begin{aligned} \frac{P(\gamma =k+1)/P(\gamma =k)}{P(\gamma =k)/P(\gamma =k-1)} < \frac{(N-k)/(k+1)}{(N-k+1)/k}. \end{aligned}$$
Since \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\), this condition is equivalent to \(t(k+1)/t(k) < t(k)/t(k-1)\). Hence, for every \(1 \le k \le N-1\),
$$\begin{aligned} \frac{t(k)}{t(k)+t(k-1)} > \frac{t(k+1)}{t(k+1)+t(k)} \end{aligned}$$
(5)
Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then Eq. 5 holds and, therefore, Eq. 4 also holds. Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then belief in maturity holds. If \(\gamma \) is 2nd-order looser than the Binomial, then the proof follows by reversing the inequality in Eq. 5. \(\square \)
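Theorem 2 can be checked numerically (an illustration, not part of the proof). The sketch below takes a \(\gamma \) that is 2nd-order tighter than the Binomial, choosing \(t(k)\) strictly log-concave with a discrete Gaussian shape (an arbitrary choice, which makes \(t(k)/t(k-1)\) strictly decreasing), and confirms that the hazard rate \(r(m)=P(M=m|M\ge m)\) increases in \(m\); \(N=8\) is an arbitrary choice.

```python
# Numerical check of Theorem 2 (illustration only).  t(k) =
# P(gamma=k)/C(N,k) is strictly log-concave (discrete Gaussian shape,
# an arbitrary choice), so t(k)/t(k-1) strictly decreases, i.e. gamma
# is 2nd-order tighter than the Binomial.  Then r(m) = P(M=m | M>=m)
# increases in m (belief in maturity).
from itertools import product
from math import comb, exp

N = 8
t_raw = [exp(-0.3 * (k - N / 2) ** 2) for k in range(N + 1)]
Z = sum(comb(N, k) * t_raw[k] for k in range(N + 1))
t = [v / Z for v in t_raw]

p_eq = [0.0] * (N + 1)   # p_eq[m] = P(M = m)
p_ge = [0.0] * (N + 2)   # p_ge[m] = P(M >= m), no-success counts for all m
for x in product([0, 1], repeat=N):
    p = t[sum(x)]
    m = next((i + 1 for i, xi in enumerate(x) if xi == 1), N + 1)
    if m <= N:
        p_eq[m] += p
    for j in range(1, min(m, N) + 1):
        p_ge[j] += p

r = [p_eq[m] / p_ge[m] for m in range(1, N + 1)]   # r[0] is r(1)
assert all(r[i + 1] > r[i] for i in range(N - 1))
print("r(m) increases in m:", [round(v, 3) for v in r])
```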
Proof (Proposition 5)
Since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(0 \le i \le N\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Hence,
$$\begin{aligned}&\frac{P(\gamma =i)^{2}}{P(\gamma =i+1)P(\gamma =i-1)}\\&\quad \quad = \frac{\left( \int _{0}^{1}{{N \atopwithdelims ()i}\pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\right) ^{2}}{\int _{0}^{1}{{N \atopwithdelims ()i+1}\pi ^{i+1}(1-\pi )^{N-i-1}Q(d\pi )}\int _{0}^{1}{{N \atopwithdelims ()i-1}\pi ^{i-1}(1-\pi )^{N-i+1}Q(d\pi )}}\\&\quad \quad =\frac{E_{Q}[\pi ^{i}(1-\pi )^{N-i}]^{2}}{E_{Q}[\pi ^{i+1}(1-\pi )^{N-i-1}]E_{Q}[\pi ^{i-1}(1-\pi )^{N-i+1}]} \cdot \frac{(i+1)(N-i+1)}{i(N-i)}\\&\quad \quad < \frac{(i+1)(N-i+1)}{i(N-i)}. \end{aligned}$$
The last line follows from the Cauchy–Schwarz inequality. \(\square \)
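As a numerical illustration of Proposition 5 (not part of the proof), the sketch below mixes Binomial\((N,\pi )\) laws over a two-point mixing distribution \(Q\) on \(\{0.3, 0.7\}\) (an arbitrary choice, for which the Cauchy–Schwarz inequality is strict) and confirms the bound on \(P(\gamma =i)^{2}/(P(\gamma =i+1)P(\gamma =i-1))\); \(N=9\) is an arbitrary choice.

```python
# Numerical illustration of Proposition 5 (not part of the proof).
# gamma is a de Finetti mixture of Binomial(N, pi) over a two-point
# mixing distribution Q on {0.3, 0.7} (arbitrary choice).  The
# Cauchy-Schwarz bound gives, for 1 <= i <= N-1,
#   P(g=i)^2 / (P(g=i+1) P(g=i-1)) < (i+1)(N-i+1) / (i(N-i)).
from math import comb

N = 9
Q = {0.3: 0.5, 0.7: 0.5}        # mixing distribution over pi

def p_gamma(i):
    return sum(q * comb(N, i) * pi**i * (1 - pi)**(N - i) for pi, q in Q.items())

for i in range(1, N):
    lhs = p_gamma(i) ** 2 / (p_gamma(i + 1) * p_gamma(i - 1))
    rhs = (i + 1) * (N - i + 1) / (i * (N - i))
    assert lhs < rhs
print("2nd-order bound of Proposition 5 holds for the two-point mixture")
```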