
Exchangeability and the law of maturity

Published in Theory and Decision.

Abstract

The law of maturity is the belief that less-observed events are becoming mature and, therefore, more likely to occur in the future. Previous studies have shown that the assumption of infinite exchangeability contradicts the law of maturity. In particular, it has been shown that infinite exchangeability contradicts probabilistic descriptions of the law of maturity such as the gambler’s belief and the belief in maturity. We show that the weaker assumption of finite exchangeability is compatible with both the gambler’s belief and belief in maturity. We provide sufficient conditions under which these beliefs hold under finite exchangeability. These conditions are illustrated with commonly used parametric models.


References

  • Bernardo, J., & Smith, A. (1994). Bayesian theory. Wiley Series in Probability and Mathematical Statistics. New York: Wiley.

  • Brooks, R. J., James, W. H., & Gray, E. (1991). Modelling sub-binomial variation in the frequency of sex combinations in litters of pigs. Biometrics, 47, 403–417.

  • de Finetti, B. (1931). Funzione caratteristica di un fenomeno aleatorio. Atti della R. Accademia Nazionale dei Lincei, 6, 251–299.

  • Diniz, C. A., Tutia, M. H., Leite, J. G., et al. (2010). Bayesian analysis of a correlated binomial model. Brazilian Journal of Probability and Statistics, 24(1), 68–77.

  • Iglesias, P., Loschi, R., Pereira, C., & Wechsler, S. (2009). A note on extendibility and predictivistic inference in finite populations. Brazilian Journal of Probability and Statistics, 23(2), 216–226.

  • Kadane, J. B. (2014). Sums of possibly associated Bernoulli variables: The Conway–Maxwell–Binomial distribution. arXiv:1404.1856.

  • Kalra, A., & Shi, M. (2010). Consumer value-maximizing sweepstakes and contests. Journal of Marketing Research, 47(2), 287–300.

  • Lee, J., & Lio, Y. (1999). A note on Bayesian estimation and prediction for the beta-binomial model. Journal of Statistical Computation and Simulation, 63(1), 73–91.

  • Lindley, D., & Phillips, L. (1976). Inference for a Bernoulli process (a Bayesian view). The American Statistician, 30(3), 112–119.

  • Mendel, M. (1994). Operational parameters in Bayesian models. Test, 3(2), 195–206.

  • Militana, E., Wolfson, E., & Cleaveland, J. (2010). An effect of inter-trial duration on the gambler’s fallacy choice bias. Behavioural Processes, 84(1), 455–459.

  • O’Neill, B., & Puza, B. (2005). In defence of the reverse gambler’s belief. Mathematical Scientist, 30(1), 13–16.

  • Oppenheimer, D., & Monin, B. (2009). The retrospective gambler’s fallacy: Unlikely events, constructing the past, and multiple universes. Judgment and Decision Making, 4(5), 326–334.

  • Rabin, M., & Vayanos, D. (2010). The gambler’s and hot-hand fallacies: Theory and applications. Review of Economic Studies, 77(2), 730–778.

  • Rodrigues, F., & Wechsler, S. (1993). A discrete Bayes explanation of a failure-rate paradox. IEEE Transactions on Reliability, 42(1), 132–133.

  • Shmueli, G., Minka, T. P., Kadane, J. B., Borle, S., & Boatwright, P. (2005). A useful distribution for fitting discrete data: Revival of the Conway–Maxwell–Poisson distribution. Journal of the Royal Statistical Society: Series C (Applied Statistics), 54(1), 127–142.


Acknowledgments

Partially supported by CNPq and FAPESP (2003/10105-2). The authors thank Dani Gamerman, Jay Kadane, Carlos Pereira, Teddy Seidenfeld, Julio Stern and Robert Winkler for insightful remarks.

Author information

Correspondence to Rafael B. Stern.

Appendix: Proofs


Proof (Proposition 1)

Suppose that the coordinates of \(\mathtt X _{N}\) are jointly independent. Since \(\mathtt X _{N}\) is finitely exchangeable, its coordinates are identically distributed. Thus, since \(\gamma = \sum _{i=1}^{N}{X_{i}}\) and the \(X_{i}\) are i.i.d., conclude that \(\gamma \sim \text {Binomial}(N,P(X_{1}=1))\). Hence, under the assumption of independence, there exists \(\pi \in [0,1]\) such that \(\gamma \sim \text {Binomial}(N,\pi )\).

Also observe that, since \(\mathtt X _{N}\) is exchangeable, the distribution of \(\mathtt X _{N}\) is completely specified by the distribution of \(\gamma \). Hence, there exists a unique distribution on \(\mathtt X _{N}\) for each distribution on \(\gamma \). Since the i.i.d. Bernoulli\((\pi )\) model of the previous paragraph is exchangeable and induces \(\gamma \sim \text {Binomial}(N,\pi )\), this uniqueness implies that, if \(\gamma \sim \text {Binomial}(N,\pi )\), then the coordinates of \(\mathtt X _{N}\) are independent.

The proof of Proposition 1 follows from the implications proved in the two previous paragraphs. \(\square \)
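This equivalence can be checked by brute force. The sketch below is an illustration added here, not part of the paper; the values \(N=4\) and \(\pi =1/3\) are arbitrary choices. It enumerates \(\{0,1\}^{N}\) with exact rational arithmetic and confirms that the exchangeable law induced by \(\gamma \sim \text {Binomial}(N,\pi )\) coincides with jointly independent Bernoulli\((\pi )\) coordinates:

```python
from fractions import Fraction
from math import comb
from itertools import product

N = 4                # hypothetical small population size
pi = Fraction(1, 3)  # hypothetical success probability

# Exchangeable law of X_N induced by a pmf for gamma:
# P(X_N = x) = P(gamma = s) / C(N, s), where s = sum(x).
gamma_pmf = [comb(N, k) * pi**k * (1 - pi)**(N - k) for k in range(N + 1)]

def joint(x):
    s = sum(x)
    return gamma_pmf[s] / comb(N, s)

# Under a Binomial(N, pi) gamma, the coordinates are jointly independent:
for x in product([0, 1], repeat=N):
    indep = Fraction(1)
    for xi in x:
        indep *= pi if xi else 1 - pi
    assert joint(x) == indep
```

Because the computation is exact (no floating point), the equality holds term by term over all \(2^{N}\) sequences.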

Proof (Proposition 2)

In order to prove Proposition 2 we first show that the statement that \(\mathtt X _{N}\) models indifferent belief is equivalent to the joint independence of the coordinates of \(\mathtt X _{N}\). Observe that, by definition, the statement that \(\mathtt X _{N}\) models indifferent belief implies that the coordinates of \(\mathtt X _{N}\) are jointly independent. Also, since \(\mathtt X _{N}\) is finitely exchangeable, \(P(X_{i} = 1) = P(X_{j} = 1)\). Hence, joint independence of the coordinates of \(\mathtt X _{N}\) implies that \(\mathtt X _{N}\) models indifferent belief.

The proof of Proposition 2 follows from the equivalence that is proved in the previous paragraph and the direct application of Proposition 1. \(\square \)

Lemma 1

Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed. If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then

$$\begin{aligned} P(M=m|M \ge m)&> 1/2,\quad \text{ for }\quad m=2,\ldots , N \end{aligned}$$

Similarly, if \(\gamma \) is looser than the Binomial(\(N, 1/2\)), then

$$\begin{aligned} P(M=m|M \ge m)&< 1/2, \quad \text{ for } \quad m=2,\ldots , N \end{aligned}$$

Proof (Lemma 1)

Let \(t(k) = P(\gamma = k)/{N \atopwithdelims ()k}\). Observe that

$$\begin{aligned} P(M=m)={\displaystyle \sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1}t(k). \end{aligned}$$

Hence,

$$\begin{aligned}&P(M=m|M \ge m) = \frac{\sum _{k=1}^{N-m+1}{{N-m \atopwithdelims ()k-1}t(k)}}{t(0)+ \sum _{k=1}^{N-m+1}{\sum _{i=m}^{N-k+1}{{N-i \atopwithdelims ()k-1} t(k)}}}\\&=\frac{{\sum _{k=1}^{N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{t(0)+ {\sum _{k=1}^{N-m+1}} {N-m+1 \atopwithdelims ()k} t(k) } = \frac{{\sum _{k=1}^{N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{k=1}^{N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))}\\&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}}{N-m \atopwithdelims ()k-1} t(k) + \left[ {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)\right] \mathbf {I}_{(m < N/2 + 1)} }{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle \min (m-1,N-m+1)}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) + \left[ {\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))\right] \mathbf {I}_{(m < N/2 + 1)}}. \end{aligned}$$

Consider the case in which \(\gamma \) is tighter than the Binomial\((N,1/2)\). In order to prove the lemma, it is sufficient to show the following: (1) the first sum in the numerator divided by the first sum in the denominator is greater than 1/2, and (2) if \(m < N/2+1\), then the second sum in the numerator divided by the second sum in the denominator is greater than 1/2.

  1. If \(k < (N+1)/2\), then, since \(\gamma \) is tighter than the Binomial\((N,1/2)\), \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, for every \(1 \le k \le \min (m-1,N-m+1)\), \(\frac{{N-m \atopwithdelims ()k-1}t(k)}{{N-m \atopwithdelims ()k-1}(t(k)+t(k-1))} > 1/2\).

  2. Since \(m < N/2+1\),

    $$\begin{aligned}&\frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} (t(k)+t(k-1)) } \nonumber \\&\quad = \frac{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }}\left[ {N-m \atopwithdelims ()k-1} t(k)+{N-m \atopwithdelims ()N-k}t(N-k+1)\right] }{{\sum _{\scriptscriptstyle k=m}^{\scriptscriptstyle \lfloor N/2 \rfloor }} \left[ {N-m \atopwithdelims ()k-1} (t(k)+t(k-1))+{N-m \atopwithdelims ()N-k}(t(N-k+1)+t(N-k))\right] } \end{aligned}$$
    (1)

Notice that \({N-m \atopwithdelims ()k-1} = {N-m \atopwithdelims ()N-m-k+1}\). Also, since \(m \le N/2\) and \(k < (N+1)/2\), \({N-m \atopwithdelims ()N-k-m+1} > {N-m \atopwithdelims ()N-k}\). Since \(\gamma \) is symmetric, \(\frac{t(k)+t(N-k+1)}{t(k)+t(k-1)+t(N-k+1)+t(N-k)} = 1/2\). Also, since \(k < N/2+1\) and \(\gamma \) is tighter than the Binomial\((N,1/2)\), conclude that \(\frac{t(k)}{t(k)+t(k-1)} > 1/2\). Hence, in Eq. 1, the ratio between each term in the numerator and each term in the denominator is greater than 1/2.

When \(\gamma \) is looser than the Binomial\((N,1/2)\), \(\frac{t(k)}{t(k)+t(k-1)} < 1/2\). Hence, all the inequalities are reversed. \(\square \)
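Lemma 1 can be illustrated numerically. In the sketch below (added here as an illustration), the choice \(P(\gamma =k)\propto {N \atopwithdelims ()k}^{2}\) is one convenient symmetric distribution that is tighter than the Binomial\((N,1/2)\), since \(P(\gamma =k)/P(\gamma =k-1) = ((N-k+1)/k)^{2} > (N-k+1)/k\) for \(k \le N/2\). The conditional probabilities \(P(M=m\mid M\ge m)\) are computed exactly and all exceed \(1/2\):

```python
from fractions import Fraction
from math import comb

N = 6  # hypothetical horizon
# A symmetric gamma tighter than the Binomial(N, 1/2):
# P(gamma = k) proportional to C(N, k)^2.
Z = sum(comb(N, k)**2 for k in range(N + 1))
t = [Fraction(comb(N, k), Z) for k in range(N + 1)]  # t(k) = P(gamma=k)/C(N,k)

def hazard(m):
    # P(M = m | M >= m): probability the first success occurs at trial m,
    # given that no success occurred before trial m
    num = sum(comb(N - m, k - 1) * t[k] for k in range(1, N - m + 2))
    den = sum(comb(N - m + 1, k) * t[k] for k in range(0, N - m + 2))
    return num / den

for m in range(2, N + 1):
    assert hazard(m) > Fraction(1, 2)  # Lemma 1, tighter case
```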

Lemma 2

If \(\gamma \) is tighter than the Binomial\((N,1/2)\), then the gambler’s belief holds. If \(\gamma \) is looser than the Binomial\((N,1/2)\), then the reverse gambler’s belief holds.

Proof (Lemma 2)

Without loss of generality, assume that the number of \(0\)’s in \(\mathtt x _{n}\) is larger than the number of \(1\)’s. Since the model is exchangeable, for any permutation \(\pi \) of \(\{1,\ldots ,n\}\), \(P(\gamma =\gamma _{0}|\mathtt X _{n}=\mathtt x _{n}) = P(\gamma =\gamma _{0}|\mathtt X _{n}=\mathtt x _{\pi })\), where \(\mathtt x _{\pi }\) denotes \(\mathtt x _{n}\) with its coordinates permuted by \(\pi \). Consider a permutation \(\pi \) and \(\mathtt y = \mathtt x _{\pi }\) such that, for some \(a\), \(\mathtt y _{1}^{a}\) has an equal number of \(0\)’s and \(1\)’s, and \(\mathtt y _{a+1}^{n}\) has only \(0\)’s. Let \(\gamma ^{*} = \sum _{i=a+1}^{N}{X_{i}}\).

$$\begin{aligned} P(X_{n+1}=1|\mathtt x )&= P(X_{n+1}=1|\mathtt y ) \\&= \sum _{i=0}^{N}{P(X_{n+1}=1|\mathtt y _{a+1}^{n},\gamma ^{*}=i)P(\gamma ^{*}=i|\mathtt y _{1}^{a},\mathtt y _{a+1}^{n})}. \end{aligned}$$

That is, \(P(X_{n+1}=1|\mathtt x )\) is equal to \(P(X_{n+1}=1|\mathtt y _{a+1}^{n})\) using \(P(\gamma ^{*}=i|\mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\). Observe that

$$\begin{aligned} P(\gamma ^{*}=i|y_{1}^{a})&\propto P(\gamma ^{*}=i,y_{1}^{a}) \\&= P(\gamma =i+a/2) {i+a/2 \atopwithdelims ()a/2}{N-i-a/2 \atopwithdelims ()a/2} \end{aligned}$$

The last equality follows since \(y_{1}^{a}\) has the same number of \(1\)’s and \(0\)’s. Hence, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(\gamma ^{*}|y_{1}^{a}\) is tighter (looser) than the Binomial\((N-a,1/2)\). Using \(P(\gamma ^{*}=i|\mathtt y _{1}^{a})\) as a prior for \(\gamma ^{*}\), conclude from Lemma 1 that, if \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\), then \(P(X_{n+1}=1|\mathtt y _{a+1}^{n}) > (<)\ 1/2\). \(\square \)
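Lemma 2 can also be checked directly, reading the gambler's belief as \(P(X_{n+1}=1\mid \mathtt x _{n})>1/2\) whenever the observed \(0\)'s outnumber the \(1\)'s. The sketch below is illustrative: it uses the hypothetical symmetric distribution \(P(\gamma =k)\propto {N \atopwithdelims ()k}^{2}\), which is tighter than the Binomial\((N,1/2)\), and computes the exact predictive probabilities:

```python
from fractions import Fraction
from math import comb

N = 6  # hypothetical horizon
# Hypothetical tighter gamma: P(gamma = k) proportional to C(N, k)^2
Z = sum(comb(N, k)**2 for k in range(N + 1))
t = [Fraction(comb(N, k), Z) for k in range(N + 1)]  # t(k) = P(gamma=k)/C(N,k)

def predictive(n, s):
    # P(X_{n+1} = 1 | a sequence of n trials with s successes):
    # counts the completions of the observed prefix for each value of gamma
    num = sum(t[k] * comb(N - n - 1, k - s - 1) for k in range(s + 1, N + 1))
    den = sum(t[k] * comb(N - n, k - s) for k in range(s, N + 1))
    return num / den

# Gambler's belief: a success is favoured once failures outnumber successes
for n in range(1, N):
    for s in range((n + 1) // 2):  # all s with s < n - s
        assert predictive(n, s) > Fraction(1, 2)
```

Note that `math.comb(n, k)` returns 0 when `k > n`, which handles the vanishing terms in the sums automatically.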

Lemma 3

Assume the distribution of \(\gamma \) is symmetric around \(N/2\). If the (reverse) gambler’s belief holds, then \(\gamma \) is tighter (looser) than the Binomial\((N,1/2)\).

Proof (Lemma 3)

Let \(\mathtt x _{N-1} \in \{0,1\}^{N-1}\) and let \(s_{N-1}\) be defined by \(s_{N-1}-1 = \sum _{i=1}^{N-1}{x_{i}}\); that is, \(s_{N-1}\) is one more than the number of successes in \(\mathtt x _{N-1}\).

$$\begin{aligned}&P(X_{N}=1|\mathtt X _{N-1}=\mathtt x _{N-1}) \\&\quad \quad = \frac{P(X_{N}=1,\mathtt X _{N-1}=\mathtt x _{N-1})}{P(X_{N}=0,\mathtt X _{N-1}=\mathtt x _{N-1})+P(X_{N}=1,\mathtt X _{N-1}=\mathtt x _{N-1})} \\&\quad \quad = \frac{P(\gamma =s_{N-1})/{N \atopwithdelims ()s_{N-1}}}{P(\gamma =s_{N-1}-1)/{N \atopwithdelims ()s_{N-1}-1} +P(\gamma =s_{N-1})/{N \atopwithdelims ()s_{N-1}}} \\&\quad \quad = \frac{1}{1 + \frac{P(\gamma =s_{N-1}-1)}{P(\gamma =s_{N-1})} \frac{N-s_{N-1}+1}{s_{N-1}}}. \end{aligned}$$

If the gambler’s belief holds, then \(P(X_{N}=1|\mathtt X _{N-1}=\mathtt x _{N-1}) > \frac{1}{2}\), for every \(s_{N-1} \le \frac{N}{2}\). Hence, for every \(s_{N-1} \le \frac{N}{2}\), \(\frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} > \frac{N-s_{N-1}+1}{s_{N-1}}\). Since the distribution of \(\gamma \) is symmetric around \(N/2\), conclude that \(\gamma \) is tighter than the Binomial\((N,1/2)\). Similarly, if the reverse gambler’s belief holds, then \(\frac{P(\gamma =s_{N-1})}{P(\gamma =s_{N-1}-1)} < \frac{N-s_{N-1}+1}{s_{N-1}}\), and \(\gamma \) is looser than the Binomial\((N,1/2)\). \(\square \)

Proof (Theorem 1)

Follows from Lemmas 2 and 3. \(\square \)

Proof (Proposition 3)

Since the distribution of de Finetti’s parameter is exchangeable, the distribution of \(\gamma \) is symmetric with respect to \(N/2\). Hence, it remains to show that \(P(\gamma =i)/P(\gamma =i-1) < (N-i+1)/i\), for \(1 \le i \le N/2\). First, observe that, for every \(0 \le \pi \le 1\) such that \(\pi \ne 0.5\) and \(n \ge 0\), it follows that \((\pi ^{n+1}-(1-\pi )^{n+1})(\pi -(1-\pi )) > 0\). Hence, developing this expression, \(\pi (1-\pi )^{n+1} + (1-\pi )\pi ^{n+1} < \pi ^{n+2} + (1-\pi )^{n+2}\). Hence, for \(i \le N/2\),

$$\begin{aligned} \frac{\pi ^{i}(1-\pi )^{N-i} + (1-\pi )^{i}\pi ^{N-i}}{\pi ^{i-1}(1-\pi )^{N-i+1} + (1-\pi )^{i-1}\pi ^{N-i+1}} < 1 \end{aligned}$$
(2)

Next, since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (de Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(0 \le i \le N\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Thus,

$$\begin{aligned} \frac{P(\gamma =i)}{P(\gamma =i-1)}&= \frac{\int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}}{\int _{0}^{1}{{N \atopwithdelims ()i-1} \pi ^{i-1}(1-\pi )^{N-i+1}Q(d\pi )}}\\&= \frac{\int _{0}^{0.5}{{N \atopwithdelims ()i} (\pi ^{i}(1-\pi )^{N-i}+(1-\pi )^{i}\pi ^{N-i})Q(d\pi )}}{\int _{0}^{0.5}{{N \atopwithdelims ()i-1} (\pi ^{i-1}(1-\pi )^{N-i+1}+(1-\pi )^{i-1}\pi ^{N-i+1}) Q(d\pi )}}\\&< \frac{{N \atopwithdelims ()i}}{{N \atopwithdelims ()i-1}} = \frac{N-i+1}{i}. \end{aligned}$$

The last inequality follows from Eq. 2. \(\square \)
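Proposition 3 can be illustrated with a beta-binomial example (an illustration added here; the Beta\((2,2)\) mixing distribution is an arbitrary symmetric choice). Under a symmetric Beta prior on de Finetti's parameter, \(\gamma \) follows a beta-binomial law, and the sketch verifies that its successive odds grow more slowly than the binomial odds \((N-i+1)/i\):

```python
from fractions import Fraction
from math import comb, factorial

N = 6  # hypothetical horizon
a = 2  # hypothetical symmetric Beta(a, a) mixing distribution

def beta_fn(x, y):
    # Beta function B(x, y) for positive integer arguments, exact
    return Fraction(factorial(x - 1) * factorial(y - 1), factorial(x + y - 1))

# Beta-binomial pmf for gamma, up to the common normalising constant:
pmf = [comb(N, i) * beta_fn(i + a, N - i + a) for i in range(N + 1)]

# Looser than Binomial(N, 1/2): the odds of one more success grow
# more slowly than the binomial odds (N - i + 1)/i, for i <= N/2
for i in range(1, N // 2 + 1):
    assert pmf[i] / pmf[i - 1] < Fraction(N - i + 1, i)
```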

Proof (Theorem 2)

Let \(M = \min \{i \le N: X_{i}=1\}\) be the first trial in which a success is observed and \(r(m) = P(M=m|M \ge m)\). In order to verify belief in maturity, one must show that \(r(m)\) is increasing in \(m\). Let \(t(k) = P(\gamma =k)/{N \atopwithdelims ()k}\). Using the same development as in the proof of Lemma 1,

$$\begin{aligned} r(m)&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}}{N-m \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m+1}} {N-m+1 \atopwithdelims ()k} t(k)} \nonumber \\&= \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)+{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m}} {N-m \atopwithdelims ()k} t(k)+{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} t(k)}. \end{aligned}$$
(3)

Observe that, in Eq. 3, the first sum in the numerator divided by the first sum in the denominator is equal to \(r(m+1)\). Therefore, to obtain \(r(m+1) > r(m)\), it is sufficient to prove the following:

$$\begin{aligned} \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=0}^{\scriptscriptstyle N-m}} {N-m \atopwithdelims ()k} t(k)} > \frac{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m+1}} {N-m \atopwithdelims ()k-1} t(k)}, \end{aligned}$$

which is equivalent to

$$\begin{aligned} \frac{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}}{N-m-1 \atopwithdelims ()k-1} t(k)}{{\sum _{\scriptscriptstyle k=1}^{\scriptscriptstyle N-m}} {N-m-1 \atopwithdelims ()k-1} (t(k)+t(k-1))} > \frac{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}}{N-m-1 \atopwithdelims ()k-2} t(k)}{{\sum _{\scriptscriptstyle k=2}^{\scriptscriptstyle N-m+1}} {N-m-1 \atopwithdelims ()k-2} (t(k)+t(k-1))}. \end{aligned}$$
(4)

If \(\gamma \) is 2nd-order tighter than the Binomial, then, for every \(1 \le k \le N-1\),

$$\begin{aligned} \frac{P(\gamma =k+1)/P(\gamma =k)}{P(\gamma =k)/P(\gamma =k-1)} < \frac{(N-k)/(k+1)}{(N-k+1)/k}. \end{aligned}$$

Hence, for every \(1 \le k \le N-1\),

$$\begin{aligned} \frac{t(k)}{t(k)+t(k-1)} > \frac{t(k+1)}{t(k+1)+t(k)} \end{aligned}$$
(5)

Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then Eq. 5 holds and, therefore, Eq. 4 also holds. Hence, if \(\gamma \) is 2nd-order tighter than the Binomial, then belief in maturity holds. If \(\gamma \) is 2nd-order looser than the Binomial, then the proof follows by reversing the inequality in Eq. 5. \(\square \)
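Theorem 2 can be checked numerically. The sketch below is illustrative: the choice \(P(\gamma =k)\propto {N \atopwithdelims ()k}^{2}\) is hypothetical, and the code first verifies that this distribution is 2nd-order tighter than the Binomial, then confirms that the hazard \(r(m)\) is strictly increasing, i.e., that belief in maturity holds:

```python
from fractions import Fraction
from math import comb

N = 6  # hypothetical horizon
# Hypothetical gamma: P(gamma = k) proportional to C(N, k)^2
Z = sum(comb(N, k)**2 for k in range(N + 1))
P = [Fraction(comb(N, k)**2, Z) for k in range(N + 1)]
t = [P[k] / comb(N, k) for k in range(N + 1)]  # t(k) = P(gamma=k)/C(N,k)

# 2nd-order tightness: successive odds ratios fall faster than binomial ones
for k in range(1, N):
    lhs = (P[k + 1] / P[k]) / (P[k] / P[k - 1])
    rhs = Fraction(N - k, k + 1) / Fraction(N - k + 1, k)
    assert lhs < rhs

def r(m):
    # r(m) = P(M = m | M >= m), the hazard of the first success
    num = sum(comb(N - m, k - 1) * t[k] for k in range(1, N - m + 2))
    den = sum(comb(N - m + 1, k) * t[k] for k in range(0, N - m + 2))
    return num / den

# Belief in maturity: the hazard increases with the waiting time m
for m in range(1, N):
    assert r(m + 1) > r(m)
```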

Proof (Proposition 5)

Since \(\mathtt X _{n}\) can be extended to an infinitely exchangeable sequence \(\mathtt X \), one can apply de Finetti’s representation theorem (de Finetti 1931) to \(\gamma \). That is, there exists a distribution \(Q\) on \([0,1]\) such that, for every \(0 \le i \le N\), \(P(\gamma = i) = \int _{0}^{1}{{N \atopwithdelims ()i} \pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\). Hence,

$$\begin{aligned}&\frac{P(\gamma =i)^{2}}{P(\gamma =i+1)P(\gamma =i-1)}\\&\quad \quad = \frac{\left( \int _{0}^{1}{{N \atopwithdelims ()i}\pi ^{i}(1-\pi )^{N-i}Q(d\pi )}\right) ^{2}}{\int _{0}^{1}{{N \atopwithdelims ()i+1}\pi ^{i+1}(1-\pi )^{N-i-1}Q(d\pi )}\int _{0}^{1}{{N \atopwithdelims ()i-1}\pi ^{i-1}(1-\pi )^{N-i+1}Q(d\pi )}}\\&\quad \quad =\frac{E_{Q}[\pi ^{i}(1-\pi )^{N-i}]^{2}}{E_{Q}[\pi ^{i+1}(1-\pi )^{N-i-1}]E_{Q}[\pi ^{i-1}(1-\pi )^{N-i+1}]} \cdot \frac{(i+1)(N-i+1)}{i(N-i)}\\&\quad \quad < \frac{(i+1)(N-i+1)}{i(N-i)}. \end{aligned}$$

The last line follows from the Cauchy–Schwarz inequality. \(\square \)
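The bound in Proposition 5 can be illustrated with a beta-binomial example (a hypothetical symmetric Beta\((2,2)\) prior on de Finetti's parameter; any non-degenerate choice would serve). The sketch checks the inequality exactly for every interior \(i\):

```python
from fractions import Fraction
from math import comb, factorial

N = 6  # hypothetical horizon
a = 2  # hypothetical symmetric Beta(a, a) de Finetti prior

def beta_fn(x, y):
    # Beta function B(x, y) for positive integer arguments, exact
    return Fraction(factorial(x - 1) * factorial(y - 1), factorial(x + y - 1))

# Beta-binomial pmf for gamma, up to the common normalising constant:
pmf = [comb(N, i) * beta_fn(i + a, N - i + a) for i in range(N + 1)]

# P(gamma=i)^2 / (P(gamma=i+1) P(gamma=i-1)) < (i+1)(N-i+1) / (i(N-i))
for i in range(1, N):
    lhs = pmf[i]**2 / (pmf[i + 1] * pmf[i - 1])
    rhs = Fraction((i + 1) * (N - i + 1), i * (N - i))
    assert lhs < rhs
```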


Cite this article

Bonassi, F.V., Stern, R.B., Peixoto, C.M. et al. Exchangeability and the law of maturity. Theory Decis 78, 603–615 (2015). https://doi.org/10.1007/s11238-014-9441-4

