Kolmogorov's account of an absolute probability space presupposes a given Boolean algebra, and so does Rényi's account of a relative probability space. Anxious to prove probability theory ‘autonomous’, Popper supplied accounts of probability spaces of which Boolean algebras, and accounts of which fields, are not prerequisites but byproducts instead. I review the accounts in question, showing how Popper's arose and how they differ from Kolmogorov's and Rényi's, and I examine on closing Popper's notion of ‘autonomous independence’. So as not to interrupt the exposition, I allow myself in the main text but a few proofs, relegating others to the Appendix and indicating as I go along where in the literature the rest can be found.
Teddy Seidenfeld recently claimed that Kolmogorov's probability theory transgresses the Substitutivity Law. Underscoring the seriousness of Seidenfeld's charge, the author shows that (Popper's version of) the law, to wit:

If (∀D)(Pr(B,D) = Pr(C,D)), then Pr(A,B) = Pr(A,C),

follows from just

C1. 0 ≤ Pr(A,B) ≤ 1
C2. Pr(A,A) = 1
C3. Pr(A & B,C) = Pr(A,B & C) × Pr(B,C)
C4. Pr(A & B,C) = Pr(B & A,C)
C5. Pr(A,B & C) = Pr(A,C & B),

five constraints on Pr of the most elementary and basic sort.
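The constraints C1–C5 and the Substitutivity Law can be checked concretely in a toy finite model (my illustration, not from the paper): statements are read as events over a small finite space, and the two-place Pr(A,B) is taken as ordinary conditional probability |A ∩ B|/|B| under the uniform measure.

```python
from itertools import chain, combinations

# Toy model (illustration only): events over a four-point space; Pr(a, b)
# is conditional probability of a given b under the uniform measure.
WORLDS = frozenset(range(4))

def pr(a, b):
    # Pr(a given b); b is assumed non-empty throughout this sketch.
    return len(a & b) / len(b)

# All non-empty events over WORLDS.
pts = sorted(WORLDS)
EVENTS = [frozenset(s) for s in chain.from_iterable(
    combinations(pts, k) for k in range(1, len(pts) + 1))]

A, B, C = frozenset({0, 1}), frozenset({0, 2}), frozenset({0, 3})

assert 0 <= pr(A, B) <= 1                        # C1
assert pr(A, A) == 1                             # C2
assert pr(A & B, C) == pr(A, B & C) * pr(B, C)   # C3 (multiplication law)
assert pr(A & B, C) == pr(B & A, C)              # C4
assert pr(A, B & C) == pr(A, C & B)              # C5

# Substitutivity Law, checked by brute force over every non-empty event:
# if Pr(b,d) = Pr(c,d) for all d, then Pr(a,b) = Pr(a,c) for all a.
for b in EVENTS:
    for c in EVENTS:
        if all(pr(b, d) == pr(c, d) for d in EVENTS):
            assert all(pr(a, b) == pr(a, c) for a in EVENTS)
```

Of course this only exhibits a model satisfying the constraints; the paper's point is the stronger one that the law is derivable from C1–C5 alone.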
Consider a language SL having as its primitive signs one or more atomic statements, the two connectives ‘∼’ and ‘&’, and the two parentheses ‘(’ and ‘)’; and presume the extra connectives ‘V’ and ‘≡’ defined in the customary manner. With the statements of SL substituting for sets, and the three connectives ‘∼’, ‘&’, and ‘V’ substituting for the complementation, intersection, and union signs, the constraints that Kolmogorov places on probability functions come to read:

K1. 0 ≤ P(A)
K2. P(A V ∼A) = 1
K3. If ⊦ ∼(A & B), then P(A V B) = P(A) + P(B)
K4. If ⊦ A ≡ B, then P(A) = P(B).
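The set-theoretic reading of the connectives can be made concrete in a minimal sketch (my illustration, assuming the standard sentential forms of K1–K4): a statement of SL is modelled by its set of satisfying truth-value assignments, ‘∼’, ‘&’, and ‘V’ act as complement, intersection, and union, ⊦ A holds when every assignment satisfies A, and P(A) is taken as the fraction of assignments satisfying A, one admissible probability function among many.

```python
from itertools import product
from fractions import Fraction

# Two atomic statements p and q; an assignment is a pair of truth values.
ASSIGNMENTS = frozenset(product([True, False], repeat=2))

def P(stmt):
    # One admissible P: the fraction of assignments satisfying stmt.
    return Fraction(len(stmt), len(ASSIGNMENTS))

def neg(a):  return ASSIGNMENTS - a       # '~'  as complementation
def conj(a, b):  return a & b             # '&'  as intersection
def disj(a, b):  return a | b             # 'V'  as union
def valid(a):  return a == ASSIGNMENTS    # '|- A': A holds everywhere

p = frozenset(v for v in ASSIGNMENTS if v[0])   # models of atom p
q = frozenset(v for v in ASSIGNMENTS if v[1])   # models of atom q

assert P(p) >= 0                                    # K1
assert P(disj(p, neg(p))) == 1                      # K2
assert valid(neg(conj(p, neg(p))))                  # p and ~p incompatible,
assert P(disj(p, neg(p))) == P(p) + P(neg(p))       # so K3 applies
assert neg(neg(p)) == p and P(neg(neg(p))) == P(p)  # K4: |- p ≡ ~~p
```

The check for K4 exploits the fact that, in this model, provably equivalent statements have identical sets of satisfying assignments.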