On atom-swarming and Luce’s theorem for probabilistic beliefs

For qualitative probability spaces, monotone continuity and third-order atom-swarming are together sufficient for a unique countably additive probability measure representation that may have atoms (Mackenzie in Theor Econ 14:709–778, 2019). We provide a new proof by appealing to a theorem of Luce (Ann Math Stat 38:780–786, 1967), highlighting the usefulness of extensive measurement theory (Krantz et al. in Foundations of Measurement Volume I: Additive and Polynomial Representations. Academic Press, New York, 1971) for economists.


Introduction
Under what conditions may we represent beliefs about the relative likelihood of events using probabilities? At a high level, the basic axioms of qualitative probability are not sufficient (Kraft et al. 1959), the most prominent sufficient conditions imply that there are no atoms (Savage 1954; Villegas 1964), and the necessary and sufficient cancellation conditions are relatively complex (Chateauneuf 1985). That said, solvability (Luce 1967) and atom-swarming (Mackenzie 2019) offer two approaches for allowing atoms with simpler sufficient conditions. In this article, we investigate the relationship between these approaches by examining the logical connections between the following results for qualitative probabilities:
• monotone continuity guarantees that any probability measure representation is countably additive (Theorem V; Villegas 1964),
• Archimedeanity and solvability guarantee a probability measure representation (Theorem L; Luce 1967), and
• monotone continuity and third-order atom-swarming guarantee a countably additive probability measure representation (Theorem M; Mackenzie 2019).
In particular, we provide a new proof of Theorem M by proving that its hypotheses imply the hypotheses of Theorem L, then applying Theorem L and Theorem V. We compare the two proofs, highlighting the usefulness of extensive measurement theory (Krantz et al. 1971) for economists.

Model
A qualitative probability space is an ordered σ-algebra satisfying some basic probabilistic properties (Bernstein 1917; de Finetti 1937; Koopman 1940; Savage 1954):

Definition 1 Let (A, ⊇) be a σ-algebra with maximum S and minimum ∅, and let ⪰ be a binary relation on A. We say that (A, ⊇, ⪰) is a qualitative probability space if and only if
• ⪰ is complete and transitive;
• S ≻ ∅;
• for each A ∈ A, A ⪰ ∅; and
• for each triple A, B, C ∈ A such that A ∩ C = B ∩ C = ∅, we have A ⪰ B if and only if A ∪ C ⪰ B ∪ C.

We refer to members of A as events, we refer to the comparisons in ⪰ as beliefs, and we interpret A ⪰ B to mean that A is at least as likely as B. Whenever we refer to a generic qualitative probability space, we assume all of the above notation.
We are interested in axioms that guarantee that beliefs about the relative likelihood of events may be represented by probabilities:

Definition 2 Fix a qualitative probability space. A probability measure μ : A → [0, 1] is a representation of ⪰ if and only if for each pair A, B ∈ A, we have A ⪰ B if and only if μ(A) ≥ μ(B).

In particular, we are interested in such axioms that allow for atoms:

Definition 3 Fix a qualitative probability space. We say that an event α ∈ A is an atom if and only if
• α ≻ ∅; and
• for each B ∈ A such that B ⊆ α, we have B ∼ α or B ∼ ∅.
We let A • denote the collection of atoms.
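As a concrete finite illustration of Definitions 2 and 3 (entirely hypothetical, not from the paper): the following sketch builds the power-set σ-algebra on three states, defines beliefs through an assumed measure μ, and identifies the atoms.

```python
from itertools import chain, combinations

# A toy three-state space; the masses are hypothetical, chosen so
# that the induced beliefs have a representation with atoms.
S = frozenset({1, 2, 3})
mass = {1: 0.5, 2: 0.25, 3: 0.25}

def mu(A):
    """Probability of an event in the power-set sigma-algebra."""
    return sum(mass[s] for s in A)

def events(X):
    """All subevents of X."""
    xs = list(X)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

# Definition 2: by construction, mu represents the beliefs
# "A is at least as likely as B iff mu(A) >= mu(B)".
def at_least_as_likely(A, B):
    return mu(A) >= mu(B)

# Definition 3: alpha is an atom iff it is non-null and every
# subevent is equivalent either to alpha or to the empty set.
def is_atom(alpha):
    return mu(alpha) > 0 and all(
        mu(B) in (0.0, mu(alpha)) for B in events(alpha))

atoms = [A for A in events(S) if is_atom(A)]  # here: the singletons
```

Here every singleton is an atom, so on a finite space the atomless sufficient conditions mentioned in the introduction can never hold; this is the situation that atom-permitting axioms are designed to accommodate.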

Axioms and previous results
The following axioms are focal to our analysis:

Definition 4 A qualitative probability space satisfies
• monotone continuity if and only if for each A ∈ A and each (B_i) ∈ A^N such that B_1 ⊆ B_2 ⊆ · · · and, for each i ∈ N, A ⪰ B_i, we have A ⪰ ∪_{i∈N} B_i;
• Archimedeanity if and only if for each A ∈ A such that A ≻ ∅, there is no (A_i) ∈ A^N such that A_1 ∼ A and, for each i ∈ N, there are disjoint B, C ∈ A with B ∼ A_i, C ∼ A, and A_{i+1} ⪰ B ∪ C;
• solvability if and only if for each four A, B, C, D ∈ A such that A ∩ B = ∅, A ⪰ C, and B ⪰ D, there are C′, D′, E ∈ A such that (i) C′ ∩ D′ = ∅, (ii) C′ ∪ D′ ⊆ E, and (iii) C′ ∼ C, D′ ∼ D, and E ∼ A ∪ B;
• third-order atom-swarming if and only if for each α ∈ A•, there are I ⊆ N, pairwise-disjoint {B_i}_{i∈I} ⊆ A, and I_1, I_2, I_3 partitioning I such that (i) for each i ∈ I, α ≻ B_i, and (ii) ∪_{i∈I_1} B_i ⪰ α, ∪_{i∈I_2} B_i ⪰ α, and ∪_{i∈I_3} B_i ⪰ α.
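Third-order atom-swarming has a simple arithmetic shadow under any representation μ: since the B_i are pairwise disjoint, the three unions in condition (ii) force the swarm of strictly less likely events to cover the atom's probability three times over. A minimal numeric sanity check, with entirely hypothetical masses:

```python
# Hypothetical masses, chosen only to illustrate the inequalities in
# third-order atom-swarming; they come from no particular model.
alpha_mass = 0.10                    # mu(alpha): the atom's likelihood
B_masses = [0.04, 0.04, 0.04] * 3    # nine pairwise-disjoint events B_i

# (i) each B_i is strictly less likely than alpha
assert all(b < alpha_mass for b in B_masses)

# Partition the index set into I1, I2, I3 ...
I1, I2, I3 = B_masses[0:3], B_masses[3:6], B_masses[6:9]

# (ii) ... such that each union is at least as likely as alpha.
# For pairwise-disjoint events, the union's measure is the sum.
assert sum(I1) >= alpha_mass
assert sum(I2) >= alpha_mass
assert sum(I3) >= alpha_mass

# The atom together with its swarm still fits inside S.
assert alpha_mass + sum(B_masses) <= 1.0
```
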
The first axiom (due to Villegas 1964) is the standard continuity notion for qualitative probability spaces. The second axiom (due to Luce 1967) loosely states that if A is non-null, and if each A_i crudely acts as the union of i pairwise-disjoint events that are each equivalent to A, then there must be a maximum i with such an A_i; we remark that this axiom is necessary for probability measure representation. The third axiom (also due to Luce 1967) states that if a disjoint pair of events (A, B) dominates another pair of events (C, D), then the dominated pair is equivalent to a disjoint pair (C′, D′) that is contained in an event E that is equivalent to the union of the dominating pair. The final axiom (due to Mackenzie 2019) loosely states that if there are any atoms, then each is sufficiently swarmed by less-likely events.
The following three theorems concern these axioms and their implications for probability measure representations:

Theorem V (Villegas 1964) If a qualitative probability space satisfies monotone continuity, then any probability measure representation is countably additive.
Theorem L (Luce 1967) If a qualitative probability space satisfies Archimedeanity and solvability, then it has a unique probability measure representation.
Theorem M (Mackenzie 2019) If a qualitative probability space satisfies monotone continuity and third-order atom-swarming, then it has a unique countably additive probability measure representation.

Overview and discussion of new proof
Our contribution is to provide a new proof of Theorem M by exploring the logical relationships between monotone continuity, Archimedeanity, solvability, and third-order atom-swarming. In particular, we prove that the hypotheses of Theorem M imply the hypotheses of Theorem L, then apply Theorem L and Theorem V to conclude. See the Appendix for the proof.
Remarkably, the new proof involves essentially all the same techniques as the original, and even after the arguments diverge they share some high-level similarities. In particular, the original proof involves the construction of a supercabinet (a structured order-dense family of equivalence classes), and verifying that this construction satisfies all requirements involves a binary operation defined on certain pairs of representative members of the equivalence classes. In the new proof, we begin constructing the same supercabinet, but instead of completing and verifying the construction, we use the partial construction to prove that Luce's axioms hold; Luce's arguments then establish that the family of all non-null equivalence classes, together with a binary operation defined on certain pairs, forms an extensive system (without a maximal element) (Behrend 1955; Luce and Marley 1969). Though supercabinets and extensive systems are distinct structures, their associated binary operations are both closely related to the union of disjoint pairs of events. Despite the similarity between these structures, they were designed with different motivations. In particular,
• Supercabinets were developed specifically for guaranteeing countably additive probability measure representations. In particular, it is a classic result from utility theory that if a completely pre-ordered set has an order-dense family of equivalence classes, then for each topology for which upper and lower contour sets are closed, there is a continuous representation (Cantor 1895; Debreu 1954, 1964). A supercabinet is such a family with additional structure that guarantees the continuous representation is moreover a countably additive probability measure, and thus respects the algebraic structure of the events.
• Extensive systems were developed in the context of the broader theory of measuring extensive attributes, such as length and mass.
The general problem is to assign magnitudes to objects in a manner that respects the concatenation of pairs of objects, such as (i) assigning lengths to rods in a manner that respects the operation of adjoining two rods end-to-end, or (ii) assigning masses to collections of elements using a balance. Extensive systems were designed for problems where certain pairs cannot be concatenated, such as when mass must be measured with a fixed balance that cannot support collections of elements that are extremely large or massive.
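The "fixed balance" idea can be sketched with a toy partial concatenation; the capacity bound and masses below are hypothetical, chosen only to show how concatenation can fail to be defined for some pairs while the assigned magnitudes remain additive wherever it is defined.

```python
# A fixed balance of (hypothetical) capacity 10 units can only weigh
# collections up to that bound, so concatenation is a *partial*
# operation -- the defining feature of an extensive system of the
# kind described above.
CAPACITY = 10.0
objects = {"a": 2.0, "b": 3.0, "c": 6.0, "d": 9.5}  # assigned masses

def concat(x, y):
    """Partial concatenation: defined only when the balance can
    support the combined collection; otherwise returns None."""
    total = objects[x] + objects[y]
    return total if total <= CAPACITY else None

# Wherever concatenation is defined, the magnitudes are additive:
assert concat("a", "b") == objects["a"] + objects["b"]
# ... but some pairs cannot be concatenated at all:
assert concat("c", "d") is None
```
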
Though utility theory is familiar to most economists, extensive measurement theory can be just as useful for the same problems, and our new proof provides a novel illustration of this point. For an excellent overview of extensive measurement theory, see Krantz et al. (1971).

Appendix
The original proof of Theorem M involved showing that it is without loss of generality to restrict attention to qualitative probability spaces with a unique null event, which makes the unambiguous construction of events considerably more convenient, and we leave these arguments unchanged. To be precise, we provide a new proof of the following proposition:

Proposition M (Mackenzie 2019) If a qualitative probability space with a unique null event satisfies monotone continuity and third-order atom-swarming, then it has a unique countably additive probability measure representation.
To do so, we heavily use techniques and lemmas introduced in Mackenzie (2019), freely using all associated notation without reintroduction for brevity.
The new proof consists of five steps. In the first two steps, we construct events that 'should' be assigned certain probabilities and establish that they satisfy associated properties; as this is done in Mackenzie (2019), we simply sketch the key arguments here. In the original proof, these events were used to construct a supercabinet; here, they are used to establish Archimedeanity (Step 3). We then establish solvability using greedy transforms (Step 4), and finally conclude by applying Theorem L and Theorem V (Step 5).

Proposition M (Repeated) If a qualitative probability space with a unique null event satisfies monotone continuity and third-order atom-swarming, then it has a unique countably additive probability measure representation.
Proof Let (A, ⊇, ⪰) satisfy the hypotheses.
• Step 1: Construct events (A^1_q)_{q∈{0,1,...}} that 'should' be assigned the probabilities (1/2^q). This is established in the Supercabinet Construction Lemma of Mackenzie (2019), primarily in Step 1 and Step 8; we refer the reader there for all details. The notation is the same as that used in those arguments, and is meant to suggest that A^1_q is an event that 'should' be assigned the probability 1/2^q. We simply summarize the key arguments here.
We say that a non-null event is n-divisible if and only if the σ-algebra of its subevents satisfies n-AS. First, we prove that each non-null and 1-divisible event A contains a 'half event': an event H ⊆ A that is as likely as A\H. (We also prove that all such half events are equally likely.) To do so, we construct such a half event by arranging the atoms in A in descending order of likelihood, iteratively including atoms unless the constructed event would become more likely than its relative complement, and then carving out as much likelihood as possible (without making the construction more likely than its relative complement) from the largest subevent of A that contains no atoms.
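A purely numeric sketch of the half-event procedure just described, with masses standing in for likelihoods (hypothetical values; the actual construction operates on events and relies on 1-divisibility rather than on numbers):

```python
def greedy_half(atom_masses, atomless_mass):
    """Numeric sketch of the half-event construction: scan atoms in
    descending order, include one unless doing so would push the
    constructed event past half the total, then carve the remaining
    likelihood out of the atomless part. Illustrative only."""
    total = sum(atom_masses) + atomless_mass
    half, built = total / 2.0, 0.0
    for a in sorted(atom_masses, reverse=True):
        if built + a <= half:
            built += a
    # carve as much as needed (and possible) from the atomless part
    carve = min(half - built, atomless_mass)
    return built + carve

# Dyadic masses keep floating-point arithmetic exact. With enough
# atomless mass, the construction lands exactly on half the total:
assert greedy_half([0.375, 0.25], 0.375) == 0.5
assert greedy_half([0.5, 0.25], 0.25) == 0.5
```
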
We also prove that given an n-divisible event A (the 'parent') and a less likely event B (the 'target'), we can construct a subevent B′ ⊆ A such that (i) B′ ∼ B, and (ii) A\B′ is (n − 1)-divisible. To prove this, we construct such a subevent using a procedure analogous to the one described above, except that this time we take care to ensure that the construction is never more likely than the target event (instead of its relative complement). We refer to this procedure as the 'greedy transform' for A, denoting the output by G_A(B).

These first observations about greedy transforms suggest a natural generalization: given an n-divisible parent event A, we can accept n targets and iteratively apply greedy transforms to construct n pairwise-disjoint subevents of A that are as likely as the targets, provided they do not prematurely exhaust the parent. We prove that given a sequence of lists of n targets that is monotonic (in that the first targets are monotonic in likelihood, the second targets are monotonic in likelihood, though perhaps in the opposite direction, and so on), the associated sequence of lists of n outputs is convergent (in that the outputs of the first targets are convergent, the outputs of the second targets are convergent, and so on).
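The same greedy idea, now bounded by a target rather than by the relative complement, can be sketched numerically (hypothetical masses; the event-level transform G_A(B) additionally guarantees that the leftover parent is (n − 1)-divisible, which the numeric sketch does not model):

```python
def greedy_transform(atom_masses, atomless_mass, target):
    """Numeric sketch of the greedy transform: build up mass from the
    parent's atoms in descending order without ever exceeding the
    target, then top up from the atomless part. Illustrative only."""
    built = 0.0
    for a in sorted(atom_masses, reverse=True):
        if built + a <= target:
            built += a
    built += min(target - built, atomless_mass)
    return built

# Dyadic masses keep floating-point arithmetic exact; the output
# matches the target's likelihood whenever the parent suffices.
assert greedy_transform([0.25, 0.125], 0.5, 0.375) == 0.375
assert greedy_transform([0.25, 0.125], 0.5, 0.4375) == 0.4375
```
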
Altogether, these techniques allow us to complete the current step using only 2-AS. For each A^1_q, the event S\G_S(S\A^1_q) is 1-divisible, and thus contains the desired half H(A^1_q); we then repeat. To prove that (A^1_q) is convergent with null limit, we take the sequence of lists of two targets (A^1_q, A^1_q) and use greedy transforms to construct two convergent sequences (B_q) and (C_q) whose members are always disjoint and each as likely as the given A^1_q. It is easy to see that we always have B_{q+1} ∪ C_{q+1} ∼ A^1_q, and it follows that the limit of (A^1_q) is null.
• Step 2: Construct events A^p_q that 'should' be assigned the probabilities p/2^q. This is established in the Supercabinet Construction Lemma of Mackenzie (2019) in Step 2, Step 3, and Step 4. For each q ∈ {0, 1, . . .}, we define A^0_q ≡ ∅, and for each p ∈ {0, 1, . . . , 2^q − 1} we define A^{p+1}_q using a greedy transform; due to properties of the greedy transforms, this yields the same {A^1_q} defined before. We refer the reader to the original proof for details, remarking only that these arguments involve 3-AS.
• Step 3: Establish Archimedeanity. Assume for contradiction that Archimedeanity is violated. Then there are A* ∈ A such that A* ≻ ∅ and (A*_i) ∈ A^N such that for each i ∈ N, there are disjoint B, C ∈ A with B ∼ A*_i and C ∼ A* whose union A*_{i+1} crudely dominates. By Step 1, there is q ∈ {0, 1, . . .} such that A* ≻ A^1_q, so A*_1 ≻ A^1_q. By Step 2, for each p ∈ {1, 2, . . . , 2^q − 1} such that A*_p ≻ A^p_q, we have A*_{p+1} ≻ A^{p+1}_q. Thus we have A*_{2^q} ≻ A^{2^q}_q. But by Step 1 and Step 2, A^{2^q}_q = A^1_0 ∼ S, so A*_{2^q} ≻ S, contradicting monotonicity.
• Step 4: Establish solvability. Using greedy transforms, we construct the required events C′, D′, and E; it follows directly from 3-AS and our observations about greedy transforms that C′, D′, and E satisfy the desired properties.
• Step 5: Conclude. By Step 3 and Step 4, we can apply Theorem L; thus there is a unique probability measure representation. By Theorem V, this representation is countably additive.
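The arithmetic driving the Archimedean contradiction can be checked directly: under any representation, if μ(A*) ≥ 1/2^q, then a sequence whose i-th member is at least as likely as i disjoint copies of A* must reach probability 1 by index 2^q, leaving no room to continue strictly inside S. A minimal numeric check (the choice q = 3 is arbitrary):

```python
# Arithmetic behind the Archimedean argument, under a hypothetical
# representation mu: if mu(A*) >= 1/2**q, then the i-th member of the
# sequence carries mass at least i/2**q, which at i = 2**q already
# reaches mu(S) = 1 -- the sequence cannot keep growing inside S.
q = 3
step = 1 / 2**q                               # lower bound on mu(A*)
masses = [i * step for i in range(1, 2**q + 1)]

# the 2^q-th member is already at least as likely as S itself ...
assert masses[-1] >= 1.0
# ... while every earlier member still fits strictly inside S
assert all(m < 1.0 for m in masses[:-1])
```
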