Towards an axiomatic approach to truth discovery

The problem of truth discovery, i.e., of trying to find the true facts concerning a number of objects based on reports from various information sources of unknown trustworthiness, has received increased attention recently. The problem is made interesting by the fact that the relative believability of facts depends on the trustworthiness of their sources, which in turn depends on the believability of the facts the sources report. Several algorithms for truth discovery have been proposed, but their evaluation has mainly been performed experimentally by computing accuracy against large datasets. Furthermore, it is often unclear how these algorithms behave on an intuitive level. In this paper we take steps towards a framework for truth discovery which allows comparison and evaluation of algorithms based instead on their theoretical properties. To do so we pose truth discovery as a social choice problem, and formulate various axioms that any reasonable algorithm should satisfy. Along the way we provide an axiomatic characterisation of the baseline 'Voting' algorithm (which leads to an impossibility result showing that a certain combination of the axioms cannot hold simultaneously) and check which axioms a particular well-known algorithm satisfies. We find that, surprisingly, some of our more fundamental axioms do not hold, and we propose modifications to the algorithms to partially fix these problems.


Introduction
There is an increasing amount of data available in today's world, particularly from the web, social media platforms and crowdsourcing systems. The openness of such platforms makes it simple for a wide range of users to share information quickly and easily, potentially reaching a wide international audience. It is inevitable that amongst this abundance of data there are conflicts, where data sources disagree on the truth regarding a particular object or entity. For example, low-quality sources may mistakenly provide erroneous data for topics on which they lack expertise.

This paper is an extended version of our previous work [35].

* Joseph Singleton singletonj1@cardiff.ac.uk; Richard Booth boothr2@cardiff.ac.uk
Resolving such conflicts and determining the true facts is therefore an important task. Truth discovery has emerged as a set of techniques to achieve this by considering the trustworthiness of sources [7,19,27]. The general principle is that true facts are those claimed by trustworthy sources, and trustworthy sources are those that claim believable facts. Application areas include real-time traffic navigation [14], drug side-effect discovery [30], crowdsourcing and social sensing [29,38,47].
For a simple example of a situation where trust can play an important role in conflict resolution, consider the following.

Example 1.1 Let o and p represent two images for which crowdsourcing workers are asked to provide labels (in the truth discovery terminology, o and p are called objects). Consider workers (the data sources) s, t, u and v who put forward potential labels f, g for o, and h, i for p, as shown in Fig. 1; such potential answers are termed facts. In the graphical representation, sources, facts and objects are shown from left to right, and the edges indicate claims made by sources and the objects to which facts relate.
Without considering trust information, the label for o appears to be a tie, with options f and g each receiving one vote, from sources s and t respectively.
Taking a trust-aware approach, however, we can look beyond object o to consider the trustworthiness of s and t. Indeed, when it comes to object p, t agrees with two extra sources u and v, whereas s disagrees with everyone. In principle there could be hundreds of extra sources here instead of just two, in which case the effect would be even more striking. We may conclude that s is less trustworthy than t. Returning to o, we see that g is supported by a more trustworthy source, and conclude that it should be accepted over f.
Many truth discovery algorithms have been proposed in the literature with a wide range of techniques used, e.g. iterative heuristic-based methods [17,34], probabilistic models [45], maximum likelihood estimation and optimisation-based methods [28], and neural network models [24,31,39]. It is common for such algorithms to be evaluated empirically by running them against real-world or synthetic datasets for which the true facts are already known; this allows accuracy and other metrics to be calculated, and permits comparison between algorithms (see [37] for a systematic empirical evaluation of this kind). This may be accompanied by some theoretical analysis, such as calculating run-time complexity [19], proving convergence of an iterative algorithm [46], or proving convergence to the 'true' facts under certain assumptions on the distribution of source trustworthiness [18,41,42].
A limitation of this kind of analysis is that the results apply only narrowly to particular algorithms, due to the assumptions made (for instance, that claims from sources follow a particular probability distribution). Such assumptions can be problematic in domains where the desired truth is somewhat 'fuzzy'; for example, image classification problems and determining the copyright status of books. In this work we take first steps towards a more general approach, in which we aim to study truth discovery without reference to any specific methodology or probabilistic framework. To do so we note the similarities between truth discovery and problems such as judgment aggregation [15], voting theory [50], and ranking and recommendation systems [1–3, 36], in which the axiomatic approach of social choice has been successfully applied. In taking the axiomatic approach one aims to formulate axioms that encode intuitively desirable properties that an algorithm may possess. The interaction between these axioms can then be studied; typical results include impossibility results, where it is shown that a set of axioms cannot hold simultaneously, and characterisation results, where it is shown that a set of axioms is uniquely satisfied by a particular algorithm.
Such analysis brings a new normative perspective to the truth discovery literature. This complements empirical evaluation: in addition to seeing how well an algorithm performs in practice on test datasets, one can check how well it does against theoretical properties that any 'reasonable' algorithm should satisfy. The satisfaction (or failure) of such properties then sheds new light on the intuitive behaviour of an algorithm, and may guide the development of new ones.
With this in mind, we develop a simplified framework for truth discovery in which axioms can be formulated, and go on to give both an impossibility result and an axiomatic characterisation of a baseline voting algorithm. We also analyse the class of recursive truth discovery algorithms, which includes many existing examples from the literature. In particular, we analyse the well-known algorithm Sums [34] with respect to the axioms.
However, as a first step towards a social choice perspective of truth discovery, our framework involves a number of simplifying assumptions not commonly made in the truth discovery literature.
• Manipulation and collusion. Some of our axioms assume sources are not manipulative: they provide claims in good faith, and do not aim to misinform or artificially improve their standing with respect to the truth discovery algorithm. We also assume sources act independently, i.e. they do not collude with or copy one another.
• Object correlations. We do not model correlations between the objects of interest in the truth discovery problem. For example, in a crowdsourcing setting it may be known in advance that two objects o and p are similar, so that the true labels for o and p are correlated; this cannot be expressed in our framework.
• Ordinal outputs. For the most part, the outputs of our truth discovery methods consist of rankings of the sources and facts. Thus, we describe when a source is considered more trustworthy than another, but do not assign precise numerical values representing trustworthiness. This breaks with tradition in the truth discovery literature, but is a common point of view in social choice theory.
At first glance these are strong assumptions, and they rule out potential applications of our version of truth discovery. However, we argue that the problem is non-trivial even in this simplified setting, and that interesting axioms can still be put forth. The framework as set out here lays the groundwork for these assumptions to be lifted in future work.

The paper is organised as follows. Our framework is introduced and formally defined in the next section. Section 3 provides examples of truth discovery algorithms from the literature expressed in the framework. In Sect. 4 we formally introduce the axioms and present an impossibility result showing a subset of these cannot all be satisfied simultaneously. The examples of Sect. 3 are then revisited in Sect. 5, where we analyse them with respect to the axioms and propose modifications to resolve some axiom failures. In Sect. 6 we extend the framework to allow variable domains of sources, facts and objects, and give an impossibility result similar to that of Sect. 4. We discuss the axioms in Sect. 7 and related work in Sect. 8. We conclude in Sect. 9. Missing proofs are given in Appendix A.

An idealised framework for truth discovery
In this section we define our formal framework, which provides the key definitions required for theoretical discussion and analysis of truth discovery methods.
For most of the paper, we consider a fixed domain of finite and mutually disjoint sets S , F and O , called the sources, facts and objects respectively. All definitions and axioms will be stated with respect to these sets.

Truth discovery networks
A core definition of the framework is that of a truth discovery network, which represents the input to a truth discovery problem. We model this as a tripartite graph with certain constraints on its structure, in keeping with approaches taken throughout the truth discovery literature [19, 45].

Definition 2.1 A truth discovery network (TD network) is a directed graph N = (V, E) with V = S ∪ F ∪ O and E ⊆ (S × F) ∪ (F × O) , such that:

1. For each f ∈ F there is a unique o ∈ O with (f , o) ∈ E . That is, each fact is associated with exactly one object.
2. For s ∈ S and o ∈ O , there is at most one directed path from s to o. That is, sources cannot claim multiple facts for a single object.
3. (S × F) ∩ E is non-empty. That is, at least one claim is made.
We will say that s claims f when (s, f ) ∈ E . Let N denote the set of all TD networks. Figure 1 (page 2) provides an example of a TD network. Note that there is no requirement that a source makes a claim for every object, or even that a source makes any claims at all. This reflects the fact that truth discovery datasets are in practice extremely sparse, i.e. each individual source makes few claims. Conversely, we allow for facts that receive no claims from any sources.
Also note that the object associated with a fact f is not fixed across all networks. This is because we view facts as labels for information that sources may claim, not the pieces of information themselves. Similarly, we consider objects simply as labels for real-world entities. Whilst a particular piece of information has a fixed entity to which it pertains, the labels do not.

A special case of our framework is the binary case in which every object has exactly two associated facts. This brings us close to the setting studied in judgment aggregation [15] and, specifically (since sources do not necessarily claim a fact associated to every object), to the setting of binary aggregation with abstentions [9, 11]. An important difference, however, is that for simplicity we do not assume any constraints on the possible configurations of true facts across different objects. That is, any combination of facts is feasible. In judgment aggregation such an assumption has the effect of neutralising the impossibility results that arise in that domain (see, e.g., [9]). We shall see that that is not the case in our setting.
To simplify the notation in what follows, for a network N = (V, E) we write N (s) = {f ∈ F ∶ (s, f ) ∈ E} for the set of facts claimed by a source s, and N (f ) = {s ∈ S ∶ (s, f ) ∈ E} for the set of sources claiming a fact f.

Truth discovery operators
Having defined the input to a truth discovery problem, the output must be defined. Contrary to many approaches in the truth discovery literature which output numeric trust scores for sources and belief scores for facts [17, 34, 45, 47–49], we consider the primary output to be rankings of the sources and facts. To the extent that we do consider numeric scores, it is only to induce a ranking. This is because we are chiefly interested in ordinal properties rather than quantitative values. Indeed, for the theoretical analysis we wish to perform it is only important that one source is more trustworthy than another; the particular numeric scores produced by an algorithm are irrelevant.
Moreover, the scores produced by existing algorithms may have no semantic meaning [34], and so referring to numeric values is not meaningful when comparing across algorithms. In this case it is only the rankings of sources and facts that can be compared, which is further motivation for our choice. This point of view is also common across the social choice literature.
However, numerical scores do provide valuable information for comparing sources and facts given a fixed algorithm. For example, the magnitude of the difference in trust scores for sources s and t tells us something about confidence: a small difference indicates low confidence in distinguishing s and t, even if one is ranked above the other, whereas a large difference indicates high confidence. In this sense our decision to deal primarily with ordinal outputs (and ordinal axioms) is another simplifying assumption compared to typical truth discovery settings.
For a set X, let L(X) denote the set of all total preorders on X, i.e. the set of transitive, reflexive and complete binary relations on X. We define a truth discovery operator as a function which maps networks to rankings of sources and facts.

Definition 2.2 An ordinal truth discovery operator T (hereafter TD operator) is a mapping assigning to each network N ∈ N a pair of relations (⊑ T N , ⪯ T N ) , where ⊑ T N is a total preorder on S and ⪯ T N is a total preorder on F .
Intuitively, the relation ⊑ T N is a measure of source trustworthiness in the network N according to T, and ⪯ T N is a measure of fact believability; s 1 ⊑ T N s 2 means that source s 2 is at least as trustworthy as source s 1 , and f 1 ⪯ T N f 2 means fact f 2 is at least as believable as fact f 1 . The notation ⊏ T N and ≃ T N will be used to denote the strict and symmetric orders induced by ⊑ T N respectively. For fact rankings, ≺ T N and ≈ T N are defined similarly. Note that for simplicity the fact ranking ⪯ T N compares all facts, even those associated with different objects in N.
To capture existing truth discovery methods we introduce numerical operators, which assign each source a numeric trust score and each fact a belief score.
Definition 2.3 A numerical TD operator T assigns to each network N ∈ N a pair of score functions T N ∶ S → ℝ and T N ∶ F → ℝ . For s ∈ S , T N (s) is the trust score for s in the network N according to T; for f ∈ F , T N (f ) is the belief score for f. The set of all numerical TD operators will be denoted by T Num .
Note that any numerical operator T naturally induces an ordinal operator: set s 1 ⊑ T N s 2 if and only if T N (s 1 ) ≤ T N (s 2 ) , and f 1 ⪯ T N f 2 if and only if T N (f 1 ) ≤ T N (f 2 ) .

It is worth noting that yet other truth discovery algorithms output neither rankings nor numeric scores for facts, but only a single 'true' fact for each object [10, 28, 43]. This is also the approach taken in judgment aggregation, where an aggregation rule selects which formulas are to be taken as true. In the case of finitely many possible facts, such algorithms can be modelled in our framework as numerical operators where T N (f ) = 1 for each identified 'true' fact f, and T N (g) = 0 for all other facts g. To go in the reverse direction and obtain the 'true' facts according to an operator, one may simply select, for each object, the set of its associated facts that rank maximally.
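The induction of an ordinal operator from numeric scores can be sketched in a few lines of Python; the scores below are illustrative values only, not the output of any particular algorithm.

```python
# Inducing a ranking from numeric scores: x is weakly below y exactly when
# its score is no larger. The strict and symmetric parts (⊏/≺ and ≃/≈)
# are derived from the weak relation in the standard way.

def weakly_below(scores, x, y):
    return scores[x] <= scores[y]

def strictly_below(scores, x, y):
    return weakly_below(scores, x, y) and not weakly_below(scores, y, x)

def equivalent(scores, x, y):
    return weakly_below(scores, x, y) and weakly_below(scores, y, x)

# Illustrative trust scores for four sources:
trust = {"s": 0.0, "t": 1.0, "u": 0.7, "v": 0.7}
assert strictly_below(trust, "s", "t")   # s ⊏ t
assert equivalent(trust, "u", "v")       # u ≃ v
```

Because ties in score become equivalences in the ranking, the induced relation is automatically a total preorder.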

Examples of truth discovery operators
Our framework can capture some operators that have been proposed in the truth discovery literature. In this section we provide two concrete examples: Voting, which is a simple approach commonly used as a baseline method, and Sums [34]. We go on to outline the class of recursive operators (of which Sums is an instance), which contains many more examples from the literature.

Voting
In Voting, we consider each source to cast 'votes' for the facts they claim, and facts are ranked according to the number of votes received. Clearly this method disregards the source trustworthiness aspect of truth discovery, as a vote from one source carries as much weight as a vote from any other. As such, Voting cannot be considered a serious contender for truth discovery. It is nonetheless useful as a simple baseline method against which to compare more sophisticated methods.

Definition 3.1
Voting is the numerical operator defined as follows: for any network N ∈ N , s ∈ S and f ∈ F , T N (s) = 1 and T N (f ) = | N (f )|.
Consider the network N shown in Fig. 1. Facts f, g and h each receive one vote, whereas i receives 3. The fact ranking induced by Voting is therefore f ≈ g ≈ h ≺ i . On the other hand, all sources receive a trust score of 1 and therefore rank equally.
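As a sketch, Voting on the network of Fig. 1 can be computed directly; the edge set below is transcribed from the running example.

```python
# Voting on the Fig. 1 network: each source casts one vote per claimed fact.
claims = {"s": {"f", "h"}, "t": {"g", "i"}, "u": {"i"}, "v": {"i"}}
facts = ["f", "g", "h", "i"]

trust = {s: 1 for s in claims}  # every source receives trust score 1
belief = {fa: sum(1 for s in claims if fa in claims[s]) for fa in facts}

assert belief == {"f": 1, "g": 1, "h": 1, "i": 3}  # induced ranking: f ≈ g ≈ h ≺ i
assert len(set(trust.values())) == 1               # all sources rank equally
```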

Sums
Sums [34] is a simple and well-known operator adapted from the Hubs and Authorities algorithm [21] for ranking web pages. The algorithm operates iteratively and recursively, assigning each source and fact a sequence of scores, with the final scores taken as the limit of that sequence.
Initially, scores are fixed at a constant value of 1/2. The trust score for each source is then updated by summing the belief score of its associated facts. Similarly, belief scores are updated by summing the trust scores of the associated sources. To prevent these scores from growing without bound as the algorithm iterates, they are normalised at each iteration by dividing each trust score by the maximum across all sources (belief scores are normalised similarly).
Expressed in our framework, we have that if T is the (numerical) operator giving the scores at iteration n, then the pre-normalisation scores at iteration n + 1 are given by T ′ , where

T ′ N (s) = ∑ f ∈ N (s) T N (f ) ,  T ′ N (f ) = ∑ s ∈ N (f ) T N (s) .  (3.1)

Consider again the network N shown in Fig. 1. It can be shown that, with T denoting the limiting scores from Sums with normalisation, we have T N (s) = 0 , T N (t) = 1 , and T N (u) = T N (v) = √2∕2 . The induced ranking of sources is therefore s ⊏ u ≃ v ⊏ t. For fact scores, we have T N (f ) = 0 , T N (g) = √2 − 1 , T N (h) = 0 and T N (i) = 1 , so the ranking is f ≈ h ≺ g ≺ i . Note that fact g fares better under Sums than under Voting, due to its association with the highly-trusted source t.
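These limiting scores can be verified numerically. Below is a sketch of the Sums iteration (simultaneous update with max-normalisation, as described above) on the Fig. 1 network; the iteration count is simply chosen large enough for this small example to converge.

```python
import math

# Sums on the Fig. 1 network: a source's trust is the sum of its claimed
# facts' beliefs, a fact's belief is the sum of its claimants' trusts,
# and both score vectors are normalised by their maximum each round.
claims = {"s": ["f", "h"], "t": ["g", "i"], "u": ["i"], "v": ["i"]}
facts = ["f", "g", "h", "i"]
claimants = {fa: [s for s in claims if fa in claims[s]] for fa in facts}

trust = {s: 0.5 for s in claims}    # initial scores fixed at 1/2
belief = {fa: 0.5 for fa in facts}

for _ in range(200):  # plenty of iterations for convergence here
    new_trust = {s: sum(belief[fa] for fa in claims[s]) for s in claims}
    new_belief = {fa: sum(trust[s] for s in claimants[fa]) for fa in facts}
    m_t, m_b = max(new_trust.values()), max(new_belief.values())
    trust = {s: v / m_t for s, v in new_trust.items()}
    belief = {fa: v / m_b for fa, v in new_belief.items()}

assert abs(trust["t"] - 1.0) < 1e-9
assert abs(trust["u"] - math.sqrt(2) / 2) < 1e-9   # u and v converge to √2/2
assert abs(belief["g"] - (math.sqrt(2) - 1)) < 1e-9
assert trust["s"] < 1e-9                            # s's trust tends to 0
```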

Recursive truth discovery operators
The iterative and recursive aspect of Sums is intended to produce the desired mutual dependence between trust and belief scores: namely, that sources claiming high-belief facts are seen as trustworthy, and vice versa. In fact, this recursive approach is near universal across the truth discovery literature (see for instance [14,17,28,44,48,49]). As such it is appropriate to identify the class of recursive operators as an important subset of T Num . To make a formal definition we first define an iterative operator.

Definition 3.2 An iterative operator is a sequence (T n ) n∈ℕ of numerical operators. An iterative operator is said to converge to a numerical operator T * if lim n→∞ T n N (z) = T * N (z) for all networks N and z ∈ S ∪ F . In that case the iterative operator can be identified with the ordinal operator induced by its limit T * .

Note that it is possible that an iterative operator (T n ) n∈ℕ converges for only a subset of networks. In that case we can consider (T n ) n∈ℕ to converge to a 'partial operator' and identify it with the induced partial ordinal operator; that is, a partial function N → L(S) × L(F) . Recursive operators can now be defined as those iterative operators where T n+1 can be obtained from T n .

Definition 3.3 An iterative operator (T n ) n∈ℕ is said to be recursive if there is a function U ∶ T Num → T Num such that T n+1 = U(T n ) for all n ∈ ℕ .

In this context the mapping U ∶ T Num → T Num is called the update function, and the initial operator T 1 is called the prior operator. For a prior operator T and update function U, we write (T, U) for the associated recursive operator; that is, T 1 = T and T n+1 = U(T n ) .
Returning to Sums, we see that Eq. (3.1) defines a mapping T Num → T Num and consequently an update function U Sums . The normalisation step can be considered a separate update function U Norm , which maps any numerical operator T to T ′ , where

T ′ N (s) = T N (s) ∕ max s′ ∈ S T N (s ′ ) ,  T ′ N (f ) = T N (f ) ∕ max f ′ ∈ F T N (f ′ ) .

It can then be seen that Sums is the recursive operator (T fixed , U Norm ◦ U Sums ) , where T fixed is the prior operator assigning the constant score 1∕2 to every source and fact. Many other existing algorithms proposed in the literature can also be realised as recursive operators in the framework, such as Investment, PooledInvestment [34], TruthFinder [45], LDT [48] and others.

Axioms for truth discovery
Having laid out the formal framework, we now introduce axioms for truth discovery. To start with, we consider axioms which encode a desirable theoretical property that we believe any 'reasonable' operator T should satisfy. Several properties of this nature can be obtained by adapting existing axioms from the social choice literature (e.g. from voting [8], ranking systems [2,36] and judgment aggregation [15]) to our framework.
However, the correspondence between truth discovery and classical social choice problems, such as voting, has its limits. To show this, we translate the famous Independence of Irrelevant Alternatives (IIA) axiom [4] to our setting, and argue that it is actually an undesirable property. Indeed, it will be seen that this translated axiom, in combination with two basic desirable axioms, forces Voting-like behaviour in every network, which is undesirable for the reasons given in Sect. 3.1. Furthermore, a slight strengthening of the IIA axiom completely characterises the fact ranking component of Voting. These results formalise the intuition that truth discovery's consideration of source trustworthiness leads to fundamental differences from classical social choice problems.
Afterwards, we will revisit the specific operators of the previous section to check which axioms are satisfied.

Coherence
As mentioned previously, a guiding principle of truth discovery is that sources claiming highly believed facts should be seen as trustworthy, and that facts backed by highly trusted sources should be seen as believable.
Whilst this intuition is difficult to formalise in general, it is possible to do so in particular cases where there are obvious means by which to compare the sets of facts claimed by two sources (and vice versa). This situation is considered in the axiomatic analysis of ranking and reputation systems under the name Transitivity [2,36], and we adapt it to truth discovery in this section. A preliminary definition is required.

Definition 4.1 Let T be a TD operator, N be a TD network and Y, Y ′ ⊆ F . We say Y is less believable than Y ′ with respect to N and T if there is a bijection σ ∶ Y → Y ′ such that y ⪯ T N σ(y) for all y ∈ Y , with y ≺ T N σ(y) for at least one y ∈ Y . For X, X ′ ⊆ S we define X less trustworthy than X ′ with respect to N and T in a similar way.
In plain English, Y less believable than Y ′ means that the facts in each set can be paired up in such a way that each fact in Y ′ is at least as believable as its counterpart in Y, and at least one fact in Y ′ is strictly more believable. Now, consider a situation where N (s 1 ) is less believable than N (s 2 ) . In this case the intuition outlined above tells us that s 2 provides 'better' facts, and should thus be seen as more trustworthy than s 1 . A similar idea holds if N (f 1 ) is less trustworthy than N (f 2 ) for some facts f 1 , f 2 . We state this formally as our first axiom.

Axiom 1 (Coherence) For any network N, any s 1 , s 2 ∈ S and any f 1 , f 2 ∈ F : (i) if N (s 1 ) is less believable than N (s 2 ) with respect to N and T, then s 1 ⊏ T N s 2 ; and (ii) if N (f 1 ) is less trustworthy than N (f 2 ) with respect to N and T, then f 1 ≺ T N f 2 .
Coherence can be broken down into two sub-axioms: Source-Coherence, where the first implication regarding source rankings is satisfied; and Fact-Coherence, where the second implication is satisfied. We take Coherence to be a fundamental desirable axiom for TD operators.
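For intuition, the 'less believable than' relation of Definition 4.1 can be checked by brute force in the special case where the fact ranking is induced by numeric belief scores and the two sets have equal size (so the pairing is a bijection). The belief scores below are illustrative values resembling the Fig. 1 example; the function name is ours.

```python
from itertools import permutations

# Brute-force check of 'Y less believable than Y2' (Definition 4.1), for
# rankings induced by a numeric score: some pairing must be weakly
# improving everywhere and strictly improving somewhere.

def less_believable(Y, Y2, score):
    if len(Y) != len(Y2):
        return False
    for paired in permutations(Y2):
        weak = all(score[a] <= score[b] for a, b in zip(Y, paired))
        strict = any(score[a] < score[b] for a, b in zip(Y, paired))
        if weak and strict:
            return True
    return False

# Illustrative belief scores; s claims {f, h} and t claims {g, i}, so
# Source-Coherence would demand s ⊏ t here.
belief = {"f": 0.0, "g": 0.41, "h": 0.0, "i": 1.0}
assert less_believable(["f", "h"], ["g", "i"], belief)
assert not less_believable(["g", "i"], ["f", "h"], belief)
```

Brute force over permutations is exponential, but suffices for the small examples used throughout the paper.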

Symmetry
Our next axiom requires that rankings of sources and facts should not depend on their 'names', but only on the structure of the network. To state it formally, we need a notion of when two networks are essentially the same but use different names. Call a permutation π of S ∪ F ∪ O type-preserving if it maps sources to sources, facts to facts and objects to objects, and for a network N write π(N) for the network obtained by relabelling each node z of N as π(z) .

Axiom 2 (Symmetry) For any network N and any type-preserving permutation π : s 1 ⊑ T N s 2 if and only if π(s 1 ) ⊑ T π(N) π(s 2 ) , and f 1 ⪯ T N f 2 if and only if π(f 1 ) ⪯ T π(N) π(f 2 ) .
In the theory of voting in social choice, Symmetry as above is expressed as two axioms: Anonymity, where output is insensitive to the names of voters, and Neutrality, where output is insensitive to the names of alternatives [50]. Analogous axioms are also used in judgment aggregation.
Symmetry can also be broken down into sub-axioms in which the above need only hold for permutations satisfying some condition: Source-Symmetry (where π must leave facts and objects fixed) and Fact-Symmetry (where π leaves sources and objects fixed). For truth discovery we have the additional notion of objects, and thus Object-Symmetry can be defined similarly.

Fact ranking axioms
Next, we introduce axioms that dictate the ranking of particular facts in cases where there is an 'obvious' ordering. Unanimity and Groundedness express the idea that if all sources are in agreement about the status of a fact, then an operator should respect this in its verdict. Two obvious ways in which sources can be in agreement are when all sources believe a fact is true, and when none believe a fact is true.
Axiom 3 (Unanimity) For any network N and fact f with N (f ) = S : g ⪯ T N f for any other fact g ∈ F .

Axiom 4 (Groundedness) For any network N and fact f with N (f ) = ∅ : f ⪯ T N g for any other fact g ∈ F .

That is, f cannot do better than to be claimed by all sources when T satisfies Unanimity, and cannot do worse than to be claimed by none when T satisfies Groundedness. Unanimity here is a truth discovery rendition of the same axiom in judgment aggregation, and can also be compared to the weak Paretian property in voting [8]. Groundedness is a version of the same axiom studied in the analysis of collective annotation [25].
The next axiom is a monotonicity property, which states that if f receives extra support from a new source s, then its ranking should receive a strictly positive boost. Note that we do not make any judgement on the new ranking of s.

Axiom 5 (Monotonicity) Let N be a network, f ∈ F , and let s ∈ S be a source making no claim for the object associated with f in N. Write E for the set of edges in N, and let N ′ be the network in which s claims f; i.e. the network with edge set E ∪ {(s, f )} . Then for any fact g ≠ f : g ⪯ T N f implies g ≺ T N ′ f .

Note that the axioms in this section assume sources do not have 'negative' trust levels. That is, we assume that support from even the most untrustworthy source still has a positive effect on the believability of a fact. Consequently, these axioms are not suitable in the presence of knowledgeable but malicious sources who always claim false facts. Indeed, otherwise a fact claimed only by a 'negative' source should rank strictly worse than a fact with no sources, but this goes against Groundedness. Similarly, receiving extra support from a negative source should worsen a fact's ranking, contrary to Monotonicity. Moreover, Monotonicity implicitly assumes sources act independently, i.e. they do not collude with one another. While these assumptions may appear somewhat strong, we argue that this 'simple' case, with no 'negative' sources or collusion, is already non-trivial and permits interesting axiomatic analysis. We therefore view Unanimity, Groundedness and Monotonicity as desirable properties for TD operators.
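To illustrate the mechanics of Monotonicity, here is a small check that Voting delivers the required strict boost on one hypothetical two-fact network. This is a worked example of the axiom's conclusion, not a proof that Voting satisfies it in general.

```python
# Hypothetical network: facts f and g are tied under Voting; a new claim
# for f should strictly boost f past g (Monotonicity's conclusion).

def voting_belief(claims, facts):
    # Voting belief score: number of sources claiming each fact.
    return {fa: sum(1 for s in claims if fa in claims[s]) for fa in facts}

facts = ["f", "g"]
claims = {"s1": {"f"}, "s2": {"g"}}   # one vote each: g ⪯ f in N
before = voting_belief(claims, facts)
assert before["g"] <= before["f"]

claims["s3"] = {"f"}                  # new source s3 claims f, giving N′
after = voting_belief(claims, facts)
assert after["g"] < after["f"]        # g ≺ f in N′: a strict boost
```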

Independence axioms
We now come to exploring the differences between truth discovery and other social choice problems via independence axioms. In voting, this takes the form of Independence of Irrelevant Alternatives (IIA), which requires that the ranking of two alternatives A and B depends only on the individual assessments of A and B, not on some 'irrelevant' alternative C.
An analogous truth discovery axiom states that the ranking of facts f 1 and f 2 for some object o depends only on the claims relating to o. Intuitively, this is not a desirable property. Indeed, we have already seen in Example 1.1 that the claims for object p in the network from Fig. 1 can play an important role in determining the ranking of f and g for object o, but the adapted IIA axiom precludes this.
This undesirability can be made precise. First, we must state the axiom formally.

Axiom 6 (Per-object independence, POI) For any networks N and N ′ containing exactly the same claims for an object o, and any facts f 1 , f 2 associated with o in both networks: f 1 ⪯ T N f 2 if and only if f 1 ⪯ T N ′ f 2 .

Considering Fig. 1 again, POI implies that the ranking of f and g remains the same if the claims for h and i are removed. But in this case, Symmetry implies f ≈ g . Similarly, the ranking of h and i remains the same if the claims for f and g are removed; in this case, Symmetry and Monotonicity together imply h ≺ i . This observation forms the basis of the following result, which formalises the undesirability of POI: in the presence of our less controversial requirements of Symmetry and Monotonicity, it forces Voting-like behaviour among the facts associated with each object o ∈ O . We note that, for the special case of binary networks, similar results have been shown in the literature on binary aggregation with abstentions [9].

Theorem 4.1 Let T be any operator satisfying Symmetry, Monotonicity and POI. Then for any network N, any object o ∈ O and any facts f 1 , f 2 associated with o: f 1 ⪯ T N f 2 if and only if |N (f 1 )| ≤ |N (f 2 )| .
Proof (sketch) We will sketch the main ideas of the proof here with some technical details omitted; see appendix A for the full proof. Let N be a network, o an object, and f 1 , f 2 facts associated with o, and suppose for contradiction that the stated equivalence fails: say |N (f 1 )| > |N (f 2 )| but f 1 ⪯ T N f 2 . Consider N ′ obtained by removing from N all claims for objects other than o. By POI, we still have f 1 ⪯ T N ′ f 2 . Next consider N ′′ , in which enough claims for f 1 are removed that f 1 and f 2 have equally many claimants, and all other claims remain. Let π be the permutation which swaps f 1 with f 2 and swaps each source in N ′′ (f 1 ) with one in N ′′ (f 2 ) ; then π(N ′′ ) = N ′′ , and Symmetry of T gives f 1 ≈ T N ′′ f 2 . Restoring the removed claims for f 1 one at a time and applying Monotonicity at each step, we get f 1 ≻ T N ′ f 2 and the required contradiction. ◻

Recall that Coherence formalises the idea that source-trustworthiness should inform the fact ranking, and vice versa. Clearly Voting does not conform to this idea, and in fact even the object-wise voting patterns in Theorem 4.1 are incompatible with Coherence. This can easily be seen in the network in Fig. 1 where, by Theorem 4.1, we have f ≈ T N g and h ≺ T N i , so that N (s) = {f , h} is less believable than N (t) = {g, i} . If Coherence held this would give s ⊏ T N t , but then N (f ) = {s} is less trustworthy than N (g) = {t} , giving f ≺ T N g , a contradiction. From this discussion and Theorem 4.1 we obtain as a corollary the following first impossibility result for truth discovery.

Theorem 4.2 There is no TD operator satisfying Coherence, Symmetry, Monotonicity and POI.
Given that Theorem 4.1 characterises the fact ranking of Voting for facts relating to a single object, it is natural to ask if there is a stronger form of independence that guarantees this behaviour across all facts. As our next result shows, the answer is yes, and the necessary axiom is obtained by ignoring the role of objects altogether for fact ranking.

Axiom 7 (Strong Independence) For any networks N, N ′ and facts f 1 , f 2 ∈ F such that N (f 1 ) = N ′ (f 1 ) and N (f 2 ) = N ′ (f 2 ) : f 1 ⪯ T N f 2 if and only if f 1 ⪯ T N ′ f 2 .
That is, the ranking of two facts f 1 and f 2 is determined solely by the sources claiming f 1 and f 2 . In particular, the fact-object affiliations and claims for facts other than f 1 , f 2 are irrelevant when deciding on f 1 versus f 2 . Note that Strong Independence implies POI. We have the following result.

Theorem 4.3 Suppose |O| ≥ 3 . Then an operator T satisfies Strong Independence, Monotonicity and Symmetry if and only if for any network N and any facts f 1 , f 2 ∈ F : f 1 ⪯ T N f 2 if and only if |N (f 1 )| ≤ |N (f 2 )| .
Theorem 4.3 can be seen as a characterisation of the class of TD operators that rank facts in the same way as Voting. The proof is similar to that of Theorem 4.1, but uses a different transformation to obtain a modified network N ′ in the first step.
We have established that neither POI nor Strong Independence is a satisfactory axiom for truth discovery, and a weaker independence property is required. Figure 1 can help us once again in this regard. Whereas POI and Strong Independence would say that facts h and i are irrelevant to f, the argument with Coherence for Theorem 4.2 suggests otherwise, due to the indirect links via the sources. We therefore propose that only when there is no (undirected) path between two nodes can we consider them to be truly irrelevant to each other. That is, nodes are relevant to each other iff they lie in the same connected component of the network. Our final rendering of independence states that the ordering of two facts in the same connected component does not depend on any claims outside of the component, and similarly for sources.

Axiom 8 (Per-component independence, PCI) Let N and N ′ be networks sharing a connected component C. Then for any sources s 1 , s 2 in C: s 1 ⊑ T N s 2 if and only if s 1 ⊑ T N ′ s 2 ; and for any facts f 1 , f 2 in C: f 1 ⪯ T N f 2 if and only if f 1 ⪯ T N ′ f 2 .
In analogy with Source/Fact Coherence and Source/Fact Symmetry, it is possible to split the two requirements of PCI into sub-axioms Source-PCI (in which only the constraint on source ranking is imposed) and Fact-PCI (in which only the fact ranking is constrained).
Note that while our framework can be easily adapted to require by definition that a network is itself connected (and therefore has only one connected component), we have found that datasets with multiple connected components do indeed occur in practice. 7 This means that failure of PCI is a real issue, and consequently we consider PCI to be another core axiom that all reasonable operators should satisfy.

Satisfaction of the axioms
With the axioms formally defined, we can now consider whether they are satisfied by the example operators of Sect. 3. Voting can be analysed outright; for Sums we require some preliminary results giving sufficient conditions for iterative and recursive operators to satisfy various axioms. It will be seen that neither Voting nor Sums satisfies all our desirable axioms, but it is possible to modify each operator to gain some improvement with respect to the axioms.

Voting
As the simplest operator, we consider Voting first. The following theorem shows that all axioms except Coherence are satisfied. Since Coherence is a fundamental principle of truth discovery, and we actually consider POI and Strong Independence to be undesirable, this formally rules out Voting as a viable operator. The proof is straightforward, and is deferred to appendix A. Note that once Symmetry, Monotonicity and POI are shown, the fact that Voting fails Coherence follows from our impossibility result (Theorem 4.2), and Fig. 1 serves as an explicit counterexample.
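As a reference point, the Voting operator is essentially a vote count: facts are scored by the number of sources claiming them, and all sources are left tied. A minimal sketch, again assuming networks are given as sets of (source, fact) pairs:

```python
def voting(claims):
    """Baseline 'Voting' operator: each fact is scored by the number of
    sources claiming it; all sources receive the same (flat) trust score."""
    belief, trust = {}, {}
    for s, f in claims:
        belief[f] = belief.get(f, 0) + 1
        trust[s] = 0  # every source is ranked equally
    return trust, belief
```

The flat trust assignment makes the failure of Coherence visible at a glance: no matter how the believability of claimed facts differs between two sources, their trust scores coincide.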

Iterative and recursive operators
In this section we give sufficient conditions for iterative and recursive operators to satisfy various axioms. These results will be useful in what follows when analysing Sums, although they may also be applied more generally to other operators.
Coherence. To analyse whether the limit of a recursive operator satisfies Coherence, we consider how the update function U behaves when the difference in belief scores between the facts of s 1 and s 2 is 'small' (and similarly for the sources of f 1 , f 2 ). To that end, we introduce a numerical variant of the notion of a set of facts Y being 'less believable' than Y ′ .
For X, X ′ ⊆ S , we define X being (ε, δ)-less trustworthy than X ′ similarly.
This generalises Definition 4.1 by relaxing the requirement that f ⪯ T N π(f ) , and instead requiring that f can be more believable than π(f ) only by some threshold ε > 0 . Definition 4.1 is recovered in the limiting case ε → 0 . We obtain a sufficient condition on the update function U for a recursive operator to satisfy Source-Coherence.
Lemma 5.1 For any prior operator T prior , (T prior , U) satisfies Source-Coherence if the following condition is satisfied: there exist C, D > 0 such that for all networks N and numerical operators T, whenever N (s 1 ) is (ε, δ)-less believable than N (s 2 ) with respect to T, the updated trust scores of s 1 and s 2 under U(T) are separated by a margin depending on C, D, ε and δ .

The proof of Lemma 5.1 uses the following result, the proof of which is a straightforward application of the definition of the limit.

Lemma 5.2 Let N be a truth discovery network and (T n ) n∈ℕ be a convergent iterative operator.
Analogous statements for source rankings also hold.
Proof (Lemma 5.1) Let N be a network. Suppose U has the stated property and that (T prior , U) = (T n ) n∈ℕ converges to T * . Suppose N (s 1 ) is less believable than N (s 2 ) with respect to N and T * under a bijection π . We must show that s 1 ⊏ T * N s 2 . The second part of Lemma 5.2 therefore applies; let δ be as given there.
For each f ∈ N (s 1 ) , we may apply Lemma 5.2 with f , π(f ) and ε̄ = ε∕C to get that there is K ∈ ℕ such that the required inequalities hold for all n ≥ K . In other words, N (s 1 ) is (ε, δ)-less believable than N (s 2 ) with respect to T n for all sufficiently large n. Since D is positive and does not depend on ε , we get s 1 ⊏ T * N s 2 by Lemma 5.2. This shows that T * satisfies Source-Coherence. ◻ A similar result gives conditions under which Fact-Coherence is satisfied.

Lemma 5.3 For any prior operator T prior , (T prior , U) satisfies Fact-Coherence if the following condition is satisfied: there exist C, D > 0 such that for all networks N and numerical operators T, whenever N (f 1 ) is (ε, δ)-less trustworthy than N (f 2 ) with respect to T ′ = U(T) , the updated belief scores of f 1 and f 2 are separated accordingly.
Proof The proof proceeds in an identical way to Lemma 5.1; the only difference is that we may simply take K ′ = K in the final step. ◻

Note that there is an asymmetry between Lemmas 5.1 and 5.3: in the condition on U in Lemma 5.1 we have N (s 1 ) (ε, δ)-less believable than N (s 2 ) with respect to T, whereas in Lemma 5.3 the corresponding condition is with respect to T ′ = U(T) . This reflects the manner in which Sums and other TD operators are typically defined: source trust scores are updated based on the fact scores of the previous iteration, whereas fact belief scores are updated based on the (new) trust scores in the current iteration.
Also note that the above results still hold if U has the stated property only for 'small' ε ; that is, if there is a constant 0 < ε 0 < 1 such that the property holds for all δ and for all ε < ε 0 .

Symmetry and PCI. When considering either Symmetry or PCI for an iterative operator (T n ) n∈ℕ , it is not enough to know that each T n satisfies the relevant axiom. The following example illustrates this fact for Symmetry.

Example 5.1
Fix some f ∈ F , and define an iterative operator as follows: each T n is a modification of Voting in which we boost the score of all facts tied with f under Voting by 1 − 1∕(n+1) . Since this additional weight is (strictly) less than 1 for each n, the ordinal operator induced by T n is simply Voting, and therefore satisfies Symmetry. However, it is easy to see that the limit operator T * has T * N (f ) = | N (f )| + 1 ; this means T * uses extra information beyond the structure of the network N in its ranking (namely, the identity of the selected fact f), which violates Symmetry.
Using a similar tactic, one can construct a sequence of numerical operators (T n ) n∈ℕ such that each T n satisfies PCI, but the limit operator T * does not.
Fortunately, there is a natural strengthening of both Symmetry and PCI for numerical operators which is preserved in the limit. Let us say that a numerical operator T satisfies numerical Symmetry if for any equivalent networks N, σ(N) the numerical scores themselves are preserved under σ ; numerical PCI is defined analogously in terms of scores rather than rankings. Clearly numerical Symmetry implies Symmetry, and numerical PCI implies PCI. The following result is immediate.

Lemma 5.4 Suppose (T n ) n∈ℕ converges to T * . Then
• If T n satisfies numerical Symmetry for each n ∈ ℕ , then T * satisfies Symmetry.
• If T n satisfies numerical PCI for each n ∈ ℕ , then T * satisfies PCI.
As a consequence of Lemma 5.4, any recursive operator (T prior , U) satisfies Symmetry whenever T prior satisfies numerical Symmetry and U preserves numerical Symmetry, in the sense that U(T) satisfies numerical Symmetry whenever T does (and similarly for PCI).
Unanimity, Groundedness and Monotonicity. In contrast to Symmetry and PCI, both Unanimity and Groundedness are preserved when taking the limit of an iterative operator.
Lemma 5.5 Suppose (T n ) n∈ℕ converges to T * . Then
• If T n satisfies Unanimity for each n ∈ ℕ , then T * satisfies Unanimity.
• If T n satisfies Groundedness for each n ∈ ℕ , then T * satisfies Groundedness.
For Monotonicity, we require the following (stronger) Improvement property to hold for each T n . In this case we write δ N,N ′ = min g≠f (Δ(f ) − Δ(g)) > 0.

Here Δ(g) is the amount by which the belief score for g increases when going from the network N to N ′ . Improvement simply says that when adding a new source to a fact f, it is f that sees the largest increase.

The requirement that inf n∈ℕ δ n N,N ′ > 0 is a technical condition which ensures the strict inequality g ≺ T * N ′ f holds in the limit, as required for Monotonicity. If this condition fails, T * still satisfies a natural 'weak Monotonicity' axiom, in which the strict inequality is replaced by a weak one.

Sums
We come to the axiomatic analysis of Sums. Coherence and the simpler axioms are satisfied here, and the undesirable independence axioms (POI and Strong Independence) are not. However, Monotonicity and PCI do not hold. Since PCI is one of our most important axioms, which we expect any reasonable operator to satisfy, this potentially limits the usefulness of Sums in practice.

Proof (sketch) Symmetry, Unanimity and Groundedness can be easily shown from Lemmas 5.4 and 5.5; the details can be found in the appendix. In the remainder of the proof, (T n ) n∈ℕ will denote the iterative operator Sums, T * will denote the limit operator, and U = ν ◦ U Sums will denote the update function for Sums.

Coherence. We will show Source-Coherence using Lemma 5.1. The argument for Fact-Coherence is similar (using Lemma 5.3) and can be found in the appendix.
Suppose N ∈ N , T ∈ T Num , ε, δ > 0 , and N (s 1 ) is (ε, δ)-less believable than N (s 2 ) with respect to N and T under a bijection π ∶ N (s 1 ) → N (s 2 ) . By the remark after the proof of Lemma 5.1, we may assume without loss of generality that ε < 1∕|F| . Recall that the update function for Sums is U = ν ◦ U Sums . Note at this stage that it is possible to further weaken the hypotheses of Lemma 5.1: the result follows if U has the stated property not for all operators T, but only for those such that T = T n for some n ∈ ℕ . Next, note that if T ′ N (x) = 0 for all x ∈ S then trust and belief scores are 0 in all subsequent iterations, and thus all sources rank equally in the limit T * . But this means the hypothesis for Source-Coherence cannot be satisfied (there are no strict inequalities). We may therefore assume without loss of generality that T ′ N (x) ≠ 0 for at least one x ∈ S .

Applying the definition of U Sums and the definition of ν , and using the pairing of N (s 1 ) and N (s 2 ) via π , we obtain a bound on the difference between the trust scores of s 1 and s 2 . To complete the proof, we need to find a lower bound that is independent of T and N. It is here that we use the assumption that T = T n for some n ∈ ℕ . Since T n N (x) ∈ [0, 1] for any n ∈ ℕ and x ∈ S , the required bound follows. Taking C = 1 and D = 1∕|F| , the hypotheses of Lemma 5.1 are satisfied; thus Sums satisfies Source-Coherence.
POI, Strong Independence, PCI and Monotonicity. The remaining axioms are handled by counterexamples derived from the network shown in Fig. 2. It can be shown that, if N denotes this network, we have T * N (f ) = T * N (g) = 0 , so f ≈ T * N g . Let N ′ denote the network whose claims are just those of the top connected component. Then it can be shown that T * N ′ (f ) = 1 and T * N ′ (g) = 0 , i.e. g ≺ T * N ′ f . However, it is easily verified that our three independence axioms, if satisfied, would each imply f ⪯ T * N g iff f ⪯ T * N ′ g . Therefore none of POI, Strong Independence and PCI can hold for Sums.
For Monotonicity, consider the network N ′′ obtained from N by removing the edge (u, g). Then we still have T * N ′′ (f ) = T * N ′′ (g) = 0 , and in particular f ⪯ T * N ′′ g . Returning to N amounts to adding extra support for the fact g. Monotonicity would give f ≺ T * N g here, but this is clearly false. Hence Monotonicity is not satisfied by Sums. ◻

The key to the counterexamples derived from Fig. 2 in the above proof lies in the lower connected component, which, restricted to S ∪ F , is a connected bipartite graph. That is, each source x i claims all facts in the component, and each fact y j is claimed by all sources in the component. Moreover, sources elsewhere in the network claim fewer facts than the x i , and facts elsewhere are claimed by fewer sources than the y j .
Since Sums assigns scores by a simple sum, this results in the scores for the x i and y j dominating those of the other sources and facts. The normalisation step then divides these scores by the (comparatively large) maximum. As the next result shows, under certain conditions this causes scores to decrease exponentially and become 0 in the limit. In particular, we can generate pathological examples such as Fig. 2, where a whole connected component receives scores of 0, which leads to failure of Monotonicity and the independence axioms.

Fig. 2 Network which yields counterexamples for POI, Strong Independence, PCI and Monotonicity for Sums
Then, with (T n ) n∈ℕ denoting Sums, for all n > 1 the trust and belief scores of the sources outside X and facts outside Y decrease exponentially. In particular, if T * denotes the limit of Sums then T * assigns them score 0.

Proof We proceed by induction. The result is easy to show in the base case n = 2 since | N (s)| ≤ (1∕2)| N (x)| for any x ∈ X and s ∉ X (and similarly for facts). Assume the result holds for some n > 1 . Write T ′ = U Sums (T n ) , so that T n+1 = ν(T ′ ) . If s ∉ X then the claimed bound follows from the inductive hypothesis. On the other hand, the fact that T n N (x) = T n N (y) = 1 for x ∈ X and y ∈ Y gives the corresponding scores for X and Y. Clearly the x ∈ X and y ∈ Y are the sources and facts with maximal trust and belief scores, respectively. This means that after normalisation via ν , T n+1 N (x) = T n+1 N (y) = 1 , and the scores for s ∉ X and f ∉ Y satisfy the claimed bound. This shows that the claim holds for n + 1 ; by induction, the proof is complete. ◻
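The exponential decay just established is easy to reproduce numerically. The sketch below implements one plausible reading of the Sums update (trust as the sum of claimed facts' beliefs, belief as the sum of the new trust scores, then max-normalisation of both) and runs it on a hypothetical two-component network in which a dense component {x1, x2} × {y1, y2} dominates a lone claim (s, f); the uniform prior of 1 is also an assumption of the sketch.

```python
def sums(claims, iterations):
    """One possible reading of the Sums operator: trust(s) = sum of the
    beliefs of s's facts, belief(f) = sum of the (new) trust of f's
    sources, with both score vectors normalised by their maximum."""
    sources = {s for s, _ in claims}
    facts = {f for _, f in claims}
    belief = {f: 1.0 for f in facts}  # assumed uniform prior
    for _ in range(iterations):
        trust = {s: sum(belief[f] for s2, f in claims if s2 == s) for s in sources}
        belief = {f: sum(trust[s] for s, f2 in claims if f2 == f) for f in facts}
        tmax, bmax = max(trust.values()), max(belief.values())
        trust = {s: v / tmax for s, v in trust.items()}
        belief = {f: v / bmax for f, v in belief.items()}
    return trust, belief

# Dense component {x1, x2} x {y1, y2} versus an isolated claim (s, f):
claims = {("x1", "y1"), ("x1", "y2"), ("x2", "y1"), ("x2", "y2"), ("s", "f")}
trust, belief = sums(claims, 20)
```

After 20 iterations the isolated component's scores are numerically indistinguishable from 0 while the dense component stays at 1, mirroring how a whole connected component can be trivialised by normalisation against another.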

Modifying Voting and Sums
So far we have seen that neither of the basic operators Voting and Sums is completely satisfactory with respect to the axioms of Sect. 4. Armed with the knowledge of how and why certain axioms fail, one may wonder whether it is possible to modify the operators accordingly so that the axioms are satisfied. Presently we shall show that this is partially possible for both Voting and Sums.

Voting
A core problem with Voting is that it fails Coherence. Indeed, all sources are ranked equally regardless of the 'votes' for facts, so in some sense it is obvious that the source ranking does not cohere with the fact ranking. 8 An easy improvement is to explicitly construct the source ranking to guarantee Source-Coherence.

Definition 5.3
For a network N, define a binary relation ⊲ N on S by s 1 ⊲ N s 2 iff N (s 1 ) is less believable than N (s 2 ) with respect to Voting. The numerical operator SC-Voting (Source-Coherence Voting) ranks facts exactly as Voting does, and assigns each source s the trust score |W N (s)| , where W N (s) = {t ∈ S ∶ t ⊲ N s} .

It can be seen that SC-Voting satisfies Source-Coherence, which is a significant improvement over regular Voting. Since ⊲ N relies on 'global' properties of N, however, this comes at the expense of Source-PCI. Satisfaction of the other axioms is inherited from Voting.
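For Voting specifically, deciding whether one source's claim set is 'less believable' than another's reduces to comparing sorted vote counts: a bijection pairing each fact with a weakly more-voted fact (with one strict increase) exists exactly when the sorted count sequences dominate pointwise. A sketch of SC-Voting's source scores along these lines, with our own function names and the usual assumed claim-pair representation:

```python
def votes(claims):
    """Voting belief score: number of sources claiming each fact."""
    v = {}
    for _, f in claims:
        v[f] = v.get(f, 0) + 1
    return v

def less_believable(fs1, fs2, v):
    """Is there a bijection pairing each fact of fs1 with a weakly
    more-voted fact of fs2, with at least one strict increase?  For
    Voting this is pointwise dominance of the sorted vote counts."""
    a = sorted(v[f] for f in fs1)
    b = sorted(v[f] for f in fs2)
    return (len(a) == len(b)
            and all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def sc_voting_trust(claims):
    """SC-Voting source scores: |W_N(s)|, the number of sources strictly
    below s under the relation above."""
    v = votes(claims)
    facts_of = {}
    for s, f in claims:
        facts_of.setdefault(s, set()).add(f)
    return {s: sum(1 for t in facts_of
                   if less_believable(facts_of[t], facts_of[s], v))
            for s in facts_of}
```

For example, if s1 and s2 both claim a two-vote fact while s3 claims a one-vote fact, then s3 sits strictly below both of the others and scores 0.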
For irreflexivity, suppose for contradiction that s ⊲ N s for some s ∈ S , i.e. F = N (s) is less believable than itself under some bijection π ∶ F → F . Then f ⪯ T N π(f ) for each f ∈ F , and there is f such that f ≺ T N π(f ) . Iterating applications of π , we get f ≺ T N π n (f ) (5.1) for each n ≥ 1 , where π n is the n-th iterate of π and T denotes Voting.
Since F is finite, the sequence π(f ), π(π(f )), … must repeat at some point, i.e. there is i < j such that π i (f ) = π j (f ) . But then injectivity of π implies that f = π j−i (f ) . Taking n = j − i in Eq. (5.1) we get f ≺ T N f : a contradiction. ◻

Proof (Theorem 5.3, sketch) Note that SC-Voting inherits Unanimity, Groundedness, Monotonicity, Fact-PCI, POI and Strong Independence from Voting, since these axioms only refer to the rankings of facts (which are the same for SC-Voting as for Voting).
We take the remaining axioms in turn. To simplify notation, write T SCV for SC-Voting.

Source-Coherence. Suppose N (s 1 ) is less believable than N (s 2 ) with respect to T SCV . We need to show s 1 ⊏ T SCV N s 2 . Note that since the fact ranking for T SCV coincides with Voting, we have s 1 ⊲ N s 2 . Transitivity of ⊲ N gives W N (s 1 ) ⊆ W N (s 2 ) . Moreover, s 1 ∈ W N (s 2 ) but, by irreflexivity, s 1 ∉ W N (s 1 ) ; hence |W N (s 1 )| < |W N (s 2 )| and s 1 ⊏ T SCV N s 2 .

Symmetry. Since the fact ranking of T SCV is the same as Voting, which satisfies Symmetry, we only need to show that s 1 ⊑ T SCV N s 2 iff σ(s 1 ) ⊑ T SCV σ(N) σ(s 2 ) for all equivalent networks N, σ(N) and s 1 , s 2 ∈ S . It can be shown, and we do so in the appendix, that the Symmetry of Voting implies a symmetry property for ⊲ N and ⊲ σ(N) : we have s 1 ⊲ N s 2 iff σ(s 1 ) ⊲ σ(N) σ(s 2 ) . Consequently, t ∈ W N (s i ) iff σ(t) ∈ W σ(N) (σ(s i )) ; in particular, |W N (s i )| = |W σ(N) (σ(s i ))| . This gives the required equivalence for Symmetry.
Fact-Coherence. Consider the network shown in Fig. 3. We have f ≈ g ≈ i ≺ h . Source-Coherence between s and t gives t ⊏ s . If Fact-Coherence held we would then get g ≺ f , but this is not the case.
Source-PCI. Let N 1 denote the top connected component of the network shown in Fig. 4, and let N 2 denote the network as a whole. The fact ranking is the same in both networks. In N 2 the sources t and u can be compared for Source-Coherence, and as a result the ranking of s relative to t differs between N 1 and N 2 . This contradicts Source-PCI, which requires the ranking of s and t to be the same in both networks. ◻

Note that the idea behind SC-Voting can be generalised beyond Voting: it is possible to define ⊲ N in terms of any operator T, and to construct a new operator T ′ whose source ranking is given according to ⊲ N as above, and whose fact ranking coincides with that of T. This ensures T ′ satisfies Source-Coherence whilst keeping the existing fact ranking from T.
Moreover, we can go in the other direction and ensure Fact-Coherence whilst retaining the source ranking of T by defining a relation ◀ N on F in an analogous manner to ⊲ N , and proceeding similarly.

Sums
Our main concern with Sums is the failure of PCI and Monotonicity. We have seen that this is in some sense caused by the normalisation step: in Fig. 2 the scores of f, g etc. go to 0 in the limit after dividing by the 'global' maximum score across the network, which happens to come from a different connected component.
A natural fix for PCI is therefore to divide by the maximum score within each component. In this case the score for a source s depends only on the structure of the connected component in which it lies, which is exactly what is required for PCI.
However, this approach does not negate the undesirable effects of Proposition 5.2. Indeed, suppose the network in Fig. 2 were modified so that fact y 1 is associated with object o instead of p 1 . Clearly Proposition 5.2 still applies after this change, and all sources and facts shown now belong to the same connected component. Therefore the 'per-component Sums' operator gives the same result as Sums itself, and in particular our Monotonicity counterexample still applies. Perhaps even worse, one can show that Coherence fails for this operator. We consider the loss of Coherence too high a price to pay for PCI.
Instead, let us take a step back and consider whether normalisation is truly necessary. On the one hand, without normalisation the trust and belief scores are unbounded and therefore do not converge. On the other, we are not interested in the numeric scores for their own sake, but rather for the rankings that they induce. It may be that whilst the scores diverge without normalisation, the induced rankings converge to a fixed ranking, which we may take as the 'ordinal limit'. This is in fact the case. We call this new operator UnboundedSums.

Definition 5.4 UnboundedSums is the recursive operator (T prior , U Sums ) , i.e. Sums without the normalisation step ν .
Theorem 5.4 UnboundedSums is ordinally convergent in the following sense: there is an ordinal operator T * such that for each network N there exists J N ∈ ℕ such that T n N (s 1 ) ≤ T n N (s 2 ) iff s 1 ⊑ T * N s 2 for all n ≥ J N and s 1 , s 2 ∈ S (and similarly for facts).
That is, the rankings induced by T n remain constant after J N iterations, and are identical to those of T * .
Proof The proof will use some results from linear algebra, so we will work with a matrix and vector representation of UnboundedSums. Fix an enumeration S = {s 1 , … , s k } of S and F = {f 1 , … , f l } of F . Write M for the k × l matrix given by M ij = 1 if s i claims f j , and M ij = 0 otherwise. We also write v n and w n for the vectors of trust and belief scores of UnboundedSums at iteration n; that is, [v n ] i = T n N (s i ) and [w n ] j = T n N (f j ) , where (T n ) n∈ℕ denotes UnboundedSums.
Multiplication by M encodes the update step of UnboundedSums: it is easily shown that v n+1 = Mw n and w n+1 = M ⊤ v n+1 . Writing A = MM ⊤ ∈ ℝ k×k , we have v n+1 = Av n , and therefore v n+1 = A n v 1 .
To show that the rankings of UnboundedSums remain constant after finitely many iterations, we will show that for each s p , s q ∈ S there is J pq ∈ ℕ such that sign ([v n ] p − [v n ] q ) is constant for all n ≥ J pq . Since [v n ] p and [v n ] q are the trust scores of s p and s q respectively in the n-th iteration, this will show that the ranking of s p and s q remains the same after J pq iterations. Since there are only finitely many pairs of sources, we may then take J N as the maximum value of J pq over all pairs (p, q), and the entire source ranking ⊑ T n N of Unbound-edSums remains constant for n ≥ J N . An almost identical argument can be carried out for the fact ranking, and these together will prove the result.
So, fix s p , s q ∈ S . Write δ n = [v n ] p − [v n ] q . First note that A = MM ⊤ is symmetric, so the spectral theorem gives the existence of k orthonormal eigenvectors x 1 , … , x k for A [5, Theorem 7.29]. Let λ 1 , … , λ k be the corresponding eigenvalues. Form a (k × k)-matrix P whose i-th column is x i , and let D = diag(λ 1 , … , λ k ) . Then A can be diagonalised as A = PDP −1 . It follows that for any n ∈ ℕ , A n = PD n P −1 .
Now, since x 1 , … , x k are orthonormal, P is an orthogonal matrix, i.e. P ⊤ = P −1 . Hence A n = PD n P ⊤ . Expanding δ n+1 = [A n v 1 ] p − [A n v 1 ] q in this eigenbasis, we obtain an explicit formula: δ n+1 = ∑ i λ i n r i , where r i = ([x i ] p − [x i ] q )⟨x i , v 1 ⟩ (5.2). Note that r i does not depend on n. Now, it is easy to see that A = MM ⊤ is positive semi-definite, which means its eigenvalues λ 1 , … , λ k are all non-negative. We re-index the sum in Eq. (5.2) by grouping together the λ i which are equal, to get δ n+1 = ∑ t R t μ t n , where the sum runs over t = 1, … , K with K ≤ k , each R t is a sum of the r i (whose corresponding λ i are equal), and the μ t are distinct and non-negative. Assume without loss of generality that μ 1 > μ 2 > ⋯ > μ K ≥ 0 . If R t = 0 for all t, then clearly sign (δ n+1 ) = sign (0) = 0 which is constant, so we are done. Otherwise, let T be the minimal t such that R t ≠ 0 . We may also assume μ T > 0 (otherwise we necessarily have μ T = 0 , T = K and sign (δ n+1 ) = sign (R T ⋅ 0 n ) which is again constant 0). Observe that δ n+1 = μ T n (R T + ∑ t>T R t (μ t ∕μ T ) n ). By our assumption on the ordering of the μ t , we have μ t < μ T in the sum. Consequently |μ t ∕μ T | < 1 , and (μ t ∕μ T ) n → 0 as n → ∞ . This means the term in brackets converges to R T . Since this limit is non-zero, there is J pq ∈ ℕ such that the sign of the term in brackets is equal to sign (R T ) ∈ {1, −1} for all n ≥ J pq . Finally, for such n we have sign (δ n+1 ) = sign (R T ) , i.e. sign (δ n ) is constant for n ≥ J pq + 1 . This completes the proof. 10 ◻

In light of Theorem 5.4, we may consider UnboundedSums itself as an ordinal operator T * , where s ⊑ T * N t iff s ⊑ T J N N t for each network N (and similarly for the fact ranking). Moreover, with the normalisation problems aside, UnboundedSums provides an example of a principled operator satisfying our two key axioms: Coherence and PCI. In particular, we escape the undesirable behaviour of Sums in Fig. 2; whereas Sums trivialises the ranking of sources and facts in the upper connected component, UnboundedSums allows meaningful ranking (e.g. we have g ≺ f ). In particular, the counterexample for Monotonicity for Sums is no longer a counterexample for UnboundedSums.
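The ordinal convergence guaranteed by Theorem 5.4 can be observed directly by iterating the unnormalised update and recording only the induced fact ordering. Below is a small sketch on a Fig. 5-style network (s claims f; t claims g and h), using our assumed claim-pair representation; the scores grow without bound, but the ranking freezes immediately.

```python
def unbounded_sums(claims, iterations):
    """UnboundedSums: the Sums update with the normalisation step removed.
    Scores diverge, but the induced ranking eventually stabilises."""
    sources = {s for s, _ in claims}
    facts = {f for _, f in claims}
    belief = {f: 1 for f in facts}  # assumed uniform (integer) prior
    rankings = []
    for _ in range(iterations):
        trust = {s: sum(belief[f] for s2, f in claims if s2 == s) for s in sources}
        belief = {f: sum(trust[s] for s, f2 in claims if f2 == f) for f in facts}
        # record only the induced fact ordering, not the diverging scores
        rankings.append(sorted(facts, key=lambda f: (belief[f], f)))
    return belief, rankings

# Fig. 5-style network: s claims f; t claims g and h.
belief, rankings = unbounded_sums({("s", "f"), ("t", "g"), ("t", "h")}, 10)
```

Here f keeps score 1 while g's score doubles each round, so the ordering with f at the bottom is already the ordinal limit after the first iteration.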
We conjecture that UnboundedSums also satisfies Monotonicity, but this remains to be proven. For example, we have experimentally verified that UnboundedSums satisfies all the specific instances of Monotonicity with the starting network N as in Fig. 1.

Proof (sketch)
The proof that UnboundedSums satisfies Symmetry, PCI, Unanimity and Groundedness is routine, and we leave the details to the appendix. In what follows, let (T n ) n∈ℕ denote UnboundedSums, T * denote the ordinal limit of UnboundedSums, and for a network N let J N be as in Theorem 5.4. Then the rankings in N induced by T n for n ≥ J N are the same as T * .
Coherence. First we show Source-Coherence. Let N be a network and suppose N (s 1 ) is less believable than N (s 2 ) with respect to N and T * . Let π and f be as in the definition of less believable.

10 The argument which shows that the difference between fact belief scores is also eventually constant in sign is almost identical: write B = M ⊤ M , and observe that w n+1 = B n w 1 . Since B is also symmetric and positive semi-definite, the proof goes through as above.
i.e. T n+1 N gives the same ranking as T n N , and therefore the same ranking as T * , so we get s 1 ⊏ T * N s 2 as required. For Fact-Coherence, suppose N (f 1 ) is less trustworthy than N (f 2 ) with respect to N and T * . Again, let n ≥ J N and π , ŝ be as in the definition of less trustworthy. Recall that belief scores for facts in T n N are obtained from trust scores in T n N ; applying the same argument as above we get T n N (f 1 ) < T n N (f 2 ) and consequently f 1 ≺ T * N f 2 as required. Hence T * satisfies Coherence.
POI and Strong Independence. To show POI and Strong Independence are not satisfied, consider the network N shown in Fig. 5. It can be seen (e.g. by induction) that T n N (f ) = 1 and T n N (g) = 2 n−1 for all n ∈ ℕ . Consequently f ≺ T * N g . 11 Now let N ′ be the network in which the claim (t, h) is removed. Since N (f ) = N ′ (f ) = {s} and N (g) = N ′ (g) = {t} , both POI and Strong Independence imply f ⪯ T * N g iff f ⪯ T * N ′ g . Therefore, assuming either POI or Strong Independence, we get f ≺ T * N ′ g . However, it is also clear that T n N ′ (f ) = T n N ′ (g) for all n ∈ ℕ , so f ≈ T * N ′ g . This is a contradiction, so neither POI nor Strong Independence is satisfied. ◻

Fig. 5 Counterexample for POI and Strong Independence for UnboundedSums

To summarise this section, Table 1 shows which axioms are satisfied by each of the operators.

Variable domain truth discovery
Like Symmetry, Isomorphism simply says that operators only care about the structure of the network, not the particular 'names' chosen for sources, facts and objects. Symmetry is obtained as the special case where N and N ′ are equivalent when seen as networks in a common domain D . All the operators of Sects. 3 and 5.4 satisfy Isomorphism.
Axiom (Eventual Monotonicity) Let N be a network in a domain (S, F, O) . Then for all f , g ∈ F , f ≠ g , there is a finite, non-empty set S ′ ⊆ 𝒮 with S ∩ S ′ = ∅ such that g ≺ T N+(S ′ ,f ) f .

This axiom says that, given any pair of distinct facts f, g, it is possible to add enough new claims for f to make f strictly more believable than g. Intuitively, this is less demanding than Monotonicity, which requires that f can become strictly more believable than g with the addition of just one additional claim. Note that Eventual Monotonicity is not possible to state in the fixed domain case (e.g. consider N already containing claims from all the available sources in S ).
When paired with Isomorphism, Eventual Monotonicity takes on a form similar to postulates for Improvement and Majority operators in belief merging [22,23]: there is a threshold n ∈ ℕ such that f becomes strictly more believable than g after n new claims are added for f. That is, the identities of the new sources S ′ are irrelevant; it is just the size of S ′ that matters. We require a preliminary lemma.

Lemma 6.1 Suppose a variable domain operator T satisfies Isomorphism. Let D = (S, F, O) be a domain, N a network in D and f ∈ F . Then for all non-empty, finite sets S ′ 1 , S ′ 2 ⊆ 𝒮 disjoint from S with |S ′ 1 | = |S ′ 2 | , the rankings ⪯ T N+(S ′ 1 ,f ) and ⪯ T N+(S ′ 2 ,f ) coincide.

Proof (sketch) Define a mapping σ from D 1 to D 2 which fixes everything except the new sources, sending S ′ 1 to S ′ 2 . Then it is easily verified that σ is an isomorphism.

◻
Note that since 𝒮 is infinite and domains are finite, for any n ∈ ℕ and any domain D = (S, F, O) there is always some S ′ ⊆ 𝒮 , disjoint from S , with |S ′ | = n . For operators T satisfying Isomorphism, write ⪯ T N+(n×f ) for ⪯ T N+(S ′ ,f ) ; Lemma 6.1 guarantees this is well-defined (i.e. does not depend on the particular choice of S ′ ). That is, ⪯ T N+(n×f ) is the fact ranking resulting from adding n new claims for f from fresh sources. We obtain an equivalent characterisation of Eventual Monotonicity, whose proof is almost immediate given Lemma 6.1.
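The N + (n × f) construction can be illustrated with Voting, which ranks facts purely by vote counts: adding enough fresh claims for f lets it overtake any g. The sketch below uses our assumed claim-pair representation; the helper names and the "new0, new1, …" source names are hypothetical.

```python
def add_fresh_claims(claims, fact, n, existing_sources):
    """Claims of N + (n x fact): n brand-new sources, disjoint from the
    existing ones, each claiming `fact`."""
    fresh, i = [], 0
    while len(fresh) < n:
        name = f"new{i}"
        if name not in existing_sources:
            fresh.append(name)
        i += 1
    return claims | {(s, fact) for s in fresh}

def votes(claims, fact):
    """Voting belief score: number of sources claiming `fact`."""
    return sum(1 for _, f in claims if f == fact)

# f starts with 1 vote against g's 3; three fresh claims suffice:
claims = {("s1", "g"), ("s2", "g"), ("s3", "g"), ("s4", "f")}
bigger = add_fresh_claims(claims, "f", 3, {s for s, _ in claims})
```

Since only the number of fresh sources matters for Voting, this also illustrates the threshold reading of Eventual Monotonicity under Isomorphism.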
for h ≠ f , and thus g ≺ T N+(n×f ) f , where T denotes UnboundedSums.

To conclude this section, we show that the impossibility result of Theorem 4.2 holds in the variable domain case if one replaces Monotonicity with Eventual Monotonicity and Symmetry with Isomorphism.

Theorem 6.2 There is no variable domain operator satisfying Coherence, Isomorphism, Eventual Monotonicity and POI.
Proof For contradiction, suppose T is an operator satisfying the stated axioms. Let N be the network from Fig. 1, viewed as a network in the domain ({s, t, u, v}, {f , g, h, i}, {o, p}) . Applying Eventual Monotonicity with i and h, we have that there is a network N ′ = N + (S ′ , i) , for some fresh set of sources S ′ , in which h and i are strictly ranked. Since N ′ only adds claims for p-facts, POI applied to object o and Isomorphism give f ≈ T N ′ g (e.g. consider the isomorphism which simply swaps s with t and f with g). From Source-Coherence we get t ⊏ T N ′ s . But N ′ (f ) = {s} and N ′ (g) = {t} , so Fact-Coherence gives g ≺ T N ′ f : contradiction! ◻

Discussion
In this section we discuss the axioms and their limitations. First, the version of Monotonicity we consider is a strict one: a new claim for f gives f a strictly positive boost in the fact believability ranking. This is also the case for Eventual Monotonicity in the variable domain case, where we require that some number of new claims make f strictly more believable than any other fact g. As noted in Sect. 4.3, this assumes there is no collusion between sources. Indeed, suppose sources s 1 , s 2 are in collusion. For example, s 2 may agree to unconditionally back up all claims made by s 1 . In this case a claim of f from s 1 alone should carry just as much weight as the claim from both s 1 and s 2 . However, Monotonicity requires that f becomes strictly more believable when moving to the latter case. A natural solution is to simply relax the strictness requirement. We obtain the following weak version of Monotonicity.

Axiom 11 (Weak Monotonicity)
Let N, s, f , N ′ be as in the statement of Monotonicity. Then for all g ∈ F , g ⪯ T N f implies g ⪯ T N ′ f .
Weak Monotonicity only says that extra support for a fact f does not make f less believable. Clearly Monotonicity implies Weak Monotonicity, but not vice versa. In the collusion example, an operator may choose to leave the fact ranking unchanged when a new report of f from s 2 arrives; this is inconsistent with Monotonicity but consistent with Weak Monotonicity. The weak analogue of Eventual Monotonicity can be defined in the same way.
In the same spirit, one could consider versions of Coherence using only weak comparisons. Say N (s 1 ) is weakly less believable than N (s 2 ) iff the condition in Definition 4.1 holds, but without the requirement that some f ∈ N (s 1 ) is strictly less believable than its counterpart π(f ) in N (s 2 ) , and define N (f 1 ) weakly less trustworthy than N (f 2 ) in a similar way. The weak analogue of Coherence is as follows.

Axiom 12 (Weak Coherence) For any network N, N (s 1 ) weakly less believable than N (s 2 ) implies s 1 ⊑ T N s 2 , and N (f 1 ) weakly less trustworthy than N (f 2 ) implies f 1 ⪯ T N f 2 .

Note that Coherence does not imply Weak Coherence. This is because Weak Coherence relaxes both the consequent and the antecedent in the implications in the statement of the axiom. Whereas Coherence imposes no constraint when N (s 1 ) is only weakly less believable than N (s 2 ) , Weak Coherence requires s 1 ⊑ T N s 2 . Consequently, the 'weakness' of Weak Coherence refers to its use of weak comparisons between sources and facts, not its logical strength in relation to Coherence.
A natural question now arises. Does the impossibility result of Theorem 4.2 still hold with these new axioms?
We have an answer in the negative: the 'flat' operator, which ranks all sources and facts equally in all networks, satisfies all the axioms of the would-be impossibility.

Proof Coherence holds vacuously, since for the flat operator we can never have N (s 1 ) less believable than N (s 2 ) or N (f 1 ) less trustworthy than N (f 2 ) . Since any weak ranking holds for T, the other axioms are straightforward to see. ◻

This shows that (strict) Monotonicity is required for the impossibility result, since the result is no longer true when relaxing to Weak Monotonicity.
We now consider the new axioms in relation to the operators. First, Weak Coherence.

Proposition 7.2 Voting, Sums and UnboundedSums satisfy Weak Coherence.
Proof (sketch) Voting. Since s 1 ⊑ T Voting N s 2 always holds, Weak Source-Coherence clearly holds. For Weak Fact-Coherence, suppose N (f 1 ) is weakly less trustworthy than N (f 2 ) . Then there is a bijection between N (f 1 ) and N (f 2 ) ; in particular | N (f 1 )| = | N (f 2 )| , so f 1 and f 2 receive the same number of votes and f 1 ⪯ T N f 2 .

Sums. First, one may adapt Definition 5.1 to a numerical variant of a set of facts Y being weakly less believable than Y ′ , by dropping all references to δ . We then have an analogue of Lemma 5.1, and Weak Coherence for Sums follows by an argument similar to the one used to show Coherence using Lemma 5.1.
UnboundedSums The proof that UnboundedSums satisfies Coherence can be adapted in a straightforward way to show Weak Coherence. ◻ Proposition 7.2 indicates that Weak Coherence may in fact be too weak to capture the original intuition behind Coherence-namely, that there should be a mutual dependence between trustworthy sources and believable facts-since it does not even rule out Voting. Instead, Weak Coherence can be seen as a simple requirement which only rules out undesirable behaviour, and complements (strict) Coherence.
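For concreteness, the counting behaviour of Voting underlying this argument can be sketched in a few lines. This is an illustrative sketch only: the function name voting_fact_rank and the encoding of a network as (source, fact) pairs are our own assumptions, not notation from the paper.

```python
# Sketch of the baseline Voting operator: every source is equally trusted,
# and facts are ranked by the number of sources claiming them.
def voting_fact_rank(claims):
    """claims: set of (source, fact) pairs. Returns fact -> claim count;
    fact f1 ranks below f2 whenever its count is smaller."""
    counts = {}
    for _, fact in claims:
        counts[fact] = counts.get(fact, 0) + 1
    return counts

counts = voting_fact_rank({("s1", "f1"), ("s2", "f1"), ("s3", "f2")})
assert counts["f1"] > counts["f2"]
# A bijection between N(f1) and N(f2) forces equal counts, so Voting
# ties f1 and f2 -- the Weak Fact-Coherence argument sketched above.
```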
Since Monotonicity implies Weak Monotonicity, it is clear that Voting satisfies Weak Monotonicity. We conjecture that Weak Monotonicity also holds for Sums and UnboundedSums, but this remains to be proven. 12

Related work
In this section we discuss related work.
Ranking systems. Altman and Tennenholtz [2] initiated the axiomatic study of ranking systems. First we discuss their framework in relation to ours, and then turn to their axioms. In their framework, a ranking system F maps any (finite) directed graph G = (V, E) to a total preorder ≤ F G on the vertex set V. In their view this is a variation of the classical social choice setting, in which the set of voters and alternatives coincide. Nodes v ∈ V "vote" on their peers in V by a form of approval voting [26]: an edge v → u is interpreted as a vote for u from v. A ranking system then outputs a ranking of V, following the general intuition that the more "votes" v receives (i.e. the more incoming edges), the higher v should rank. As with the ranking of facts in truth discovery, this does not necessarily mean ranking nodes simply by the number of votes received, since the quality of the voters should also be taken into account. For example, a ranking system may prioritise nodes which receive few votes from highly ranked nodes over those with many votes from lower ranked nodes.
Note that our truth discovery networks N are themselves directed graphs on the vertex set S ∪ F ∪ O . However, naively applying a ranking system to N directly makes little sense: sources never receive any "votes", and the resulting ranking includes objects, which do not need to be ranked in our setting. Perhaps a more sensible approach is to consider the bipartite graph G N = (V N , E N ) associated with a network N, whose edge set consists of the claim edges from sources to facts together with the reversals of such edges. The edges in G N have an intuitive interpretation: a source votes for the facts which it claims are true, and a fact votes for the sources who vouch for it. Any ranking system F thus gives rise to a truth discovery operator, where s 1 ⊑ T N s 2 iff s 1 ≤ F G N s 2 , and similarly for facts.
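As a small illustration of this translation, the symmetric edge set of G N can be built directly from the claims of a network. This is a minimal sketch under our own encoding of a network as (source, fact) pairs; the names are not from the paper.

```python
# Sketch: the bipartite graph G_N associated with a network, whose edges
# are the source-to-fact claim edges together with their reversals.
def bipartite_edges(claims):
    """claims: set of (source, fact) pairs. Returns the edge set of G_N."""
    edges = set()
    for s, f in claims:
        edges.add((s, f))  # a source votes for the facts it claims
        edges.add((f, s))  # a fact votes for the sources vouching for it
    return edges

edges = bipartite_edges({("s1", "f1"), ("s2", "f1"), ("s2", "f2")})
assert ("f1", "s1") in edges and len(edges) == 6
```

Any ranking system applied to this graph then induces rankings of sources and facts as described above.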
However, some characteristic aspects of the truth discovery problem are lost in this translation to ranking systems. Notably, the objects play no role at all in G N . Sources and facts are also treated symmetrically, where they perhaps should not be. For example, a fact f receiving more claims than g is beneficial for f, all else being equal (see Monotonicity), but a source s claiming more facts than t does not tell us anything about the relative trustworthiness of s and t.
While other choices of G N may be possible to alleviate some of these problems, we believe truth discovery is sufficiently specialised, beyond general graph ranking, that a bespoke model is required to capture its nuances appropriately. Our framework provides this novel contribution.
In [2], Altman and Tennenholtz also introduce axioms for ranking systems. Their first set of axioms deals with the transitive effects of voting when the alternatives are the voters themselves. As mentioned in Sect. 4, these axioms provided the inspiration for Coherence. The core idea is that if the predecessors of a node v are weaker than those of u, then v should be ranked below u; if u additionally has more predecessors, v should rank strictly below. Coherence applies this idea both from sources to facts (Fact-Coherence) and from facts to sources (Source-Coherence). A notable difference is that we only consider the case where the number of sources for two facts (or the number of facts, for two sources) is the same. For example, a source claiming more facts does not receive the strict boost Transitivity would dictate. Under the mapping N ↦ G N described above, any ranking system satisfying Transitivity induces a truth discovery operator which satisfies Coherence.
The other axiom in [2] is an independence axiom, RIIA (ranked independence of irrelevant alternatives), which adapts the classical IIA axiom from social choice theory to the ranking system setting, although in a different manner from our independence axioms of Sect. 4.4. We describe the axiom in rough terms, deferring to the paper for the technical details. Suppose the relative ranking of u 1 's predecessors compared to u 2 's predecessors is the same as that of v 1 's compared to v 2 's. Then RIIA requires u 1 ≤ u 2 iff v 1 ≤ v 2 . Informally, "the relative ranking of two agents must only depend on the pairwise comparison of the ranks of their predecessors" [2]. While we do not have an analogous axiom, the idea can be adapted to truth discovery networks. Intuitively, such an axiom would state that the ranking of two facts depends only on comparisons between their corresponding sources (and similarly for the ranking of sources).
However, the main result of Altman and Tennenholtz is an impossibility: Transitivity is incompatible with RIIA. Moreover, the result remains true even when restricting to bipartite graphs, such as G N described above. Accordingly, we can expect a similar impossibility result to hold in the truth discovery setting between Coherence and any analogue of RIIA.
PageRank. PageRank [33] is a well-known algorithm for ranking web pages based on the hyperlink structure of the web, viewed as a directed graph. It has also been studied through the lens of social choice and characterised axiomatically [1,40]. 13 In [1] the authors propose several invariance axioms, each of which requires that the ranking of pages is not affected by a certain transformation of the web graph. For example, the axiom Self Edge says that adding a self-loop from a page a to itself does not change the relative ranking of other pages, and results in a strictly positive boost for a (cf. Monotonicity). However, if we identify a truth discovery network N with the graph G N as described above, most of the transformations involved do not respect the bipartite, symmetric structure of G N . That is, the transformed graph does not correspond to G N ′ for any network N ′ . Consequently, most of the PageRank axioms have no truth discovery counterpart in our setting. The only exception is Isomorphism, where the transformation in question is graph isomorphism; this axiom is analogous to our Symmetry and Isomorphism axioms. However, since PageRank is similar to the Hubs and Authorities [21] algorithm on which Sums is based (both use the link structure of the web to rank pages), we expect there may be additional axioms which can be expressed both for general graphs and truth discovery networks, and satisfied by both PageRank and Sums. We leave the task of finding such axioms to future work.
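To make the connection concrete, a Hubs-and-Authorities-style mutual update of the kind on which Sums is based can be sketched as follows. This is an illustrative toy, not the paper's exact definition of Sums; in particular, the rescaling by the maximum score is our own assumption.

```python
# Sketch of a Sums-like mutual update in the spirit of Hubs and Authorities:
# a source's trust is the sum of the beliefs of its claimed facts, a fact's
# belief is the sum of its sources' trust, with rescaling each round.
def sums_like(claims, rounds=20):
    sources = {s for s, _ in claims}
    facts = {f for _, f in claims}
    trust = {s: 0.5 for s in sources}
    belief = {f: 0.5 for f in facts}
    for _ in range(rounds):
        trust = {s: sum(belief[f] for s2, f in claims if s2 == s) for s in sources}
        belief = {f: sum(trust[s] for s, f2 in claims if f2 == f) for f in facts}
        m = max(max(trust.values()), max(belief.values()))  # rescale to avoid blow-up
        trust = {s: v / m for s, v in trust.items()}
        belief = {f: v / m for f, v in belief.items()}
    return trust, belief

# Two sources back f1 and one backs f2: f1 ends up believed at least as much.
trust, belief = sums_like({("s1", "f1"), ("s2", "f1"), ("s3", "f2")})
assert belief["f1"] >= belief["f2"]
```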

Conclusion
In this paper we formalised a mathematical framework for truth discovery. While a number of simplifying assumptions were made compared to the mainstream truth discovery literature, we were able to express several algorithms in the framework. This provided the setting for the axiomatic method of social choice to be applied. To our knowledge, this is the first such axiomatic treatment in this context.
It was possible to adapt many axioms from social choice theory and related areas. In particular, the Transitivity axiom studied in the context of ranking systems [2,36] took on new life in the form of Coherence, which we consider a core axiom for TD operators. We proceeded to establish the differences between our setting and classical social choice by considering independence axioms. This led to an impossibility result and an axiomatic characterisation of the baseline Voting method.
On the practical side, we analysed the existing TD algorithm Sums and found that, surprisingly, it fails PCI. This is a serious issue for Sums which has not been discussed in the literature to date, and its discovery here highlights the benefits of the axiomatic method. To resolve this, we suggested a modification to Sums, which we call UnboundedSums, for which PCI is satisfied. However, while UnboundedSums resolves axiomatic problems with Sums, it may introduce computational difficulties (since the numeric scores involved grow without bound). We leave further investigation of such issues to future work.
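The computational concern can be made concrete: without per-round rescaling, the raw scores of a Sums-style update grow without bound even when the induced rankings stabilise. The following is a toy sketch under our own simplified update, not the paper's exact definition of UnboundedSums.

```python
# Sketch: an unnormalised Sums-style update. Raw trust scores grow
# geometrically with the number of rounds, illustrating the numerical
# difficulty noted above; only the induced ranking need converge.
def unnormalised_trust(claims, rounds):
    sources = {s for s, _ in claims}
    facts = {f for _, f in claims}
    trust = {s: 1.0 for s in sources}
    for _ in range(rounds):
        belief = {f: sum(trust[s] for s, f2 in claims if f2 == f) for f in facts}
        trust = {s: sum(belief[f] for s2, f in claims if s2 == s) for s in sources}
    return trust

claims = {("s1", "f1"), ("s2", "f1")}
assert unnormalised_trust(claims, 5)["s1"] > unnormalised_trust(claims, 1)["s1"]
```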
A restriction of our analysis is that only one 'real-world' algorithm was considered. Further axiomatic analysis of algorithms provides a deeper understanding of how algorithms operate on an intuitive level, but is made difficult by the complexity of state-of-the-art truth discovery methods. New techniques for establishing the satisfaction (or otherwise) of axioms would be helpful in this regard.
There is also scope for extensions to our model of truth discovery in the framework itself. For example, even in the variable domain setting of Sect. 6, we make the somewhat simplistic assumption that there are only finitely many possible facts for sources to claim. This effectively means we can only consider categorical values; modelling an object whose domain is the set of real numbers, for example, is not straightforward in our framework. Next, our model does not account for any associations or constraints between objects, whereas in reality the belief in a fact for one object may strengthen or weaken our belief in other facts for related objects. These types of constraints or correlations have been studied both on the theoretical side (e.g. in judgment aggregation) and practical side in truth discovery [44].
The axioms can also be further refined to relax some of the simplifying assumptions we make regarding source attitudes; e.g. that they do not collude or attempt to manipulate. Most notably, Monotonicity should be weakened to account for such sources.
Finally, it may be argued that truth discovery as formulated in this paper risks simply finding consensus among sources, rather than the truth. To remedy this, the framework could be extended to model the possible states of the world and thus the ground truth (cf. [32]). Upon doing so one could investigate how well, and under what conditions, an operator is able to recover the truth from a TD network. Such truth-tracking methods have also been studied in judgment aggregation and belief fusion [16,20].

Proof of theorem 4.1
The following lemma is required before the proof.
Then for any operator T satisfying Symmetry, f 1 ≈ T N f 2 .

Proof Suppose N has the stated property and T satisfies Symmetry. Let π denote the associated permutation, and note that π = π −1 . Write N ′ = π(N) , with edge sets E and E ′ for N and N ′ respectively; the first part of the argument shows E ⊆ E ′ . Now for the reverse direction: we must show E ′ ⊆ E . Let (x, y) ∈ E ′ . By definition of a graph isomorphism, we have (π −1 (x), π −1 (y)) ∈ E . Using π −1 = π and the first part we get (π(x), π(y)) = (π −1 (x), π −1 (y)) ∈ E ⊆ E ′ . The definition of a graph isomorphism then yields (x, y) ∈ E , and so E ′ ⊆ E . Hence E = E ′ and N = N ′ .

To conclude the proof, we apply Symmetry of T to get f 1 ≈ T N f 2 as required. ◻
Proof Let N ′ be the network obtained from N by removing all claims for facts other than those for object o; that is, N ′ = (V, E ′ ) , where E ′ consists of exactly those edges of N involving facts associated with o. Note that the fact-object affiliations are the same in N ′ as in N, and the set of sources for each fact associated with o is the same. Therefore POI applies, and it is sufficient to establish the required ranking in N ′ . Removing k sources from f 2 to obtain a new network N ′′ , we can apply the lemma to get f 1 ≈ T N ′′ f 2 . We may then add these sources back to obtain N ′ again; k applications of Monotonicity then give f 1 ≺ T N ′ f 2 as required. To complete the proof, suppose for contradiction that the strict reverse comparison holds: then the argument above gives f 1 ≻ T N ′ f 2 , which is clearly a contradiction. Hence the proof is complete. ◻

Proof of theorem 4.3
The proof of this theorem is similar in spirit to that of Theorem 4.1. As in that case, a preliminary result is required first.

Lemma A.2 Let N be a network and f 1 , f 2 ∈ F . Suppose N has the properties stated above. Then for any operator T satisfying Symmetry, f 1 ≈ T N f 2 .

Proof Note that |Q 1 | = |Q 2 | , so there exists a bijection σ ∶ Q 1 → Q 2 . Define a permutation π as follows: π swaps f 1 and f 2 , swaps o 1 and o 2 , and swaps sources in Q 1 with their counterparts in Q 2 under σ . Note that π = π −1 . Write N ′ = π(N) . We claim that N ′ = N . Write E, E ′ for the edges in N and N ′ respectively. First we show that E ⊆ E ′ . For this, first suppose (s, f ) ∈ E for some s ∈ S , f ∈ F . By definition of E, either f = f 1 or f = f 2 ; these two cases cover all possibilities by hypothesis. We see that in both cases, (π(s), π(f )) ∈ E ′ , and the same argument applies to the remaining edges; hence E ⊆ E ′ . To complete the claim that N = N ′ we must show E ′ ⊆ E . This can be shown using the same argument used in Lemma A.1. Indeed, suppose (x, y) ∈ E ′ . Then by definition of a graph isomorphism, (π −1 (x), π −1 (y)) ∈ E . Using the fact that π = π −1 and E ⊆ E ′ we get (π(x), π(y)) ∈ E ′ , which then yields (x, y) ∈ E as required. Hence E = E ′ and N = N ′ .
Finally, using Symmetry of T we obtain f 1 ≈ T N f 2 , completing the proof of the lemma.

Now let N ′ be the network with edge set E ′ . Note that N ′ (f j ) = N (f j ) . By Strong Independence it is therefore sufficient to establish the required ranking in N ′ . Consider the network N ′′ in which k of the claims for f 2 are removed, and all other claims remain. By the lemma, f 1 ≈ T N ′′ f 2 . Applying Monotonicity k times we can produce N ′ from N ′′ and get f 1 ≺ T N ′ f 2 as desired. For the other implication, suppose f 1 ⪯ T N ′ f 2 and, for contradiction, that the strict reverse comparison holds; then the argument above gives f 1 ≻ T N ′ f 2 , a contradiction.

Now let s ∈ S . Clearly we have T N (s) = 1 = T π(N) (π(s)) . Hence T satisfies numerical Symmetry and therefore Symmetry.

Unanimity and Groundedness. Suppose N ∈ N and f ∈ F . If N (f ) = S then T N (f ) ≥ T N (g) for any g ∈ F , so g ⪯ T N f and Unanimity is satisfied. If instead N (f ) = ∅ , we have T N (f ) ≤ T N (g) for any g ∈ F , so f ⪯ T N g and Groundedness is satisfied.
Monotonicity. Let N, N ′ , s and f be as given in the statement of Monotonicity. It is clear that | N ′ (f )| = | N (f )| + 1 . Also, for any g ∈ F , g ≠ f , the set of sources for g in N ′ is the same as in N but with s possibly removed. Hence | N ′ (g)| ≤ | N (g)| . Therefore g ⪯ T N f implies | N ′ (g)| ≤ | N (g)| ≤ | N (f )| < | N ′ (f )| , and so g ≺ T N ′ f as required.

Independence axioms. Next we show Strong Independence, which implies POI. For s ∈ G ∩ S , we trivially have T N 1 (s) = 1 = T N 2 (s) . Hence numerical PCI is satisfied.
Coherence. The violation of Coherence follows from Theorem 4.2, since we have already shown that Symmetry, Monotonicity and POI are satisfied. ◻

Proof of lemma 5.2
Proof The first statement follows easily from the definition of the limit. We shall prove only the second one.
For the 'if' direction, suppose the condition holds for some δ > 0 . We need to show that D < 0 . Write d n = T n N (f 1 ) − T n N (f 2 ) so that D = lim n→∞ d n . Take ε = δ∕2 > 0 . Then for sufficiently large n we have d n ≤ −δ∕2 < 0 . Taking n → ∞ , we have D = lim n→∞ d n ≤ −δ∕2 < 0 as required.
For the 'only if' direction, suppose D < 0 . Let δ = −D and take ε = δ∕2 > 0 . By the definition of the limit there is K ∈ ℕ such that |d n − D| < ε for n ≥ K ; in particular, d n < ε + D = −δ∕2 as required. ◻

Proof of theorem 5.2
The following results will be helpful in simplifying the proof of Theorem 5.2.

Lemma A.3 The normalisation map applied at each step of Sums has the following properties.

1. It preserves numerical Symmetry, in the sense that its application to T satisfies numerical Symmetry whenever T does.

2. It leaves rankings unchanged, in the following sense: for T ∈ T Num and N ∈ N , the rankings induced on sources and facts are the same before and after its application.

Proof (Theorem 5.2) Throughout this proof, (T n ) n∈ℕ will denote the iterative operator Sums, T * will denote the limit operator, and U will denote the update function for Sums, i.e. U Sums followed by normalisation.
Coherence. Source-Coherence was shown in the body of the paper. The proof that Fact-Coherence is satisfied is similar, and uses Lemma 5.3. Suppose N ∈ N , T = T n for some n ∈ ℕ , ε, δ > 0 , and N (f 1 ) is (ε, δ)-less trustworthy than N (f 2 ) with respect to N and T under a bijection σ . Let ŝ ∈ N (f 1 ) be such that T N (ŝ) − T N (σ(ŝ)) ≤ −δ .
Write T ′ = U Sums (T) , so that the next iterate is obtained from T ′ by normalisation. We may assume without loss of generality that ε < 1∕|S| . Note that for s ∈ S , T N (s) = T ′ N (s) . Writing out the update for f 1 and f 2 and applying a similar argument as for showing Source-Coherence, we obtain a strict gap between the scores of f 1 and f 2 , which we now bound away from zero. Since we assume T = T n for some n ∈ ℕ , the scores T N (y) for y ∈ F are bounded above by 1. Next, we claim there is some fact f ∈ F with T N (f ) ≥ 1∕2 and N (f ) ≠ ∅ . Indeed, if T = T 1 = T fixed then take any fact with at least one associated source. 14 Otherwise, since we assume not all scores are 0 in the limit, there is some f with T N (f ) = 1 due to the application of the normalisation. Clearly N (f ) ≠ ∅ , since we would have T N (f ) = 0 otherwise. Let x ∈ N (f ) ; this provides the required bound.

Symmetry. As a consequence of Lemma 5.4, to show Symmetry it is sufficient to show that T fixed satisfies numerical Symmetry, and that U, the composition of normalisation with U Sums , preserves numerical Symmetry. Since T fixed is constant with value 1/2, it is clear that numerical Symmetry is satisfied. Moreover, Lemma A.3 part (i) already shows that normalisation preserves numerical Symmetry, so we only need to show that U Sums does.
Applying numerical Symmetry for T, we get T ′ π(N) (π(s)) = T ′ N (s) for all s ∈ S . Following the same tactic, one may also show that T ′ π(N) (π(f )) = T ′ N (f ) for all f ∈ F . Hence U Sums preserves numerical Symmetry, and we are done.
Unanimity and Groundedness. Unanimity and Groundedness can be proved together using Lemma 5.5 and Corollary A.1. By these results it is sufficient that T fixed satisfies Unanimity and Groundedness (this is trivial) and that U Sums preserves them. Suppose T satisfies Unanimity and Groundedness and write T ′ = U Sums (T) . Assume without loss of generality that T = T n for some n ∈ ℕ , so that T ′ N ≥ 0 . Suppose N ∈ N , f ∈ F and that N (f ) = S . Let g ∈ F . We must show that g ⪯ T ′ N f . We have T ′ N (g) ≤ T ′ N (f ) , i.e. g ⪯ T ′ N f as required for Unanimity. For Groundedness, suppose N (f ) = ∅ . We must show f ⪯ T ′ N g . Indeed, the sum in the expression for T ′ N (f ) is taken over the empty set, which by convention is 0. Since T ′ N (g) ≥ 0 , we are done. ◻

Proof of theorem 5.3
Proof Here we give only the technical details for the argument showing SC-Voting satisfies Symmetry, since the results for the other axioms were given in the main text.

Symmetry. Since Voting satisfies Symmetry, it is clear that f 1 ⪯ T SCV N f 2 iff π(f 1 ) ⪯ T SCV π(N) π(f 2 ) for any equivalent networks N and π(N) . We need to show that s 1 ⊑ T SCV N s 2 iff π(s 1 ) ⊑ T SCV π(N) π(s 2 ) . First we will show that ⊲ N and ⊲ π(N) have a similar symmetry property: s 1 ⊲ N s 2 iff π(s 1 ) ⊲ π(N) π(s 2 ) . Indeed, suppose s 1 ⊲ N s 2 . Then there is a bijection σ ∶ N (s 1 ) → N (s 2 ) with f ⪯ T SCV N σ(f ) for each f, and there is some f̂ with f̂ ≺ T SCV N σ(f̂ ) . It can be seen that π restricted to N (s i ) is a bijection into π(N) (π(s i )) ; let π 1 and π 2 denote these restrictions for i = 1, 2 respectively. Set τ = π 2 ∘ σ ∘ π 1 −1 , so that τ maps π(N) (π(s 1 )) into π(N) (π(s 2 )) . As a composition of bijections, τ is itself bijective. Let g ∈ π(N) (π(s 1 )) . Write f = π 1 −1 (g) ∈ N (s 1 ) . By the property of σ , we have f ⪯ T SCV N σ(f ) . By the symmetry property of the fact-ranking (which follows from Symmetry of Voting), we can apply π to the above to get π(f ) ⪯ T SCV π(N) π(σ(f )) . Since f ∈ N (s 1 ) and σ(f ) ∈ N (s 2 ) , we have π(f ) = π 1 (f ) and π(σ(f )) = π 2 (σ(f )) . Using this fact in the above inequality and recalling f = π 1 −1 (g) , we get g ⪯ T SCV π(N) τ(g) . Applying the same argument with ĝ = π 1 (f̂ ) , we get ĝ ≺ T SCV π(N) τ(ĝ) . This shows that π(N) (π(s 1 )) is less believable than π(N) (π(s 2 )) with respect to SC-Voting (whose fact-ranking coincides with Voting) in π(N) under τ . Hence π(s 1 ) ⊲ π(N) π(s 2 ) . ◻

Proof of theorem 5.5
Proof Here we show that UnboundedSums satisfies Symmetry, PCI, Unanimity and Groundedness, since the other axioms were dealt with in the main body of the paper. Throughout the proof, let (T n ) n∈ℕ denote UnboundedSums, T * denote the ordinal limit of UnboundedSums, and for a network N let J N be as in Theorem 5.4. Then the rankings in N induced by T n for n ≥ J N are the same as T * .
Symmetry. In the proof of Theorem 5.2, we saw that the update function U Sums preserves numerical Symmetry, in the sense that if T satisfies numerical Symmetry then U Sums (T) does also. Since it is clear that the prior operator for UnboundedSums satisfies numerical Symmetry, T n satisfies numerical Symmetry, and consequently Symmetry, for all n ∈ ℕ . Now, let N and π(N) be equivalent networks. Let J, J ′ ∈ ℕ be such that T * N and T * π(N) are given by T J N and T J ′ π(N) respectively, and take n ≥ max{J, J ′ } . For s 1 , s 2 ∈ S we have s 1 ⪯ T * N s 2 iff π(s 1 ) ⪯ T * π(N) π(s 2 ) by Symmetry of T n , as required for Symmetry. Using an identical argument, one can show that f 1 ⪯ T * N f 2 iff π(f 1 ) ⪯ T * π(N) π(f 2 ) . Hence T * satisfies Symmetry.

PCI. As with Symmetry, we will show that T n satisfies numerical PCI, and consequently PCI, for all n ∈ ℕ . Let N 1 , N 2 be networks with a common connected component G. Let s ∈ G ∩ S and f ∈ G ∩ F . Note that N 1 (s) = N 2 (s) and N 1 (f ) = N 2 (f ) , since by definition a source is connected to its facts and vice versa. For n = 1 we have T 1 N 1 (s) = T 1 N 2 (s) and T 1 N 1 (f ) = T 1 N 2 (f ) , so T 1 has numerical PCI. Supposing T n has numerical PCI for some n ∈ ℕ , the update equations give the same equalities for T n+1 . Hence, by induction, T n has numerical PCI for all n ∈ ℕ , and we are done.

Unanimity and Groundedness. For Unanimity, suppose N (f ) = S . For any g ∈ F and n ∈ ℕ we have T n N (g) ≤ T n N (f ) , so g ⪯ T n N f for all n ∈ ℕ . Since the ranking of T * corresponds to that of T n for large n, we have g ⪯ T * N f as required. For Groundedness, suppose N (f ) = ∅ . Then T n N (f ) = 0 for all n ∈ ℕ . For any g ∈ F , we have T n N (g) ≥ 0 = T n N (f ) . Consequently f ⪯ T n N g for all n ∈ ℕ . As above, this gives f ⪯ T * N g as required. ◻

Conflict of interest
The authors declare that they have no conflict of interest.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.