Abstract
In his article ‘The Conception of Probability as a Logical Relation’ [1] A.J. Ayer has criticized the logical interpretation of inductive probability on which J. M. Keynes [22], Rudolf Carnap [6], and many other writers on the philosophy of induction have based their conceptions of inductive inference. Ayer’s criticism is concerned with the principle of total evidence and with the obvious fact that it is often reasonable to collect new evidence when we are studying the credibility of some hypothesis. According to Ayer, it is impossible to understand this simple fact and justify the principle of total evidence, if the concept of inductive probability is interpreted in the way suggested by Keynes, Jeffreys, Carnap, and other proponents of the logical interpretation.
This study has been facilitated by a Finnish State Fellowship (Valtion apuraha nuorille tieteenharjoittajille).
Bibliography
[1] Ayer, Alfred J., 'The Conception of Probability as a Logical Relation', in Observation and Interpretation (ed. by S. Körner), Butterworth, London, 1958, pp. 12-17.
[2] Ayer, A. J., Böhm, D., et al., 'Discussion' (of Ayer [1]), in Observation and Interpretation (ed. by S. Körner), Butterworth, London, 1958, pp. 18-30.
[3] Bar-Hillel, Yehoshua, 'Semantic Information and Its Measures', in Language and Information (ed. by Y. Bar-Hillel), Addison-Wesley, Reading, Mass., 1964, pp. 298-312.
[4] Broad, C. D., 'Critical Notice on J. M. Keynes, A Treatise on Probability', Mind 31 (1922) 72-85.
[5] Carnap, Rudolf, 'The Two Concepts of Probability', Philosophy and Phenomenological Research 5 (1945) 513-532.
[6] Carnap, Rudolf, Logical Foundations of Probability, University of Chicago Press, Chicago, 1950.
[7] Carnap, Rudolf, The Continuum of Inductive Methods, University of Chicago Press, Chicago, 1952.
[8] Carnap, Rudolf, 'The Aim of Inductive Logic', in Logic, Methodology and Philosophy of Science (ed. by E. Nagel, P. Suppes, and A. Tarski), Stanford University Press, Stanford, 1962, pp. 303-318.
[9] Carnap, Rudolf, 'Inductive Logic and Intuition', in The Problem of Inductive Logic (ed. by I. Lakatos), North-Holland Publ. Comp., Amsterdam, 1968, pp. 257-267.
[10] Carnap, Rudolf and Bar-Hillel, Yehoshua, 'An Outline of the Theory of Semantic Information', in Language and Information (by Y. Bar-Hillel), Addison-Wesley, Reading, Mass., 1964, pp. 221-274.
[11] Good, I. J., 'On the Principle of Total Evidence', The British Journal for the Philosophy of Science 17 (1967) 319-321.
[12] Hempel, Carl G., 'Inductive Inconsistencies', Synthese 12 (1960) 439-469.
[13] Hempel, Carl G. and Oppenheim, Paul, 'Studies in the Logic of Explanation', Philosophy of Science 15 (1948) 131-175.
[14] Hilpinen, Risto, 'On Inductive Generalization in Binary First-Order Languages' (unpublished).
[15] Hintikka, Jaakko, 'A Two-Dimensional Continuum of Inductive Methods', in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 113-132.
[16] Hintikka, Jaakko, 'On Semantic Information', present volume. Also in Logic, Physical Reality and History, Proceedings of the International Colloquium at the University of Denver (ed. by W. Yourgrau), The Plenum Press, New York (forthcoming).
[17] Hintikka, Jaakko, 'The Varieties of Information and Scientific Explanation', in Logic, Methodology, and Philosophy of Science III, Proceedings of the 1967 International Congress (ed. by B. van Rootselaar and J. F. Staal), North-Holland Publ. Comp., Amsterdam, 1968, pp. 311-331.
[18] Hintikka, Jaakko and Hilpinen, Risto, 'Knowledge, Acceptance, and Inductive Logic', in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 1-20.
[19] Hosiasson, Janina, 'Why Do We Prefer Probabilities Relative to Many Data', Mind 40 (1931) 23-32.
[20] Jeffrey, Richard C., The Logic of Decision, McGraw-Hill, New York, 1965.
[21] Kemeny, John G., 'Fair Bets and Inductive Probabilities', Journal of Symbolic Logic 20 (1955) 263-273.
[22] Keynes, J. M., A Treatise on Probability, Macmillan, London, 1921.
[23] Khinchin, A. I., Mathematical Foundations of Information Theory, Dover Publications, New York, 1957.
[24] Lenz, John W., 'Carnap on Defining "Degree of Confirmation"', Philosophy of Science 23 (1956) 230-236.
[25] Levi, Isaac, Gambling with Truth, Alfred A. Knopf, New York, 1967.
[26] Lindley, D. V., 'On a Measure of the Information Provided by an Experiment', Annals of Mathematical Statistics 27 (1956) 986-1005.
[27] Meinong, A., 'Kries, Johannes v.: Die Principien der Wahrscheinlichkeitsrechnung', Göttingische Gelehrte Anzeigen (1890) 56-75.
[28] Nitsche, Ad., 'Die Dimensionen der Wahrscheinlichkeit und die Evidenz der Ungewissheit', Vierteljahresschrift für wissenschaftliche Philosophie 16 (1892) 20-35.
[29] Shannon, Claude E., 'The Mathematical Theory of Communication', in The Mathematical Theory of Communication (ed. by C. E. Shannon and W. Weaver), University of Illinois Press, Urbana, 1949, pp. 3-91.
[30] Suppes, Patrick, 'Probabilistic Inference and the Concept of Total Evidence', in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 49-65.
[31] Törnebohm, Håkan, 'Two Measures of Evidential Strength', in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 81-95.
[32] Törnebohm, Håkan, 'On the Confirmation of Hypotheses about Regions of Existence', Synthese 18 (1968) 28-45.
[33] von Wright, G. H., 'Broad on Induction and Probability', in The Philosophy of C. D. Broad (ed. by P. A. Schilpp), Tudor, New York, 1949, pp. 313-352.
References
See Carnap [6], p. 211. For the concepts of relevance and irrelevance, see Carnap [6], ch. VI.
In spite of these objections to the logical interpretation of inductive probability, the probability measures defined in inductive logic (e. g. [7] and [16]) can, of course, be called ‘logical’ probabilities. Carnap’s conception of the interpretation of inductive probability seems to have changed after the publication of [6]. In [6], p. 299, “the choice of an m-function is regarded as a purely logical question”. According to [7], the choice of an inductive method depends on “performance, economy, aesthetic satisfaction” (p. 55). In [8] Carnap seems to have shifted towards the subjectivistic conception (see especially p. 315).
Perhaps the word ‘available’ is one source of confusion here.
P_T is relative to X, but, for the sake of brevity, explicit reference to X is omitted here. In [8] P_T is called by Carnap a (rational) credence function, and P_T(h) is called the credence of h for X at T.
The model of the application of inductive logic accepted here is called the conditionalization model. According to this model, 'learning from experience' (or rational change in belief) takes place through the conditionalization of the measure P_T to the evidence accepted. The limitations of this model have been discussed e. g. by Jeffrey [20], ch. 11, and Suppes [30], pp. 60-65. In many situations, especially in scientific inquiry, this model seems to me, however, fairly realistic. The confirmation function P is called regular if P(i | e) = 1 if and only if i is logically implied by e. The concept of regularity is equivalent to 'strict coherence' or 'strict fairness' (see e. g. Kemeny [21]). If P is strictly coherent, P_T(i) = 1 only if i is accepted as evidence at T or logically implied by a sentence accepted as evidence at T. The regularity (or strict coherence) of P can also be defined as follows: P(i) = 0 if and only if i is logically false. Carnap imposes the requirement of strict coherence on all credence functions Cr or, in our terminology, probability measures P_T ([8], p. 308; [9]). According to Carnap, P_T is regular if P_T(i) = 0 only if i is (logically) impossible ([8], p. 308; [9], p. 262). But this requirement is incompatible with the conditionalization model: if X has, in the sense of the conditionalization model, 'learnt something from experience', i. e. accepted evidence, then P_T(e_T) = 1 and hence P_T(~e_T) = 0, although ~e_T is not logically false (here e_T represents the evidence accepted at T). The requirement of strict coherence should be imposed on the 'initial' credence function P only; all reasonable coherence requirements concerning P_T can be defined in terms of the coherence of P.
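As a rough numerical illustration of the conditionalization model for a finite hypothesis set (the hypotheses, prior values, and likelihoods below are invented, not taken from the text):

```python
# Sketch of 'learning from experience' as conditionalization of an
# initial credence function P on accepted evidence e. All numbers are
# invented for illustration.

def conditionalize(prior, likelihood):
    """Posterior P_T(h) obtained by conditionalizing the prior on evidence e."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    p_e = sum(joint.values())            # P(e); assumed > 0, since P is regular
    return {h: joint[h] / p_e for h in joint}

prior = {'h1': 0.5, 'h2': 0.3, 'h3': 0.2}        # initial credences
likelihood = {'h1': 0.9, 'h2': 0.5, 'h3': 0.1}   # P(e | h) for the evidence e

posterior = conditionalize(prior, likelihood)    # credences after accepting e
```

Note that the posterior assigns probability 1 to the accepted evidence itself, which is why, as argued above, strict coherence can reasonably be required only of the initial measure P.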
This ‘paradox of the logical interpretation’ has also been pointed out by John W. Lenz in [24], p. 232.
Suggestions for this kind of justification of the rationality of collecting new evidence have been made before Good by other authors. For instance, according to [2], p. 23, footnote 1, U. Öpik produced a similar argument after the discussion on Ayer's paper [1] in the Colston Symposium (1957). In [22], p. 77, Keynes says: "We may argue that, when our knowledge is slight but capable of increase, the course of action, which will, relative to such knowledge, probably produce the greatest amount of good, will often consist in the acquisition of more knowledge."
The word ‘information’ is here used in its loose, presystematic sense, not in the technical sense in which it is used in information theory.
According to Meinong [27], p. 70, "Vermutungen sind um so weniger wert, je mehr sie auf Ungewissheit basieren, bei Gleichsetzung von Vermutungen aber hat man da, wo diese Gleichsetzung durch unser Wissen gefordert, nicht durch unser Nicht-Wissen bloss gestattet wird, den Idealfall vor sich" [conjectures are worth the less, the more they rest on uncertainty; but where conjectures are weighted equally, the ideal case is the one in which this equal weighting is demanded by our knowledge and not merely permitted by our ignorance].
If the measure cont used in (24) is replaced with the logarithmic measure of information, we obtain the entropy expression (44) (see below, p. 115). Hence the term 'content-entropy'.
In addition to the measures defined below, Hintikka has defined in [17] some other interesting measures of transmitted information which are not discussed here.
Törnebohm also calls the explicatum (36) for (33) the degree of 'information overlap' and 'degree of covering' ([31], p. 84). These interpretations are plausible if the amount of information is explicated in terms of cont, e. g. in the case of the measure Q(e | h), but they fail in the case of inf, if (32) and (33) are negative. See [31], p. 84.
The logarithms are usually assumed to be to the base 2; the choice of the base is obviously a matter of convention.
The crucial difference between these measures concerns additivity: the inf-measures of h_1 and h_2 are additive if the sentences are probabilistically independent, that is, if P(h_1 & h_2) = P(h_1)P(h_2).
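This additivity property can be checked with toy numbers (the probabilities below are invented for illustration):

```python
import math

# With inf(h) = -log2 P(h), the inf-measures of h1 and h2 add exactly
# when P(h1 & h2) = P(h1) * P(h2). The probabilities are invented.

def inf(p):
    return -math.log2(p)           # base 2 by convention; any base would do

p_h1, p_h2 = 0.25, 0.5
p_joint = p_h1 * p_h2              # probabilistic independence

additive = abs(inf(p_joint) - (inf(p_h1) + inf(p_h2))) < 1e-12   # 3 = 2 + 1

# The content measure cont(h) = 1 - P(h), by contrast, is not additive
# under independence: cont(p_joint) = 0.875, but the sum is 0.75 + 0.5 = 1.25.
cont = lambda p: 1 - p
```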
This concept is, of course, different from the concept of degree of confirmation used by Carnap. In this paper, the expression ‘degree of confirmation’ is used in the Carnapian way, i. e. as a synonym of ‘inductive probability’.
Carnap and Bar-Hillel call this measure 'the amount of specification of H through e'. See [10], p. 266.
Lindley does not define (43) in the way in which it is defined here. Lindley's measure is, however, equivalent to (43) if the number of alternative hypotheses h_i ∈ H is finite.
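For a finite hypothesis set, a Lindley-style measure of the information provided by an experiment can be sketched as the prior entropy minus the expected entropy of the posterior; the hypotheses, priors, and likelihoods below are invented:

```python
import math

# Sketch (with invented numbers) of expected information from an
# experiment with two possible outcomes, e and ~e, over two hypotheses:
# info = H(prior) - sum_outcomes P(outcome) * H(posterior | outcome).

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

prior = {'h1': 0.5, 'h2': 0.5}
likelihood = {'h1': {'e': 0.8, '~e': 0.2},       # P(outcome | h)
              'h2': {'e': 0.3, '~e': 0.7}}

info = entropy(prior)
for outcome in ('e', '~e'):
    p_out = sum(prior[h] * likelihood[h][outcome] for h in prior)
    posterior = {h: prior[h] * likelihood[h][outcome] / p_out for h in prior}
    info -= p_out * entropy(posterior)
# info is non-negative: on average, an experiment cannot increase entropy
```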
This is expressed by Shannon’s ‘fundamental inequality’. Cf. also Khinchin [23], pp. 5-6.
In recent literature on inductive logic it is usually assumed that inf and cont are defined in terms of the same measure function on L as the degree of confirmation. Such a definition is presupposed here, too, in Sections IV and V. This assumption is obvious in the case of inf, if this measure is interpreted in the way described above. In the case of cont it has, however, been questioned by Isaac Levi ([25], especially pp. 164-165).
In many cases the increase of the amount of evidence, i. e. the increase of Q(e | h), will necessarily reduce the entropy in H. One interesting case of this kind has been discussed by Hintikka and Hilpinen [18]. Given a suitable probability measure on a monadic first-order language L, the degree of confirmation of one constituent Cω (constituents are the strongest, that is to say, most informative, generalizations in L) approaches 1 when the number of observations increases. Hence the entropy in the set of the constituents of L approaches 0, when the amount of evidence increases.
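The entropy reduction described above can be illustrated numerically: as the probability of one constituent approaches 1, the entropy of the distribution over the constituents approaches 0 (the sequence of probability values below is invented):

```python
import math

# Toy illustration: four 'constituents', with one of them receiving an
# increasing degree of confirmation and the rest sharing the remainder
# equally. Entropy falls from 2.0 bits (uniform) toward 0.

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4                                  # a toy case with four constituents
entropies = [entropy([p] + [(1 - p) / (n - 1)] * (n - 1))
             for p in (0.25, 0.5, 0.9, 0.99)]
# entropies decreases monotonically toward 0
```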
The condition in question is satisfied by Q(e | h) only if cont(h) = cont(k).
Copyright information
© 1970 D. Reidel Publishing Company, Dordrecht-Holland
Hilpinen, R. (1970). On the Information Provided by Observations. In: Hintikka, J., Suppes, P. (eds) Information and Inference. Synthese Library, vol 28. Springer, Dordrecht. https://doi.org/10.1007/978-94-010-3296-4_4
Print ISBN: 978-94-010-3298-8
Online ISBN: 978-94-010-3296-4