On the Information Provided by Observations

Chapter in Information and Inference

Part of the book series: Synthese Library, volume 28

Abstract

In his article ‘The Conception of Probability as a Logical Relation’ [1] A.J. Ayer has criticized the logical interpretation of inductive probability on which J. M. Keynes [22], Rudolf Carnap [6], and many other writers on the philosophy of induction have based their conceptions of inductive inference. Ayer’s criticism is concerned with the principle of total evidence and with the obvious fact that it is often reasonable to collect new evidence when we are studying the credibility of some hypothesis. According to Ayer, it is impossible to understand this simple fact and justify the principle of total evidence, if the concept of inductive probability is interpreted in the way suggested by Keynes, Jeffreys, Carnap, and other proponents of the logical interpretation.

This study has been facilitated by a Finnish State Fellowship (Valtion apuraha nuorille tieteenharjoittajille).

Bibliography

  1. Ayer, Alfred J., ‘The Conception of Probability as a Logical Relation’, in Observation and Interpretation (ed. by S. Körner), Butterworth, London, 1958, pp. 12-17.

  2. Ayer, A. J., Böhm, D. et al., ‘Discussion’ (of Ayer [1]), in Observation and Interpretation (ed. by S. Körner), Butterworth, London, 1958, pp. 18-30.

  3. Bar-Hillel, Yehoshua, ‘Semantic Information and Its Measures’, in Language and Information (ed. by Y. Bar-Hillel), Addison-Wesley, Reading, Mass., 1964, pp. 298-312.

  4. Broad, C. D., ‘Critical Notice on J. M. Keynes, A Treatise on Probability’, Mind 31 (1922) 72-85.

  5. Carnap, Rudolf, ‘The Two Concepts of Probability’, Philosophy and Phenomenological Research 5 (1945) 513-532.

  6. Carnap, Rudolf, Logical Foundations of Probability, University of Chicago Press, Chicago, 1950.

  7. Carnap, Rudolf, The Continuum of Inductive Methods, University of Chicago Press, Chicago, 1952.

  8. Carnap, Rudolf, ‘The Aim of Inductive Logic’, in Logic, Methodology and Philosophy of Science (ed. by E. Nagel, P. Suppes, and A. Tarski), Stanford University Press, Stanford, 1962, pp. 303-318.

  9. Carnap, Rudolf, ‘Inductive Logic and Intuition’, in The Problem of Inductive Logic (ed. by I. Lakatos), North-Holland Publ. Comp., Amsterdam, 1968, pp. 257-267.

  10. Carnap, Rudolf and Bar-Hillel, Yehoshua, ‘An Outline of the Theory of Semantic Information’, in Language and Information (by Y. Bar-Hillel), Addison-Wesley, Reading, Mass., 1964, pp. 221-274.

  11. Good, I. J., ‘On the Principle of Total Evidence’, The British Journal for the Philosophy of Science 17 (1967) 319-321.

  12. Hempel, Carl G., ‘Inductive Inconsistencies’, Synthese 12 (1960) 439-469.

  13. Hempel, Carl G. and Oppenheim, Paul, ‘Studies in the Logic of Explanation’, Philosophy of Science 15 (1948) 131-175.

  14. Hilpinen, Risto, ‘On Inductive Generalization in Binary First-Order Languages’ (unpublished).

  15. Hintikka, Jaakko, ‘A Two-Dimensional Continuum of Inductive Methods’, in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 113-132.

  16. Hintikka, Jaakko, ‘On Semantic Information’, present volume. Also in Logic, Physical Reality and History, Proceedings of the International Colloquium at the University of Denver (ed. by W. Yourgrau), The Plenum Press, New York (forthcoming).

  17. Hintikka, Jaakko, ‘The Varieties of Information and Scientific Explanation’, in Logic, Methodology and Philosophy of Science III, Proceedings of the 1967 International Congress (ed. by B. van Rootselaar and J. F. Staal), North-Holland Publ. Comp., Amsterdam, 1968, pp. 311-331.

  18. Hintikka, Jaakko and Hilpinen, Risto, ‘Knowledge, Acceptance, and Inductive Logic’, in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 1-20.

  19. Hosiasson, Janina, ‘Why Do We Prefer Probabilities Relative to Many Data’, Mind 40 (1931) 23-32.

  20. Jeffrey, Richard C., The Logic of Decision, McGraw-Hill, New York, 1965.

  21. Kemeny, John G., ‘Fair Bets and Inductive Probabilities’, Journal of Symbolic Logic 20 (1955) 263-273.

  22. Keynes, J. M., A Treatise on Probability, Macmillan, London, 1921.

  23. Khinchin, A. I., Mathematical Foundations of Information Theory, Dover Publications, New York, 1957.

  24. Lenz, John W., ‘Carnap on Defining “Degree of Confirmation”’, Philosophy of Science 23 (1956) 230-236.

  25. Levi, Isaac, Gambling with Truth, Alfred A. Knopf, New York, 1967.

  26. Lindley, D. V., ‘On a Measure of the Information Provided by an Experiment’, Annals of Mathematical Statistics 27 (1956) 986-1005.

  27. Meinong, A., ‘Kries, Johannes v.: Die Principien der Wahrscheinlichkeits-Rechnung’, Göttingische Gelehrte Anzeigen (1890) 56-75.

  28. Nitsche, Ad., ‘Die Dimensionen der Wahrscheinlichkeit und die Evidenz der Ungewissheit’, Vierteljahresschrift für wissenschaftliche Philosophie 16 (1892) 20-35.

  29. Shannon, Claude E., ‘The Mathematical Theory of Communication’, in The Mathematical Theory of Communication (ed. by C. E. Shannon and W. Weaver), University of Illinois Press, Urbana, 1949, pp. 3-91.

  30. Suppes, Patrick, ‘Probabilistic Inference and the Concept of Total Evidence’, in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 49-65.

  31. Törnebohm, Håkan, ‘Two Measures of Evidential Strength’, in Aspects of Inductive Logic (ed. by J. Hintikka and P. Suppes), North-Holland Publ. Comp., Amsterdam, 1966, pp. 81-95.

  32. Törnebohm, Håkan, ‘On the Confirmation of Hypotheses about Regions of Existence’, Synthese 18 (1968) 28-45.

  33. von Wright, G. H., ‘Broad on Induction and Probability’, in The Philosophy of C. D. Broad (ed. by P. A. Schilpp), Tudor, New York, 1949, pp. 313-352.

References

  1. See Carnap [6], p. 211. For the concepts of relevance and irrelevance, see Carnap [6], ch. VI.

  2. In spite of these objections to the logical interpretation of inductive probability, the probability measures defined in inductive logic (e. g. [7] and [16]) can, of course, be called ‘logical’ probabilities. Carnap’s conception of the interpretation of inductive probability seems to have changed after the publication of [6]. In [6], p. 299, “the choice of an m-function is regarded as a purely logical question”. According to [7], the choice of an inductive method depends on “performance, economy, aesthetic satisfaction” (p. 55). In [8] Carnap seems to have shifted towards the subjectivistic conception (see especially p. 315).

  3. Perhaps the word ‘available’ is one source of confusion here.

  4. P_T is relative to X, but, for the sake of brevity, explicit reference to X is omitted here. In [8] P_T is called by Carnap a (rational) credence function, and P_T(h) is called the credence of h for X at T.

  5. The model of the application of inductive logic accepted here is called the conditionalization model. According to this model, ‘learning from experience’ (or rational change in belief) takes place through the conditionalization of the measure P_T to the evidence accepted. The limitations of this model have been discussed e. g. by Jeffrey [20], ch. 11, and Suppes [30], pp. 60-65. In many situations, especially in scientific inquiry, this model seems to me, however, fairly realistic. The confirmation function P is called regular if P(i | e) = 1 if and only if i is logically implied by e. The concept of regularity is equivalent to ‘strict coherence’ or ‘strict fairness’ (see e. g. Kemeny [21]). If P is strictly coherent, P_T(i) = 1 only if i is accepted as evidence at T or logically implied by a sentence accepted as evidence at T. The regularity (or strict coherence) of P can also be defined as follows: P(i) = 0 if and only if i is logically false. Carnap imposes the requirement of strict coherence on all credence functions Cr or, in our terminology, probability measures P_T ([8], p. 308; [9]). According to Carnap, P_T is regular if P_T(i) = 0 only if i is (logically) impossible ([8], p. 308; [9], p. 262). But this requirement is incompatible with the conditionalization model: if X has, in the sense of the conditionalization model, ‘learnt something from experience’, i. e. accepted evidence, then P_T(e_T) = 1, although ~e_T is not logically false (here e_T represents the evidence accepted at T). The requirement of strict coherence should be imposed on the ‘initial’ credence function P only; all reasonable coherence requirements concerning P_T can be defined in terms of the coherence of P.
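In outline, the conditionalization model amounts to the following (a sketch in standard notation; e_T is the total evidence accepted at T, as in the note above):

```latex
% Conditionalization: the credence function at T arises from the initial
% (regular) measure P by conditioning on the accepted evidence e_T.
P_T(h) = P(h \mid e_T) = \frac{P(h \wedge e_T)}{P(e_T)}, \qquad P(e_T) > 0.
% Regularity (strict coherence) of the initial measure P:
P(i) = 0 \quad \text{if and only if} \quad i \text{ is logically false.}
```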

  6. This ‘paradox of the logical interpretation’ has also been pointed out by John W. Lenz in [24], p. 232.

  7. Suggestions for this kind of justification of the rationality of collecting new evidence have been made before Good by other authors. For instance, according to [2], p. 23, footnote 1, U. Öpik produced a similar argument after the discussion on Ayer’s paper [1] in the Colston Symposium (1957). In [22], p. 77, Keynes says: “We may argue that, when our knowledge is slight but capable of increase, the course of action, which will, relative to such knowledge, probably produce the greatest amount of good, will often consist in the acquisition of more knowledge.”

  8. The word ‘information’ is here used in its loose, presystematic sense, not in the technical sense in which it is used in information theory.

  9. According to Meinong [27], p. 70, “Vermutungen sind um so weniger Wert, je mehr sie auf Ungewissheit basieren, bei Gleichsetzung von Vermutungen aber hat man da, wo diese Gleichsetzung durch unser Wissen gefordert, nicht durch unser Nicht-Wissen bloss gestattet wird, den Idealfall vor sich” (‘Conjectures are worth the less, the more they rest on uncertainty; but where an equal weighting of conjectures is demanded by our knowledge, and not merely permitted by our ignorance, we have the ideal case before us’).

  10. If the measure cont used in (24) is replaced with the logarithmic measure of information, we obtain the entropy-expression (44) (see below, p. 115). Hence the term ‘content-entropy’.

  11. In addition to the measures defined below, Hintikka has defined in [17] some other interesting measures of transmitted information which are not discussed here.

  12. Törnebohm also calls the explicatum (36) for (33) the degree of ‘information overlap’ and ‘degree of covering’ ([31], p. 84). These interpretations are plausible if the amount of information is explicated in terms of cont, e. g. in the case of the measure Q(e | h), but they fail in the case of inf, if (32) and (33) are negative. See [31], p. 84.

  13. The logarithms are usually assumed to be to the base 2; the choice of the base is obviously a matter of convention.
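The point can be checked directly (a minimal numerical sketch; the probability value 0.2 is an illustrative assumption, not from the text):

```python
import math

# Changing the base of the logarithm rescales every information value by
# the same constant factor (here log 2, moving from bits to nats), so the
# choice of base is indeed purely a matter of convention.
p = 0.2                 # an illustrative probability value
bits = -math.log2(p)    # information measured to base 2 (bits)
nats = -math.log(p)     # the same information measured to base e (nats)
assert math.isclose(bits * math.log(2), nats)
```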

  14. The crucial difference between these measures concerns additivity: the inf-measures of h_1 and h_2 are additive if the sentences are probabilistically independent, that is, P(h_1 & h_2) = P(h_1)P(h_2).
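The additivity property can be illustrated numerically (a sketch with made-up probabilities, assuming the standard explicata inf(h) = -log2 P(h) and cont(h) = 1 - P(h)):

```python
import math

def inf(p):
    """Logarithmic information measure, inf(h) = -log2 P(h)."""
    return -math.log2(p)

def cont(p):
    """Content measure, cont(h) = 1 - P(h)."""
    return 1 - p

# Illustrative probabilities for two probabilistically independent sentences.
p_h1, p_h2 = 0.5, 0.25
p_conj = p_h1 * p_h2   # independence: P(h1 & h2) = P(h1) * P(h2)

# Additivity: inf(h1 & h2) = inf(h1) + inf(h2) under independence.
assert math.isclose(inf(p_conj), inf(p_h1) + inf(p_h2))

# By contrast, cont is not additive in this sense even under independence.
assert not math.isclose(cont(p_conj), cont(p_h1) + cont(p_h2))
```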

  15. This concept is, of course, different from the concept of degree of confirmation used by Carnap. In this paper, the expression ‘degree of confirmation’ is used in the Carnapian way, i. e. as a synonym of ‘inductive probability’.

  16. Carnap and Bar-Hillel call this measure ‘the amount of specification of H through e’. See [10], p. 266.

  17. Lindley does not define (43) in the way in which it is defined here. Lindley’s measure is, however, equivalent to (43) if the number of alternative hypotheses h_i ∈ H is finite.

  18. This is expressed by Shannon’s ‘fundamental inequality’. Cf. also Khinchin [23], pp. 5-6.
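The inequality in question (entropy never exceeds log2 n over n alternatives, with equality only for the uniform distribution) can be verified numerically (a minimal sketch; the distributions are illustrative):

```python
import math

def entropy(dist):
    """Shannon entropy in bits, H = -sum p*log2(p), with 0*log 0 = 0."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

n = 4
uniform = [1 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]    # an arbitrary non-uniform distribution

# The fundamental inequality: H <= log2(n), equality for the uniform case.
assert math.isclose(entropy(uniform), math.log2(n))
assert entropy(skewed) < math.log2(n)
```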

  19. In recent literature on inductive logic it is usually assumed that inf and cont are defined in terms of the same measure function on L as the degree of confirmation. Such a definition is presupposed here also, in Sections IV and V. This assumption is obvious in the case of inf, if this measure is interpreted in the way described above. In the case of cont it has, however, been questioned by Isaac Levi ([25], especially pp. 164-165).

  20. In many cases the increase of the amount of evidence, i. e. the increase of Q(e | h), will necessarily reduce the entropy in H. One interesting case of this kind has been discussed by Hintikka and Hilpinen [18]. Given a suitable probability measure on a monadic first-order language L, the degree of confirmation of one constituent Cω (constituents are the strongest, that is to say, most informative, generalizations in L) approaches 1 when the number of observations increases. Hence the entropy in the set of the constituents of L approaches 0, when the amount of evidence increases.
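The limiting behaviour described here can be illustrated with made-up numbers (not Hintikka's actual measure): as the degree of confirmation of one constituent approaches 1, the entropy of the distribution over the constituents approaches 0.

```python
import math

def entropy(dist):
    """Shannon entropy in bits over a finite set of constituents."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Probability p of the favoured constituent rises towards 1, with the
# remaining mass spread evenly over 3 rival constituents.
entropies = []
for p in (0.25, 0.9, 0.99, 0.999):
    rest = (1 - p) / 3
    entropies.append(entropy([p, rest, rest, rest]))

# Entropy falls monotonically and is already near 0 at p = 0.999.
assert all(a > b for a, b in zip(entropies, entropies[1:]))
assert entropies[-1] < 0.02
```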

  21. The condition in question is satisfied by Q.E(e | h) only if cont(h) = cont(k).



Copyright information

© 1970 D. Reidel Publishing Company, Dordrecht-Holland

Cite this chapter

Hilpinen, R. (1970). On the Information Provided by Observations. In: Hintikka, J., Suppes, P. (eds) Information and Inference. Synthese Library, vol 28. Springer, Dordrecht. https://doi.org/10.1007/978-94-010-3296-4_4

  • DOI: https://doi.org/10.1007/978-94-010-3296-4_4

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-3298-8

  • Online ISBN: 978-94-010-3296-4

  • eBook Packages: Springer Book Archive
