Foundations of a new system of probability theory
- Cite this article as:
- Humburg, J. Topoi (1986) 5: 39. doi:10.1007/BF00137828
The aim of my book is to explain the content of the different notions of probability.
Starting from a concept of logical probability, modified in comparison with Carnap's, we succeed, by means of de Finetti's mathematical results, in defining the concept of statistical probability.
The starting point is the fundamental idea that certain phenomena are of the same kind, that certain occurrences can be repeated, that certain experiments are ‘identical’. For this idea we introduce the notion of a concept K of similarity. From a concept K of similarity we derive ‘logically’ some probability-theoretic conclusions:
If the events E(λ) are similar (of the same kind) on the basis of such a concept K, then intersections of n of these events are equiprobable on the basis of K; in formulae: E(λ₁)⋯E(λₙ) ∼_K E(λ′₁)⋯E(λ′ₙ), where λᵢ ≠ λⱼ and λ′ᵢ ≠ λ′ⱼ for i ≠ j.
On the basis of some further axioms a partial comparative probability structure results from K, which forms the starting point of our further investigations and which we call logical probability on the basis of K.
We investigate a metrisation of this partial comparative structure, i.e. normed σ-additive functions m_K which are compatible with this structure; we call these functions m_K measure-functions in relation to K.
The measure-functions may be interpreted as subjective probabilities of individuals, who accept the concept K.
Now the following holds: for each measure-function, the limit of the relative frequencies in a sequence of the E(λ) exists with measure one.
For an event on which all measure-functions coincide, we speak of a quantitative logical probability, namely the common measure of this event. In formulae: l_K(h_n → lim h_n) = 1; in words: there is the quantitative logical probability one that the limit of the relative frequencies exists. Another way of saying this is that the event Ω* ≔ (h_n → lim h_n) is a maximal element in the comparative structure resulting from K.
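The claim that the limit of the relative frequencies exists with measure one can be illustrated by a minimal simulation, a sketch assuming a de Finetti-style construction (draw a latent limit P, then independent 0/1 trials with that chance); the names `exchangeable_sequence` and `relative_frequency` are illustrative, not the book's.

```python
import random

def exchangeable_sequence(n, seed=0):
    """Simulate de Finetti-style exchangeable 0/1 events:
    first draw a latent limit p, then n Bernoulli(p) trials."""
    rng = random.Random(seed)
    p = rng.random()  # latent limit of the relative frequencies
    trials = [1 if rng.random() < p else 0 for _ in range(n)]
    return p, trials

def relative_frequency(trials):
    """Relative frequency h_n of successes among the trials."""
    return sum(trials) / len(trials)

p, trials = exchangeable_sequence(100_000, seed=42)
h_n = relative_frequency(trials)
# For large n, h_n lies close to p: along almost every sequence
# the relative frequencies settle down to a limit.
```

Under such a mixture, the set of outcome sequences whose relative frequencies fail to converge has measure zero, which is the content of l_K(h_n → lim h_n) = 1.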
Therefore we are entitled to introduce this limit and call it statistical probability P.
With the aid of the measure-functions it is possible to calculate the speed of this convergence. The analogue of the Bernoulli inequality holds: m_K(|h_n − P| ≤ ε) ≥ 1 − 1/(4nε²).
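One plausible route to this bound, not spelled out in the abstract, is Chebyshev's inequality applied conditionally on the limit P, using Var(h_n | P) = P(1−P)/n:

```latex
m_K\bigl(|h_n - P| \le \epsilon \,\big|\, P\bigr)
  \;\ge\; 1 - \frac{P(1-P)}{n\epsilon^2}
  \;\ge\; 1 - \frac{1}{4n\epsilon^2},
\qquad \text{since } P(1-P) \le \tfrac14 .
```

Averaging over P preserves the inequality, yielding the unconditional bound stated above.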
The book further obtains relationships for the concept of statistical independence expressed in terms of the comparative probability.
The theory has a special significance for quantum mechanics: The similarity of the phenomena in the domain of quantum mechanics explains the statistical behaviour of the phenomena.
The usual mathematical statistics are explained in my book. On the basis of this new theory, however, it seems more expedient to use, besides the notion of statistical probability, also the notion of logical probability; the notion of subjective probability has only a heuristic function in my system.
The following dualism is to be noted: the statistical behaviour of similar phenomena may be described, on the one hand, according to the model of classical probability theory by means of a number called statistical probability; on the other hand, we may express all formulae by means of a function called the statistical probability function. This function is defined as the limit of the relative frequencies depending on the respective state ω of the universe. The statistical probability function is the primary notion; the notion of statistical probability is derived from it, being defined as the value of the statistical probability function at the true, unknown state ω of the universe.
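The primary/derived relationship just described can be sketched in a few lines, with the state ω modelled, purely for illustration, as a finite prefix of the outcome sequence (so the limit is approximated by the prefix's relative frequency); all names here are hypothetical, not the book's.

```python
from typing import Sequence

# A state omega of the universe, modelled for illustration as a
# finite 0/1 prefix of the outcome sequence of the events E(lambda).
State = Sequence[int]

def statistical_probability_function(omega: State) -> float:
    """Primary notion: the limit of the relative frequencies in
    state omega (approximated here by the available prefix)."""
    return sum(omega) / len(omega)

def statistical_probability(true_state: State) -> float:
    """Derived notion: the value of the statistical probability
    function at the true (in reality unknown) state omega."""
    return statistical_probability_function(true_state)

print(statistical_probability([1, 0, 1, 1]))  # → 0.75
```

The design mirrors the text: the function over states is defined first, and the single number arises only by evaluating it at one distinguished state.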
As far as the Hume problem, the problem of inductive inference, is concerned, the book seems to give an example of how to solve it.
The developed notions such as concept, measure-function, logical probability, etc. seem to be important beyond the concept of similarity.