# Conditioning and Independence

Kai Lai Chung
Part of the Undergraduate Texts in Mathematics book series (UTM)

## Abstract

We have seen that the probability of a set A is its weighted proportion relative to the sample space Ω. When Ω is finite and all sample points have the same weight (and are therefore equally likely), then
$$P(A) = \frac{|A|}{|\Omega|}$$
as in Example 4 of §2.2. When Ω is countable and each point ω has the weight P(ω) = P({ω}) attached to it, then
$$P(A) = \frac{\sum_{\omega \in A} P(\omega)}{\sum_{\omega \in \Omega} P(\omega)} \tag{5.1.1}$$
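Formula (5.1.1) can be illustrated with a short sketch in Python. The sample space and its weights below are an assumption chosen for illustration (a loaded four-sided die), not an example from the text:

```python
# Sketch of (5.1.1): P(A) is the total weight of the points of A,
# given that the weights P(omega) over all of Omega sum to 1.
# The weights below are hypothetical, chosen only for illustration.
weights = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def prob(event):
    """P(A) = sum of P(omega) over omega in A, as in (5.1.1)."""
    return sum(weights[w] for w in event)

A = {2, 4}       # the event "an even face"
print(prob(A))   # 0.2 + 0.4
```

Since the denominator of (5.1.1) equals 1, it is omitted in the code; `prob` simply totals the weights of the points in the event.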
from (2.4.3), since the denominator above is equal to 1. In many questions we are interested in the proportional weight of one set A relative to another set S. More accurately stated, this means the proportional weight of the part of A in S, namely the intersection A ∩ S, or AS, relative to S. The formula analogous to (5.1.1) is then
$$\frac{\sum_{\omega \in AS} P(\omega)}{\sum_{\omega \in S} P(\omega)} \tag{5.1.2}$$
Thus we are switching our attention from Ω to S as a new universe, and considering a new proportion or probability with respect to it. We introduce the notation
$$P(A \mid S) = \frac{P(AS)}{P(S)} \tag{5.1.3}$$
and call it the conditional probability of A relative to S. Other phrases such as “given S,” “knowing S,” or “under the hypothesis [of] S” may also be used to describe this relativity.
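Definition (5.1.3) can be checked by direct counting in the equally likely case. The sketch below uses a two-dice sample space of my own choosing (not an example from the text) and computes P(A | S) as the ratio P(AS)/P(S):

```python
from fractions import Fraction

# Two fair dice: 36 equally likely sample points (i, j).
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # Equally likely case: P(A) = |A| / |Omega|.
    return Fraction(len(event), len(omega))

S = [p for p in omega if p[0] + p[1] == 7]   # given: "the sum is 7"
A = [p for p in omega if p[0] == 3]          # event: "first die shows 3"
AS = [p for p in A if p in S]                # the intersection A ∩ S

# Definition (5.1.3): P(A | S) = P(AS) / P(S).
cond = prob(AS) / prob(S)
print(cond)  # 1/6: exactly one of the six points of S lies in A
```

Note that switching attention from Ω to the new universe S gives the same answer: of the six equally likely points of S, exactly one, namely (3, 4), belongs to A.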

## Keywords

Black Ball, Conditional Probability, Independent Random Variable, General Random Variable, Apriori Probability