# Mean, Variance and Transforms

• Kai Lai Chung
Part of the Undergraduate Texts in Mathematics book series (UTM)

## Abstract

The mathematical expectation of a random variable, defined in §4.3, is one of the foremost notions in probability theory. It will be seen to play the same role as integration in calculus—and we know “integral calculus” is at least half of all calculus. Recall its meaning as a probabilistically weighted average [in a countable sample space] and rewrite (4.3.11) more simply as:
$$E(X) = \sum\limits_{\omega} X(\omega)\,P(\omega) \tag{6.1.1}$$
If we substitute |X| for X above, we see that the proviso (4.3.12) may be written as
$$E(|X|) < \infty \tag{6.1.2}$$
We shall say that the random variable X is summable when (6.1.2) is satisfied. In this case we also say that “X has a finite expectation (or mean)” or “its expectation exists.” The last expression is actually a little vague, because we generally allow E(X) to be defined and equal to +∞ when, for instance, X ≥ 0 and the series in (6.1.1) diverges. See Exercises 27 and 28 of Chapter 4. We shall say so explicitly when this is the case.
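The two situations above — a summable X with a finite mean, and a nonnegative X whose series diverges so that E(X) = +∞ — can be sketched numerically. The concrete choices below (a fair die, and the weights P(n) = 2⁻ⁿ with X(n) = 2ⁿ) are illustrative assumptions, not examples taken from the text.

```python
from fractions import Fraction

# E(X) = sum over omega of X(omega) * P(omega), as in (6.1.1), on a countable space.
# Example 1 (assumed for illustration): a fair six-sided die with X(omega) = omega.
P = {w: Fraction(1, 6) for w in range(1, 7)}
X = {w: w for w in range(1, 7)}

# E(|X|) is a finite sum here, so X is summable in the sense of (6.1.2).
E_abs_X = sum(abs(X[w]) * P[w] for w in P)
E_X = sum(X[w] * P[w] for w in P)
print(E_X)  # 7/2

# Example 2 (assumed for illustration): X >= 0 with a divergent series,
# so E(X) = +infinity by the convention mentioned above.
# Take P(n) = 2**-n and X(n) = 2**n for n = 1, 2, ...; every term of the
# series (6.1.1) equals 1, so the partial sums grow without bound.
partial = sum(Fraction(2**n) * Fraction(1, 2**n) for n in range(1, 21))
print(partial)  # 20: the first 20 terms already sum to 20
```

Exact rational arithmetic via `fractions.Fraction` keeps the weighted average free of floating-point error; the partial sums in the second example make the divergence of the series visible directly.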

## Keywords

Independent random variable · Mathematical expectation · Expected number · Multinomial distribution · Nonnegative random variable