Probability and Statistical Inference
2. Expectations of Functions of Random Variables
It will be instructive to find simple examples of continuous random variables and other discrete random variables with interesting features analogous to those cited in Example 2.3.2. These are left as Exercises 2.3.8-2.3.10.
When Definition 2.2.3 is applied with $g(x) = e^{tx}$, one comes up with a very useful and special function in statistics. Look at the following definition.
Definition 2.3.3 The moment generating function (mgf) of a random variable $X$, denoted by $M_X(t)$, is defined as
$$M_X(t) = E\left[e^{tX}\right],$$
provided that the expectation is finite for $|t| < a$ with some $a > 0$.
As usual, the exact expression of the mgf $M_X(t)$ would then be derived analytically using one of the following expressions:
$$M_X(t) = \sum_{x \in \chi} e^{tx} f(x) \quad \text{(discrete case)}, \qquad M_X(t) = \int_{x \in \chi} e^{tx} f(x)\,dx \quad \text{(continuous case)}.$$
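As a quick illustration of the discrete expression above, the mgf of a Bernoulli($p$) random variable can be computed directly from the sum over its support $\{0, 1\}$. The sketch below is ours, not from the text; the function name `bernoulli_mgf` is a hypothetical label.

```python
import math

def bernoulli_mgf(t, p):
    """Discrete-sum form of the mgf: M_X(t) = sum_x e^{tx} f(x).

    For a Bernoulli(p) random variable the support is {0, 1},
    with f(0) = 1 - p and f(1) = p.
    """
    return sum(math.exp(t * x) * f for x, f in [(0, 1 - p), (1, p)])

# The two-term sum collapses to the closed form (1 - p) + p * e^t,
# and at t = 0 the mgf equals E[e^0] = 1, as it must for any X.
value = bernoulli_mgf(0.5, 0.3)   # matches 0.7 + 0.3 * e^{0.5}
at_zero = bernoulli_mgf(0.0, 0.3)
```

Note that, unlike the continuous case, no integration is needed here: the expectation in Definition 2.3.3 reduces to a finite weighted sum.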
The function $M_X(t)$ bears the name mgf because one can derive all the moments of $X$ by starting from its mgf. In other words, all moments of $X$ can be generated from its mgf provided that the mgf itself is finite.
Theorem 2.3.1 If a random variable $X$ has a finite mgf $M_X(t)$, for $|t| < a$ with some $a > 0$, then the $r^{th}$ moment $\eta_r$ of $X$, given in Definition 2.3.1, is the same as $d^r M_X(t)/dt^r$ when evaluated at $t = 0$.
Proof Let us first pretend that $X$ is a continuous random variable so that $M_X(t) = \int_{\chi} e^{tx} f(x)\,dx$. Now, assume that the differentiation operator of $M_X(t)$ with respect to $t$ can be taken inside the integral with respect to $x$. One may refer to (1.6.16)-(1.6.17) for situations where such interchanges are permissible. We write
$$\frac{d}{dt} M_X(t) = \int_{\chi} \frac{\partial}{\partial t} e^{tx} f(x)\,dx = \int_{\chi} x\, e^{tx} f(x)\,dx,$$
and then it becomes clear that $dM_X(t)/dt$ when evaluated at $t = 0$ will coincide with $\int_{\chi} x f(x)\,dx$, which is $\eta_1\ (= \mu)$. Similarly, let us use (2.3.3) to claim that
$$\frac{d^2}{dt^2} M_X(t) = \int_{\chi} x^2\, e^{tx} f(x)\,dx.$$
Hence, $d^2 M_X(t)/dt^2$ when evaluated at $t = 0$ will coincide with $\int_{\chi} x^2 f(x)\,dx$, which is $\eta_2$. The rest of the proof proceeds similarly upon successive differentiation of the mgf $M_X(t)$. A discrete scenario can be handled by replacing the integral with a sum. $\blacksquare$
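Theorem 2.3.1 can be checked numerically for a concrete distribution. The standard fact used below is that an Exponential($\lambda$) random variable has mgf $M_X(t) = \lambda/(\lambda - t)$ for $t < \lambda$, with $\eta_1 = 1/\lambda$ and $\eta_2 = 2/\lambda^2$. The sketch approximates $d^r M_X(t)/dt^r$ at $t = 0$ by central finite differences; the function name and step size are ours, chosen for illustration.

```python
def exp_mgf(t, lam=2.0):
    # mgf of an Exponential(lambda) random variable, finite for t < lambda:
    # M_X(t) = lambda / (lambda - t).
    return lam / (lam - t)

h = 1e-4
# Central differences approximate d^r M_X(t)/dt^r at t = 0 (Theorem 2.3.1).
eta1 = (exp_mgf(h) - exp_mgf(-h)) / (2 * h)                  # ~ eta_1 = 1/lambda
eta2 = (exp_mgf(h) - 2 * exp_mgf(0) + exp_mgf(-h)) / h ** 2  # ~ eta_2 = 2/lambda^2
```

With $\lambda = 2$ both approximations come out close to $0.5$, agreeing with $\eta_1 = 1/2$ and $\eta_2 = 2/4$; differentiating the mgf at the origin does recover the moments, just as the theorem asserts.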