Next, along the lines of (2.2.22)-(2.2.24), we can rewrite the integral on the
rhs of (2.3.20) as a gamma integral to finally claim the desired result.
Also look at the closely related Exercises 2.3.3-2.3.4. !
Example 2.3.4 Suppose that Z is the standard normal variable. How should
one directly derive the expression for E(e^{|Z|})? Here, the mgf of Z from (2.3.14)
is not going to be of much help. Let us apply the Definition 2.2.3 and proceed
as follows:
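One way to carry out this computation, writing Φ(·) for the standard normal df, is the following sketch:
\[
E(e^{|Z|}) = \int_{-\infty}^{\infty} e^{|z|}\,\tfrac{1}{\sqrt{2\pi}}\,e^{-z^{2}/2}\,dz
= 2\int_{0}^{\infty} \tfrac{1}{\sqrt{2\pi}}\,e^{z - z^{2}/2}\,dz
= 2e^{1/2}\int_{0}^{\infty} \tfrac{1}{\sqrt{2\pi}}\,e^{-(z-1)^{2}/2}\,dz
= 2e^{1/2}\,\Phi(1),
\]
since z - z^{2}/2 = 1/2 - (z-1)^{2}/2 and the last integral equals P(Z > -1) = Φ(1).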
Look at the related Exercise 2.3.16 where we ask for the mgf of |Z|. !
2.3.4 The Gamma Distribution
Let us suppose that a random variable X has the Gamma(α, β) distribution
with its pdf given by (1.7.20), where 0 < x < ∞ and (α, β) ∈ ℜ+ × ℜ+.
Now, let us denote β* = β(1 - βt)^{-1} for all t < β^{-1} so that β* is positive.
Thus, we can express M_X(t) as
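A sketch of this evaluation, assuming the Gamma(α, β) pdf in (1.7.20) has the form f(x; α, β) = {β^α Γ(α)}^{-1} x^{α-1} e^{-x/β}, 0 < x < ∞:
\[
M_X(t) = \int_{0}^{\infty} e^{tx}\,\frac{1}{\beta^{\alpha}\Gamma(\alpha)}\,x^{\alpha-1}e^{-x/\beta}\,dx
= \frac{(\beta^{*})^{\alpha}}{\beta^{\alpha}}\int_{0}^{\infty} h(u)\,du,
\quad \text{where } h(u) = \frac{1}{(\beta^{*})^{\alpha}\Gamma(\alpha)}\,u^{\alpha-1}e^{-u/\beta^{*}},
\]
since tx - x/β = -x(1 - βt)/β = -x/β* whenever t < β^{-1}.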
Observe that the function h(u) used in the last step in (2.3.22) resembles the
pdf of a random variable having the Gamma(α, β*) distribution for u ∈ ℜ+
where β* is positive. So, we must have ∫_0^∞ h(u) du = 1. In other words,
(2.3.22) leads to the following conclusion: M_X(t) = (β*/β)^α = (1 - βt)^{-α} for t < β^{-1}.
Now, log(M_X(t)) = -α log(1 - βt) so that one can immediately have dM_X(t)/dt
= αβ(1 - βt)^{-1} M_X(t). Hence, dM_X(t)/dt, when evaluated at t = 0, re-
duces to αβ because M_X(0) = 1. Next, we use the product and chain rules of
differentiation in order to write d²M_X(t)/dt² = α(1 + α)β²(1 - βt)^{-2} M_X(t).
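In brief, since the derivatives of the mgf at t = 0 give the moments of X, evaluating the two expressions above at t = 0 yields
\[
E(X) = \frac{d}{dt}M_X(t)\Big|_{t=0} = \alpha\beta, \qquad
E(X^{2}) = \frac{d^{2}}{dt^{2}}M_X(t)\Big|_{t=0} = \alpha(1+\alpha)\beta^{2},
\]
so that V(X) = α(1 + α)β² - (αβ)² = αβ².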