5.3 Exponential Family Models
Proof. The proof is given for the case in which the distribution is absolutely continuous.
Let M denote the moment-generating function of T(Y). Then
\[
\begin{aligned}
M(t) &= E[\exp\{t^T T(Y)\}] \\
     &= \int_{\mathcal{Y}} \exp\{t^T T(y)\}\exp\{\eta^T T(y) - k(\eta)\}\,h(y)\,dy \\
     &= \int_{\mathcal{Y}} \exp\{(t + \eta)^T T(y) - k(\eta)\}\,h(y)\,dy.
\end{aligned}
\]
For sufficiently small t, t + η ∈ H. Then, by definition of the function k,
\[
M(t) = \exp\{k(t + \eta) - k(\eta)\},
\]
proving the result.
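Since the cumulant-generating function is the logarithm of the moment-generating function, the result just proved can be restated as follows: the cumulant-generating function of T(Y) is
\[
\log M(t) = k(t + \eta) - k(\eta), \qquad t + \eta \in H,
\]
which is the form used in the examples below.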
Example 5.12 (Poisson distributions). Consider the family of Poisson distributions
described in Example 5.10. Recall that the model function is given by
\[
p(x; \lambda) = \lambda^x \exp(-\lambda)/x!, \qquad x = 0, 1, \ldots,
\]
which can be written
\[
\exp\{x \log(\lambda) - \lambda\}\,\frac{1}{x!}, \qquad x \in \{0, 1, 2, \ldots\}.
\]
Hence, the natural parameter is η = log(λ), the natural parameter space is H = R, and
the cumulant function is k(η) = exp(η). It follows that the cumulant-generating function
of X is
\[
\exp(t + \eta) - \exp(\eta), \qquad t \in \mathbb{R}.
\]
In terms of the original parameter λ, this can be written
\[
\lambda[\exp(t) - 1], \qquad t \in \mathbb{R}.
\]
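For instance, differentiating this cumulant-generating function at t = 0 recovers the familiar Poisson moments:
\[
\frac{d}{dt}\,\lambda[\exp(t) - 1]\Big|_{t=0} = \lambda, \qquad
\frac{d^2}{dt^2}\,\lambda[\exp(t) - 1]\Big|_{t=0} = \lambda,
\]
so that the first two cumulants of X, its mean and variance, are both equal to λ.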
Example 5.13 (Exponential distributions). Consider the family of exponential distributions described in Example 5.11. Recall that the model function is given by
\[
\frac{1}{\theta}\exp(-y/\theta), \qquad y > 0,
\]
where θ > 0; this may be written
\[
\exp\{\eta y + \log(-\eta)\}, \qquad y > 0,
\]
where −∞ < η < 0, so that the cumulant function is k(η) = −log(−η). It follows that the cumulant-generating function of Y is given by
\[
k(t + \eta) - k(\eta) = \log(-\eta) - \log(-\eta - t), \qquad |t| < -\eta.
\]
In terms of the original parameter θ, this can be written
\[
-\log(1 - \theta t), \qquad |t| < \frac{1}{\theta}.
\]
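As in the Poisson case, differentiating this cumulant-generating function at t = 0 recovers the usual exponential moments:
\[
\frac{d}{dt}\bigl[-\log(1 - \theta t)\bigr]\Big|_{t=0} = \theta, \qquad
\frac{d^2}{dt^2}\bigl[-\log(1 - \theta t)\bigr]\Big|_{t=0} = \theta^2,
\]
so that Y has mean θ and variance θ², consistent with an exponential distribution with mean θ.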