4.6 Conditional Moments and Cumulants
substituting the random variable Y for y yields the conditional moments and cumulants of X given Y. In this section, we consider the relationship between the conditional moments and cumulants of X given Y and the unconditional moments and cumulants of X. As we will see, this is one area in which it is easier to work with moments than cumulants.
Suppose that $E(|X|) < \infty$. We have seen (Theorem 2.5) that $E(X) = E[E(X|Y)]$. The same result holds for any moment of X, provided that it exists. Suppose that, for some r, $E(|X|^r) < \infty$; then $E(X^r) = E[E(X^r|Y)]$.
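As a simple illustration of these identities, consider the following mixture, chosen here only as an example: suppose that Y has a standard exponential distribution and that, given Y = y, X has a Poisson distribution with mean y. Then $E(X|Y) = Y$ and $E(X^2|Y) = Y + Y^2$, so that
\[
E(X) = E(Y) = 1 \quad \text{and} \quad E(X^2) = E(Y) + E(Y^2) = 1 + 2 = 3.
\]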
Now consider cumulants; for simplicity, suppose that the cumulant-generating function
of X, K, and the conditional cumulant-generating function of X given Y = y, K(·, y), both
exist. Then, for any integer m = 1, 2,...,
\[
K(t) = \sum_{j=1}^{m} \frac{t^j}{j!}\,\kappa_j + o(t^m) \quad \text{as } t \to 0,
\]
where $\kappa_1, \kappa_2, \ldots$ denote the (unconditional) cumulants of X. Similarly,
\[
K(t, y) = \sum_{j=1}^{m} \frac{t^j}{j!}\,\kappa_j(y) + o(t^m) \quad \text{as } t \to 0,
\]
where $\kappa_1(y), \kappa_2(y), \ldots$ denote the conditional cumulants of X given Y = y. The conditional cumulants of X given Y are then given by $\kappa_1(Y), \kappa_2(Y), \ldots$. Given the indirect way in which cumulants are defined, the relationship between conditional and unconditional cumulants is not as simple as the relationship between conditional and unconditional moments.
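In the Poisson–exponential example introduced above, the conditional cumulant-generating function is
\[
K(t, y) = y\,(e^{t} - 1) = \sum_{j=1}^{\infty} \frac{t^j}{j!}\, y,
\]
so that $\kappa_j(y) = y$ for every j, and the conditional cumulants of X given Y are simply $\kappa_j(Y) = Y$, $j = 1, 2, \ldots$.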
For the low-order cumulants, the simplest approach is to rewrite the cumulants in terms
of moments and use the relationship between conditional and unconditional moments. For
instance, since the first cumulant is simply the mean of the distribution, we have already
seen that
\[
\kappa_1 = E[\kappa_1(Y)].
\]
For the second cumulant, the variance, note that
\[
\begin{aligned}
\kappa_2 = E(X^2) - E(X)^2 &= E[E(X^2|Y)] - E[E(X|Y)]^2 \\
&= E[E(X^2|Y)] - E[E(X|Y)^2] + E[E(X|Y)^2] - E[E(X|Y)]^2 \\
&= E[\mathrm{Var}(X|Y)] + \mathrm{Var}[E(X|Y)].
\end{aligned}
\]
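Both terms in this decomposition are easy to evaluate in the Poisson–exponential example: there $\mathrm{Var}(X|Y) = E(X|Y) = Y$, so
\[
\mathrm{Var}(X) = E[\mathrm{Var}(X|Y)] + \mathrm{Var}[E(X|Y)] = E(Y) + \mathrm{Var}(Y) = 1 + 1 = 2,
\]
in agreement with the direct calculation $E(X^2) - E(X)^2 = 3 - 1 = 2$.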
We now consider a general approach that can be used to relate conditional and unconditional cumulants. The basic idea is that the conditional and unconditional cumulant-generating functions are related by the fact that $K(t) = \log E[\exp\{K(t, Y)\}]$, since $\exp\{K(t)\} = E[\exp(tX)] = E[E(\exp(tX)|Y)] = E[\exp\{K(t, Y)\}]$. As $t \to 0$,
\[
K(t) = \log E\left[\exp\left\{\sum_{j=1}^{m} \frac{t^j}{j!}\,\kappa_j(Y) + o(t^m)\right\}\right]. \tag{4.5}
\]
Note that $\kappa_1(Y), \ldots, \kappa_m(Y)$ are random variables; let $K_m(t_1, \ldots, t_m)$ denote the cumulant-generating function of the random vector $(\kappa_1(Y), \ldots, \kappa_m(Y))$. Then, by (4.5),
\[
K(t) = K_m\!\left(t, \frac{t^2}{2}, \ldots, \frac{t^m}{m!}\right) + o(t^m) \quad \text{as } t \to 0,
\]
since $K_m(t, t^2/2, \ldots, t^m/m!)$ is, by definition, $\log E[\exp\{\sum_{j=1}^{m} (t^j/j!)\,\kappa_j(Y)\}]$.
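As a quick check, take m = 2 and expand $K_2$ about the origin; the following sketch keeps only the terms needed through order $t^2$. Writing $E[\kappa_j(Y)]$ and $\mathrm{Var}[\kappa_1(Y)]$ for the means and variance appearing in that expansion,
\[
K(t) = K_2\!\left(t, \frac{t^2}{2}\right) + o(t^2)
= E[\kappa_1(Y)]\, t + \bigl\{E[\kappa_2(Y)] + \mathrm{Var}[\kappa_1(Y)]\bigr\}\,\frac{t^2}{2} + o(t^2),
\]
and matching coefficients with $K(t) = \kappa_1 t + \kappa_2 t^2/2 + o(t^2)$ again gives $\kappa_1 = E[\kappa_1(Y)]$ and $\kappa_2 = E[\kappa_2(Y)] + \mathrm{Var}[\kappa_1(Y)]$, the expressions for the mean and variance obtained above.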