Elements of Distribution Theory
Moments and Cumulants
Moments of random vectors
Let X and Y denote real-valued random variables such that E(X^2) < ∞ and E(Y^2) < ∞.
In addition to the individual moments of X and Y, E(X), E(Y), E(X^2), E(Y^2), and so on, we
may also consider the moments and central moments of the random vector (X, Y), which
are called the joint moments and joint central moments, respectively, of (X, Y); the terms
product moments and central product moments are also used.
The joint moment of (X, Y) of order (r, s) is defined to be E(X^r Y^s), given that the
expected value exists. Similarly, if µ_X = E(X) and µ_Y = E(Y), the joint central moment
of order (r, s) is defined to be

E[(X − µ_X)^r (Y − µ_Y)^s].
The most commonly used joint moment or joint central moment is the central moment
of order (1, 1), generally known as the covariance of X and Y. The covariance of X and Y
will be denoted by Cov(X, Y) and is given by
Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)] = E(XY) − µ_X µ_Y.
Note that Cov(X, Y) = Cov(Y, X) and that Cov(X, X) = Var(X). It follows from Theorem 2.1 that if X and Y are independent, then Cov(X, Y) = 0.
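The equality of the two expressions for the covariance can be verified numerically. The sketch below uses a small, hypothetical joint distribution of (X, Y) (the pmf values are illustrative, not from the text) and computes Cov(X, Y) both as the central product moment and as E(XY) − µ_X µ_Y:

```python
# Check Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)] = E(XY) - mu_X * mu_Y
# on a small, hypothetical joint distribution of (X, Y).

# joint pmf: {(x, y): P(X = x, Y = y)}
pmf = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

mu_x = expect(lambda x, y: x)
mu_y = expect(lambda x, y: y)

# central product-moment form
cov_central = expect(lambda x, y: (x - mu_x) * (y - mu_y))
# ordinary product-moment form
cov_product = expect(lambda x, y: x * y) - mu_x * mu_y

print(cov_central, cov_product)  # the two forms agree
```

Here X and Y are dependent Bernoulli variables, so the covariance is nonzero; replacing the pmf by a product of marginals would make it vanish, as Theorem 2.1 implies.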
The covariance arises in a natural way in computing the variance of a sum of random
variables. The result is given in the following theorem, whose proof is left as an exercise.
Theorem 4.2. Let X and Y denote real-valued random variables such that E(X^2) < ∞ and E(Y^2) < ∞. Then
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)
and, for any real-valued constants a, b,
Cov(aX + b, Y) = a Cov(X, Y).
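Both identities in Theorem 4.2 can be checked on a finite distribution, where all expectations reduce to weighted sums. The pmf and the constants a, b below are hypothetical:

```python
# Numerical check of Theorem 4.2 on a hypothetical finite joint distribution.
pmf = {(1, 2): 0.25, (1, 5): 0.25, (3, 2): 0.3, (3, 5): 0.2}

def expect(f):
    """E[f(X, Y)] under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

mu_x = expect(lambda x, y: x)
mu_y = expect(lambda x, y: y)
var_x = expect(lambda x, y: (x - mu_x) ** 2)
var_y = expect(lambda x, y: (y - mu_y) ** 2)
cov_xy = expect(lambda x, y: (x - mu_x) * (y - mu_y))

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
mu_sum = mu_x + mu_y
var_sum = expect(lambda x, y: (x + y - mu_sum) ** 2)
print(var_sum, var_x + var_y + 2 * cov_xy)

# Cov(aX + b, Y) = a Cov(X, Y), for illustrative constants a = 3, b = -1
a, b = 3.0, -1.0
mu_axb = a * mu_x + b
cov_axb_y = expect(lambda x, y: (a * x + b - mu_axb) * (y - mu_y))
print(cov_axb_y, a * cov_xy)
```

The second identity reflects the fact that the shift b cancels inside the central moment, while the scale a factors out of one term of the product.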
The results of Theorem 4.2 are easily extended to the case of several random variables;
again, the proof is left as an exercise.
Corollary 4.1. Let Y, X_1, ..., X_n denote real-valued random variables such that

E(Y^2) < ∞, E(X_j^2) < ∞, j = 1, ..., n.
Then
Cov(Y, ∑_{j=1}^{n} X_j) = ∑_{j=1}^{n} Cov(Y, X_j)
and
Var(∑_{j=1}^{n} X_j) = ∑_{j=1}^{n} Var(X_j) + 2 ∑_{i<j} Cov(X_i, X_j).
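Both identities of Corollary 4.1 can be checked for n = 3 by treating a small, hypothetical data set as a uniform distribution over its rows, so that population means and covariances are exact finite sums:

```python
# Check Corollary 4.1 with n = 3; the hypothetical data set below is treated
# as a uniform distribution over its rows, each row being (Y, X_1, X_2, X_3).
rows = [(1.0, 1.0, 2.0, 0.5),
        (0.0, 2.0, 1.0, 1.5),
        (2.0, 4.0, 3.0, 2.5),
        (1.0, 0.0, 2.0, 3.5)]

def mean(vals):
    return sum(vals) / len(vals)

def cov(u, v):
    """Population covariance under the uniform distribution on the rows."""
    mu_u, mu_v = mean(u), mean(v)
    return sum((a - mu_u) * (b - mu_v) for a, b in zip(u, v)) / len(u)

y, *xs = zip(*rows)                  # y and the three X_j columns
totals = [sum(x) for x in zip(*xs)]  # realizations of X_1 + X_2 + X_3

# Cov(Y, sum_j X_j) = sum_j Cov(Y, X_j)
print(cov(y, totals), sum(cov(y, xj) for xj in xs))

# Var(sum_j X_j) = sum_j Var(X_j) + 2 * sum_{i<j} Cov(X_i, X_j)
n = len(xs)
lhs = cov(totals, totals)
rhs = sum(cov(xj, xj) for xj in xs) + 2 * sum(
    cov(xs[i], xs[j]) for i in range(n) for j in range(i + 1, n))
print(lhs, rhs)
```

Note that the double sum runs over the n(n − 1)/2 unordered pairs i < j, matching the factor of 2 in the statement.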