Fundamentals of Probability and Statistics for Engineers

Expectations and Moments
the random column vector with components X_1, ..., X_n, and let the means of X_1, ..., X_n be represented by the vector m_X. A convenient representation of their variances and covariances is the covariance matrix, \Lambda, defined by

\Lambda = E\{(X - m_X)(X - m_X)^T\}, \qquad (4.34)

where the superscript T denotes the matrix transpose. The n \times n matrix \Lambda has a structure in which the diagonal elements are the variances and in which the nondiagonal elements are covariances. Specifically, it is given by
\Lambda = \begin{bmatrix}
\mathrm{var}(X_1) & \mathrm{cov}(X_1, X_2) & \cdots & \mathrm{cov}(X_1, X_n) \\
\mathrm{cov}(X_2, X_1) & \mathrm{var}(X_2) & \cdots & \mathrm{cov}(X_2, X_n) \\
\vdots & \vdots & \ddots & \vdots \\
\mathrm{cov}(X_n, X_1) & \mathrm{cov}(X_n, X_2) & \cdots & \mathrm{var}(X_n)
\end{bmatrix}. \qquad (4.35)
In the above, 'var' reads 'variance of' and 'cov' reads 'covariance of'. Since cov(X_i, X_j) = cov(X_j, X_i), the covariance matrix is always symmetrical.
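As a numerical sketch of Equations (4.34) and (4.35), the following Python code (not part of the text; the component definitions are illustrative) estimates the covariance matrix of a three-component random vector from samples and checks that the diagonal holds the variances and that the matrix is symmetrical:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Build three components, two of them correlated, from independent
# standard normals (an arbitrary choice for illustration).
z = rng.standard_normal((n_samples, 3))
X = np.empty_like(z)
X[:, 0] = z[:, 0]
X[:, 1] = 0.5 * z[:, 0] + z[:, 1]   # correlated with X[:, 0]
X[:, 2] = z[:, 2]                   # independent of the others

# Sample analogue of Lambda = E{(X - m_X)(X - m_X)^T}, Eq. (4.34).
m_X = X.mean(axis=0)
centered = X - m_X
Lambda = centered.T @ centered / n_samples

# Structure of Eq. (4.35): variances on the diagonal, covariances off it,
# and Lambda symmetrical since cov(X_i, X_j) = cov(X_j, X_i).
assert np.allclose(Lambda, Lambda.T)
assert np.allclose(np.diag(Lambda), X.var(axis=0))
```

The same estimate is returned by `np.cov(X.T, bias=True)`, which also normalizes by the sample count.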
In closing, let us state (in Theorem 4.2) without proof an important result
which is a direct extension of Equation (4.28).
Theorem 4.2: if X_1, X_2, ..., X_n are mutually independent, then

E\{g_1(X_1) g_2(X_2) \cdots g_n(X_n)\} = E\{g_1(X_1)\} E\{g_2(X_2)\} \cdots E\{g_n(X_n)\}, \qquad (4.36)
where g_j(X_j) is an arbitrary function of X_j. It is assumed, of course, that all indicated expectations exist.
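Theorem 4.2 can be checked by simulation. The sketch below (with arbitrarily chosen distributions and functions g_1, g_2, not taken from the text) compares the two sides of Equation (4.36) for two independent random variables:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two mutually independent random variables.
x1 = rng.standard_normal(n)
x2 = rng.uniform(0.0, 1.0, n)

# Arbitrary functions g_1 and g_2 of Theorem 4.2.
g1 = np.exp
g2 = np.square

# E{g1(X1) g2(X2)} versus E{g1(X1)} E{g2(X2)}, Eq. (4.36).
lhs = np.mean(g1(x1) * g2(x2))
rhs = np.mean(g1(x1)) * np.mean(g2(x2))

# Under independence the two sides agree up to Monte Carlo error.
assert abs(lhs - rhs) < 0.05
```

For dependent variables (e.g. replacing `x2` with a function of `x1`) the two sides would generally differ, which is why mutual independence is essential in the theorem.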
4.4 MOMENTS OF SUMS OF RANDOM VARIABLES
Let X 1 , X 2 ,..., X n be n random variables. Their sum is also a random variable.
In this section, we are interested in the moments of this sum in terms of
those associated with X_j, j = 1, 2, ..., n. These relations find applications
in a large number of derivations to follow and in a variety of physical
situations.
Consider
Y = X_1 + X_2 + \cdots + X_n. \qquad (4.37)
Let m_j and \sigma_j^2 denote the respective mean and variance of X_j. Results 4.1–4.3
are some of the important results concerning the mean and variance of Y .
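Before stating those results, here is a simulation sketch (distributions chosen arbitrarily for illustration) of the two best-known facts about Y in Equation (4.37): the mean of a sum is always the sum of the means, and for mutually independent X_j the variance of the sum is the sum of the variances:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Independent components with known means m_j and variances sigma_j^2.
x1 = rng.normal(1.0, 2.0, n)    # m1 = 1,  sigma1^2 = 4
x2 = rng.normal(-3.0, 1.0, n)   # m2 = -3, sigma2^2 = 1
x3 = rng.uniform(0.0, 6.0, n)   # m3 = 3,  sigma3^2 = 36/12 = 3

# Y = X1 + X2 + X3, as in Eq. (4.37).
y = x1 + x2 + x3

# E{Y} = m1 + m2 + m3 = 1 (holds for any X_j);
# var(Y) = 4 + 1 + 3 = 8 (holds because the X_j are independent).
assert abs(y.mean() - 1.0) < 0.05
assert abs(y.var() - 8.0) < 0.1
```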