Page 126 - Fundamentals of Probability and Statistics for Engineers
If random variables X and Y are independent, then we also have

$$\phi_{XY}(t, s) = \phi_X(t)\,\phi_Y(s). \tag{4.86}$$
To show the above, we simply substitute $f_X(x)f_Y(y)$ for $f_{XY}(x, y)$ in Equation (4.83). The double integral on the right-hand side separates, and we have
$$\phi_{XY}(t, s) = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\, dx \int_{-\infty}^{\infty} e^{jsy} f_Y(y)\, dy = \phi_X(t)\,\phi_Y(s),$$

and we have the desired result.
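The factorization in (4.86) is easy to check numerically. The following sketch (not part of the text; the distributions are chosen purely for illustration) estimates both sides of the identity by Monte Carlo averaging for an independent pair of random variables:

```python
# Monte Carlo check that the joint characteristic function of
# independent X and Y factorizes as phi_XY(t, s) = phi_X(t) * phi_Y(s).
import cmath
import random

random.seed(0)
N = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]    # X ~ N(0, 1)
ys = [random.expovariate(1.0) for _ in range(N)]   # Y ~ Exp(1), independent of X

def phi_joint(t, s):
    # Sample-average estimate of E[exp(j(tX + sY))]
    return sum(cmath.exp(1j * (t * x + s * y)) for x, y in zip(xs, ys)) / N

def phi_x(t):
    # Sample-average estimate of E[exp(jtX)]
    return sum(cmath.exp(1j * t * x) for x in xs) / N

def phi_y(s):
    # Sample-average estimate of E[exp(jsY)]
    return sum(cmath.exp(1j * s * y) for y in ys) / N

t, s = 0.7, 1.3
lhs = phi_joint(t, s)
rhs = phi_x(t) * phi_y(s)
print(abs(lhs - rhs))  # small: the two sides agree up to sampling error
```

The two estimates differ only by Monte Carlo error, which shrinks as $1/\sqrt{N}$; for dependent X and Y the factorization fails in general.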
Analogous to the one-random-variable case, the joint characteristic function $\phi_{XY}(t, s)$ is often called on to determine the joint density function $f_{XY}(x, y)$ of X and Y and their joint moments. The density function $f_{XY}(x, y)$ is uniquely determined in terms of $\phi_{XY}(t, s)$ by the two-dimensional Fourier transform
$$f_{XY}(x, y) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-j(tx+sy)}\, \phi_{XY}(t, s)\, dt\, ds, \tag{4.87}$$
and the moments $E\{X^nY^m\} = \alpha_{nm}$, if they exist, are related to $\phi_{XY}(t, s)$ by

$$\left. \frac{\partial^{n+m}}{\partial t^n\, \partial s^m}\, \phi_{XY}(t, s) \right|_{t,s=0} = j^{n+m} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^m f_{XY}(x, y)\, dx\, dy = j^{n+m} \alpha_{nm}. \tag{4.88}$$
The MacLaurin series expansion of $\phi_{XY}(t, s)$ thus takes the form

$$\phi_{XY}(t, s) = \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} \frac{\alpha_{ik}}{i!\,k!}\, (jt)^i (js)^k. \tag{4.89}$$
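Relation (4.88) can be verified directly for a distribution whose characteristic function is known in closed form. The sketch below (not from the text; the two-point joint distribution is chosen only for illustration) approximates the mixed partial derivative at the origin by central finite differences and compares it with $j^{2}\alpha_{11}$:

```python
# Finite-difference check of (4.88): the mixed partial of phi_XY at the
# origin equals j^(n+m) * E[X^n Y^m], here with n = m = 1.
import cmath

# A small discrete joint distribution (illustrative; X and Y dependent):
# P(X=1, Y=2) = 0.5,  P(X=-1, Y=0) = 0.5
points = [((1.0, 2.0), 0.5), ((-1.0, 0.0), 0.5)]

def phi(t, s):
    # Exact joint characteristic function E[exp(j(tX + sY))]
    return sum(p * cmath.exp(1j * (t * x + s * y)) for (x, y), p in points)

# alpha_11 = E[X*Y], computed directly from the distribution
alpha_11 = sum(p * x * y for (x, y), p in points)

# Central finite difference for d^2 phi / (dt ds) at (0, 0)
h = 1e-4
mixed = (phi(h, h) - phi(h, -h) - phi(-h, h) + phi(-h, -h)) / (4 * h * h)

# (4.88) predicts mixed = j^2 * alpha_11 = -alpha_11
print(mixed, (1j ** 2) * alpha_11)
```

The finite-difference value agrees with $j^2\alpha_{11} = -\alpha_{11}$ up to $O(h^2)$ truncation error, as (4.88) predicts.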
The above development can be generalized to the case of more than two
random variables in an obvious manner.
Example 4.18. Let us consider again the Brownian motion problem discussed in Example 4.17, and form two random variables $X'$ and $Y'$ as

$$\left. \begin{aligned} X' &= X_1 + X_2 + \cdots + X_{2n}, \\ Y' &= X_{n+1} + X_{n+2} + \cdots + X_{3n}. \end{aligned} \right\} \tag{4.90}$$