
           If random variables X and Y are independent, then we also have

                   \phi_{XY}(t, s) = \phi_X(t)\,\phi_Y(s).                                        (4.86)

           To show the above, we simply substitute f_X(x) f_Y(y) for f_{XY}(x, y) in Equation
           (4.83). The double integral on the right-hand side separates, and we have

                   \phi_{XY}(t, s) = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\, dx \int_{-\infty}^{\infty} e^{jsy} f_Y(y)\, dy
                                   = \phi_X(t)\,\phi_Y(s),

           and we have the desired result.
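
           As a quick illustration (not part of the original text), the factorization in
           Equation (4.86) can be checked numerically for a pair of independent standard
           normal random variables; the sample size and evaluation point below are
           arbitrary choices for this sketch.

```python
import numpy as np

# Monte Carlo check of phi_XY(t, s) = phi_X(t) * phi_Y(s) for a pair of
# independent standard normal variables (the distribution is an illustrative choice).
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = rng.standard_normal(200_000)

t, s = 0.7, -1.3
phi_xy = np.mean(np.exp(1j * (t * x + s * y)))  # joint characteristic function estimate
phi_x = np.mean(np.exp(1j * t * x))             # marginal characteristic function of X
phi_y = np.mean(np.exp(1j * s * y))             # marginal characteristic function of Y

print(phi_xy)         # both values approximate exp(-(t**2 + s**2)/2) ...
print(phi_x * phi_y)  # ... and agree with each other within Monte Carlo error
```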
             Analogous to the one-random-variable case, the joint characteristic function
           \phi_{XY}(t, s) is often called on to determine the joint density function f_{XY}(x, y) of X
           and Y and their joint moments. The density function f_{XY}(x, y) is uniquely
           determined in terms of \phi_{XY}(t, s) by the two-dimensional Fourier transform

                   f_{XY}(x, y) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-j(tx + sy)} \phi_{XY}(t, s)\, dt\, ds;                  (4.87)

           and moments E\{X^n Y^m\} = \alpha_{nm}, if they exist, are related to \phi_{XY}(t, s) by

                   \left. \frac{\partial^{n+m} \phi_{XY}(t, s)}{\partial t^n\, \partial s^m} \right|_{t, s = 0}
                       = j^{n+m} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^m f_{XY}(x, y)\, dx\, dy                                               (4.88)
                       = j^{n+m} \alpha_{nm}.
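
           As a hedged illustration of Equation (4.88), the following symbolic sketch
           differentiates an assumed joint characteristic function, that of two zero-mean,
           unit-variance jointly normal variables with correlation 1/2, and recovers its
           joint moments; the choice of distribution is purely for the example.

```python
import sympy as sp

t, s = sp.symbols('t s', real=True)
j = sp.I

# Assumed example: characteristic function of two zero-mean, unit-variance
# jointly normal random variables with correlation rho = 1/2.
rho = sp.Rational(1, 2)
phi = sp.exp(-(t**2 + 2*rho*t*s + s**2) / 2)

def joint_moment(n, m):
    # Equation (4.88): alpha_nm = j**-(n+m) * d^(n+m) phi / (dt^n ds^m) at t = s = 0.
    d = sp.diff(phi, t, n, s, m)
    return sp.simplify(d.subs({t: 0, s: 0}) / j**(n + m))

print(joint_moment(1, 1))  # E{XY} = rho = 1/2
print(joint_moment(2, 2))  # E{X^2 Y^2} = 1 + 2*rho**2 = 3/2
```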

           The MacLaurin series expansion of \phi_{XY}(t, s) thus takes the form

                   \phi_{XY}(t, s) = \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} \frac{\alpha_{ik}}{i!\, k!} (jt)^i (js)^k.                                        (4.89)
           The above development can be generalized to the case of more than two
           random variables in an obvious manner.
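
           For completeness, a short sketch (again using the assumed bivariate-normal
           characteristic function above) compares a truncated version of the series in
           Equation (4.89), with moments supplied by Equation (4.88), against the exact
           characteristic function at one arbitrarily chosen point.

```python
import sympy as sp

t, s = sp.symbols('t s', real=True)
j = sp.I

# Same assumed bivariate-normal characteristic function as above (rho = 1/2).
rho = sp.Rational(1, 2)
phi = sp.exp(-(t**2 + 2*rho*t*s + s**2) / 2)

def alpha(i, k):
    # Joint moment alpha_ik obtained via Equation (4.88).
    return sp.diff(phi, t, i, s, k).subs({t: 0, s: 0}) / j**(i + k)

# Partial sum of Equation (4.89), truncated at i, k <= 4.
N = 4
partial = sum(alpha(i, k) / (sp.factorial(i) * sp.factorial(k)) * (j*t)**i * (j*s)**k
              for i in range(N + 1) for k in range(N + 1))

point = {t: sp.Rational(2, 5), s: sp.Rational(3, 10)}
print(sp.N(partial.subs(point)))  # truncated series at (t, s) = (0.4, 0.3)
print(sp.N(phi.subs(point)))      # exact value; the two agree closely
```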
             Example 4.18. Let us consider again the Brownian motion problem discussed
           in Example 4.17, and form two random variables X' and Y' as

                   X' = X_1 + X_2 + \cdots + X_{2n},
                   Y' = X_{n+1} + X_{n+2} + \cdots + X_{3n}.                                      (4.90)
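
           To make the construction in Equation (4.90) concrete, here is a small simulation
           sketch; the equally likely +1/-1 step distribution assumed below is only an
           illustrative stand-in for the step model of Example 4.17, and the covariance
           estimate simply reflects the n steps X_{n+1}, ..., X_{2n} shared by X' and Y'.

```python
import numpy as np

# Simulation sketch of Equation (4.90). The equally likely +1/-1 steps assumed
# here are an illustrative stand-in for the step model of Example 4.17.
rng = np.random.default_rng(1)
n, trials = 10, 100_000

steps = rng.choice([-1.0, 1.0], size=(trials, 3 * n))  # X_1, ..., X_3n for each trial
Xp = steps[:, :2 * n].sum(axis=1)    # X' = X_1 + ... + X_2n
Yp = steps[:, n:].sum(axis=1)        # Y' = X_{n+1} + ... + X_3n

# X' and Y' share the n steps X_{n+1}, ..., X_2n, so Cov(X', Y') = n * Var(X_1) = n.
print(np.cov(Xp, Yp)[0, 1])          # close to n = 10
```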






