
as in the case of a real-valued random variable, δ is known as the radius of convergence of M_X.
  Many of the properties of a moment-generating function for a real-valued random variable extend to the vector case. Several of these properties are given in the following theorem; the proof is left as an exercise.


Theorem 4.12. Let X and Y denote d-dimensional random vectors with moment-generating
functions M_X and M_Y, and radii of convergence δ_X and δ_Y, respectively.

  (i) Let A denote an m × d matrix of real numbers and let b denote an element of R^m.
      Then M_{AX+b}, the moment-generating function of AX + b, satisfies

          M_{AX+b}(t) = exp{t^T b} M_X(A^T t),   ||t|| < δ_A,

      for some δ_A > 0, possibly depending on A.
 (ii) If X and Y are independent, then M_{X+Y}, the moment-generating function of X + Y,
      exists and is given by

          M_{X+Y}(t) = M_X(t) M_Y(t),   ||t|| < min(δ_X, δ_Y).

(iii) X and Y have the same distribution if and only if there exists a δ > 0 such that

          M_X(t) = M_Y(t)   for all ||t|| < δ.
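
As an illustration of parts (i) and (iii), consider a multivariate normal random vector; the form of its moment-generating function is taken as given here rather than derived in this section. If X is d-dimensional multivariate normal with mean vector μ and covariance matrix Σ, then

          M_X(t) = exp{t^T μ + (1/2) t^T Σ t},   t ∈ R^d,

so that, for an m × d matrix A and b ∈ R^m,

          M_{AX+b}(t) = exp{t^T b} M_X(A^T t) = exp{t^T (Aμ + b) + (1/2) t^T (A Σ A^T) t}.

This is the moment-generating function of a multivariate normal random vector with mean vector Aμ + b and covariance matrix A Σ A^T; hence, by part (iii), AX + b has that distribution.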

  As is the case with the characteristic function, the following result shows that the moment-generating function can be used to establish the independence of two random vectors; the proof is left as an exercise.

Corollary 4.2. Let X denote a random vector taking values in R^d and let X = (X_1, X_2),
where X_1 takes values in R^{d_1} and X_2 takes values in R^{d_2}. Let M denote the
moment-generating function of X with radius of convergence δ, let M_1 denote the
moment-generating function of X_1 with radius of convergence δ_1, and let M_2 denote the
moment-generating function of X_2 with radius of convergence δ_2.
  X_1 and X_2 are independent if and only if there exists a δ_0 > 0 such that, for all
t = (t_1, t_2) with t_1 ∈ R^{d_1}, t_2 ∈ R^{d_2}, and ||t|| < δ_0,

          M(t) = M_1(t_1) M_2(t_2).
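
As a brief illustration, again taking the form of the normal moment-generating function as given, suppose X = (X_1, X_2) is bivariate normal with mean vector (μ_1, μ_2) and diagonal covariance matrix with diagonal elements σ_1^2, σ_2^2. Then

          M(t) = exp{t_1 μ_1 + t_2 μ_2 + (1/2)(σ_1^2 t_1^2 + σ_2^2 t_2^2)}
               = exp{t_1 μ_1 + (1/2) σ_1^2 t_1^2} exp{t_2 μ_2 + (1/2) σ_2^2 t_2^2}
               = M_1(t_1) M_2(t_2)

for all t, so that, by the corollary, X_1 and X_2 are independent. Thus, for a bivariate normal random vector, zero covariance implies independence.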



4.4 Cumulants

Although moments provide a convenient summary of the properties of a random variable,
they are not always easy to work with. For instance, let X denote a real-valued random
variable and let a, b denote constants. Then the relationship between the moments of X and
those of aX + b can be quite complicated. Similarly, if Y is a real-valued random variable
such that X and Y are independent, then the moments of X + Y do not have a simple
relationship to the moments of X and Y.
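
For instance, assuming the moments in question exist, the binomial theorem gives

          E[(aX + b)^n] = Σ_{j=0}^{n} (n choose j) a^j b^{n-j} E(X^j),

so that even the second moment, E[(aX + b)^2] = a^2 E(X^2) + 2ab E(X) + b^2, involves moments of X of several orders; similarly, when X and Y are independent, E[(X + Y)^n] = Σ_{j=0}^{n} (n choose j) E(X^j) E(Y^{n-j}) involves products of moments of X and Y of all orders up to n.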
  Suppose that X and Y have moment-generating functions M_X and M_Y, respectively.
Some insight into the properties described above can be gained by viewing moments of
a random variable as derivatives of its moment-generating function at 0, rather than as
integrals with respect to a distribution function. Since the moment-generating function of