Example 3.7 (Chi-squared distribution). Let $Z$ denote a random variable with a standard normal distribution and consider the distribution of $Z^2$. This distribution has characteristic function
\[
\int_{-\infty}^{\infty} \exp(itz^2)\,\frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{1}{2}z^2\right) dz
  = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp\!\left\{-\frac{1}{2}(1-2it)z^2\right\} dz
  = \frac{1}{(1-2it)^{\frac{1}{2}}}.
\]
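The last equality rests on the Gaussian integral with a complex scale parameter; here is a brief sketch of that step, assuming the usual analytic-continuation argument and the principal branch of the square root (the symbol $\lambda$ is introduced only for this sketch):
\[
\int_{-\infty}^{\infty} \exp\!\left(-\tfrac{1}{2}\lambda z^2\right) dz
  = \left(\frac{2\pi}{\lambda}\right)^{\frac{1}{2}}, \qquad \operatorname{Re}(\lambda) > 0.
\]
Applying this with $\lambda = 1 - 2it$, for which $\operatorname{Re}(\lambda) = 1 > 0$, gives
\[
\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \exp\!\left\{-\tfrac{1}{2}(1-2it)z^2\right\} dz
  = \frac{1}{\sqrt{2\pi}}\left(\frac{2\pi}{1-2it}\right)^{\frac{1}{2}}
  = \frac{1}{(1-2it)^{\frac{1}{2}}}.
\]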
Now consider independent standard normal random variables $Z_1, Z_2, \ldots, Z_n$ and let $X = Z_1^2 + \cdots + Z_n^2$. By Theorem 3.4, the characteristic function of $X$ is
\[
\varphi(t) = \frac{1}{(1-2it)^{\frac{n}{2}}}.
\]
                        Comparing this result to the characteristic function derived in Example 3.4 shows that X
                        has a gamma distribution with parameters α = n/2 and β = 1/2. This special case of the
                        gamma distribution is called the chi-squared distribution with n degrees of freedom; note
                        that this distribution is defined for any positive value of n, not just integer values.
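To spell out the comparison (a sketch; Example 3.4 is not reproduced on this page and is assumed here to use the rate parameterization of the gamma distribution): a gamma distribution with parameters $\alpha$ and $\beta$ has characteristic function
\[
\left(\frac{\beta}{\beta - it}\right)^{\alpha},
\]
and taking $\alpha = n/2$ and $\beta = 1/2$ yields
\[
\left(\frac{1/2}{1/2 - it}\right)^{\frac{n}{2}} = \frac{1}{(1-2it)^{\frac{n}{2}}} = \varphi(t),
\]
so that, by the uniqueness of characteristic functions, $X$ has this gamma distribution.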
Example 3.8 (Sum of binomial random variables). Let $X_1$ and $X_2$ denote independent random variables such that, for $j = 1, 2$, $X_j$ has a binomial distribution with parameters $n_j$ and $\theta_j$. Recall from Example 3.3 that $X_j$ has characteristic function
\[
\varphi_j(t) = [1 - \theta_j + \theta_j \exp(it)]^{n_j}, \qquad j = 1, 2.
\]
Let $X = X_1 + X_2$. Then $X$ has characteristic function
\[
\varphi(t) = [1 - \theta_1 + \theta_1 \exp(it)]^{n_1}\,[1 - \theta_2 + \theta_2 \exp(it)]^{n_2}.
\]
Hence, if $\theta_1 = \theta_2$, then $X$ also has a binomial distribution.
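To fill in the final step (a short verification using only the formula above): if $\theta_1 = \theta_2 = \theta$, then
\[
\varphi(t) = [1 - \theta + \theta \exp(it)]^{n_1}\,[1 - \theta + \theta \exp(it)]^{n_2}
  = [1 - \theta + \theta \exp(it)]^{n_1 + n_2},
\]
which is the characteristic function of a binomial distribution with parameters $n_1 + n_2$ and $\theta$; by the uniqueness of characteristic functions, $X$ has that distribution.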
                        An expansion for characteristic functions
It is well known that the exponential function of a real-valued argument, $\exp(x)$, can be expanded in a power series in $x$:
\[
\exp(x) = \sum_{j=0}^{\infty} \frac{x^j}{j!}.
\]
The same result holds for complex arguments, so that
\[
\exp(itx) = \sum_{j=0}^{\infty} \frac{(itx)^j}{j!};
\]
see Appendix 2 for further discussion. Thus, the characteristic function of a random variable $X$ can be expanded in a power series whose coefficients involve expected values of the form $E(X^m)$, $m = 0, 1, \ldots$.
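For instance, taking expectations term by term in the series above (formally; the conditions under which this is valid are taken up in the results that follow) gives
\[
\varphi_X(t) = E[\exp(itX)] = \sum_{j=0}^{\infty} \frac{(it)^j}{j!}\, E(X^j),
\]
so the coefficient of $t^j$ involves the $j$th moment of $X$.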
                                                                       m
                          This fact can be used to show that the existence of E(X ), m = 1, 2,..., is related to
                                                                                  m
                        the smoothness of the characteristic function at 0; in particular, if E(|X| ) < ∞, then ϕ X
                        is m-times differentiable at 0. The converse to this result is also useful, but it applies only
                        to moments, and derivatives, of even order. Specifically, if ϕ X is 2m-times differentiable at
                        0 then E(X 2m ) < ∞. The details are given in the following theorem.
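As an informal preview of the relationship made precise in that theorem (a standard identity, stated here without its regularity conditions): when $E(|X|^m) < \infty$, differentiating under the expectation gives
\[
\varphi_X^{(m)}(0) = i^m E(X^m),
\]
so derivatives of $\varphi_X$ at 0 recover the moments of $X$.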