Page 100 - Probability and Statistical Inference

2. Expectations of Functions of Random Variables

Then, we express $E(X^2)$ analogously as

$$E(X^2) = \int_0^\infty x^2 \, \frac{1}{\Gamma(\alpha)\beta^\alpha} x^{\alpha-1} e^{-x/\beta} \, dx = \frac{1}{\Gamma(\alpha)\beta^\alpha} \int_0^\infty x^{(\alpha+2)-1} e^{-x/\beta} \, dx.$$

In other words, $E(X^2)$ simplifies to

$$E(X^2) = \frac{\Gamma(\alpha+2)\beta^{\alpha+2}}{\Gamma(\alpha)\beta^\alpha} = (\alpha+1)\alpha\beta^2,$$

since $\Gamma(\alpha+2) = (\alpha+1)\alpha\Gamma(\alpha)$. Hence, $V(X) = E(X^2) - E^2(X) = (\alpha+1)\alpha\beta^2 - \alpha^2\beta^2 = \alpha\beta^2$. In summary, for the random variable X distributed as Gamma(α, β), we have

$$E(X) = \alpha\beta \quad \text{and} \quad V(X) = \alpha\beta^2.$$
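As a quick numerical sanity check (a sketch not from the text, using NumPy's Gamma sampler with α as the shape and β as the scale), one can simulate Gamma(α, β) draws and compare the sample mean and variance against the theoretical values αβ and αβ²:

```python
import numpy as np

# Check E(X) = alpha*beta and V(X) = alpha*beta^2 by simulation,
# where X ~ Gamma(alpha, beta): alpha is the shape, beta the scale.
alpha, beta = 3.0, 2.0
rng = np.random.default_rng(0)
sample = rng.gamma(shape=alpha, scale=beta, size=1_000_000)

mean_est = sample.mean()   # theory: alpha * beta   = 6
var_est = sample.var()     # theory: alpha * beta^2 = 12
```

With a million draws, both estimates should agree with the closed-form moments to within ordinary Monte Carlo error.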
2.3 The Moments and Moment Generating Function

Start with a random variable X and consider a function of the random variable,
g(X). Suppose that f(x) is the pmf or pdf of X where x ∈ χ is the support of the
distribution of X. Now, recall Definition 2.2.3 for the expected value of the
random variable g(X), denoted by E[g(X)].
We continue to write µ = E(X). When we specialize g(x) = x − µ, we obviously get E[g(X)] = 0. Next, if we let $g(x) = (x - \mu)^2$, we get $E[g(X)] = \sigma^2$. Now these notions are further extended by considering two other special choices of functions, namely, $g(x) = x^r$ or $(x - \mu)^r$ for fixed r = 1, 2, ... .
Definition 2.3.1 The $r^{th}$ moment of a random variable X, denoted by $\eta_r$, is given by $\eta_r = E[X^r]$, for fixed r = 1, 2, ... . The first moment $\eta_1$ is the mean or the expected value µ of the random variable X.
Definition 2.3.2 The $r^{th}$ central moment of a random variable X around its mean µ, denoted by $\mu_r$, is given by $\mu_r = E[(X - \mu)^r]$ with fixed r = 1, 2, ... .
Recall that the first central moment $\mu_1$ is zero and the second central moment $\mu_2$ turns out to be the variance $\sigma^2$ of X, assuming that µ and $\sigma^2$ are finite.
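To make the two definitions concrete, here is a small Monte Carlo sketch (an illustration not from the text; the helper names `raw_moment` and `central_moment` are my own) that estimates $\eta_r$ and $\mu_r$ for an exponential random variable, i.e. Gamma(1, β), and confirms that $\mu_1 \approx 0$ and $\mu_2 \approx \sigma^2$:

```python
import numpy as np

# eta_r = E[X^r]: the r-th moment of X.
def raw_moment(sample, r):
    return np.mean(sample ** r)

# mu_r = E[(X - mu)^r]: the r-th central moment of X around its mean.
def central_moment(sample, r):
    return np.mean((sample - sample.mean()) ** r)

rng = np.random.default_rng(1)
beta = 2.0
x = rng.exponential(scale=beta, size=1_000_000)  # Gamma(1, beta)

eta1 = raw_moment(x, 1)     # first moment: the mean, beta = 2
mu1 = central_moment(x, 1)  # first central moment: always 0
mu2 = central_moment(x, 2)  # second central moment: variance beta^2 = 4
```

The first central moment is zero by construction, while the second recovers the variance, matching the boxed remark above.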
Example 2.3.1 It is known that the infinite series $\sum_{i=1}^{\infty} i^{-p}$ converges if p > 1, and it diverges if p ≤ 1. Refer back to (1.6.12). With some fixed