Page 280 - Applied statistics and probability for engineers

258   Chapter 7/Point Estimation of Parameters and Sampling Distributions



Example 7-9   Gamma Distribution Moment Estimators   Suppose that $X_1, X_2, \ldots, X_n$ is a random sample from a gamma distribution with parameters $r$ and $\lambda$. For the gamma distribution, $E(X) = r/\lambda$ and $E(X^2) = r(r+1)/\lambda^2$. The moment estimators are found by solving

$$r/\lambda = \bar{X}, \qquad r(r+1)/\lambda^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2$$

The resulting estimators are

$$\hat{r} = \frac{\bar{X}^2}{(1/n)\sum_{i=1}^{n} X_i^2 - \bar{X}^2}, \qquad \hat{\lambda} = \frac{\bar{X}}{(1/n)\sum_{i=1}^{n} X_i^2 - \bar{X}^2}$$
To illustrate, consider the time to failure data introduced following Example 7-7. For these data, $\bar{x} = 21.65$ and $\sum_{i=1}^{8} x_i^2 = 6645.43$, so the moment estimates are

$$\hat{r} = \frac{(21.65)^2}{(1/8)(6645.43) - (21.65)^2} = 1.29, \qquad \hat{\lambda} = \frac{21.65}{(1/8)(6645.43) - (21.65)^2} = 0.0598$$
Interpretation: When $r = 1$, the gamma distribution reduces to the exponential distribution. Because $\hat{r}$ slightly exceeds unity, it is quite possible that either the gamma or the exponential distribution would provide a reasonable model for the data.
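The arithmetic above can be checked with a short script. This is a minimal sketch that uses only the summary statistics quoted in the example ($n = 8$, $\bar{x} = 21.65$, $\sum x_i^2 = 6645.43$):

```python
# Method-of-moments estimates for the gamma parameters,
# using the summary statistics from Example 7-9 (n = 8 failure times).
n = 8
xbar = 21.65          # sample mean
sum_x2 = 6645.43      # sum of squared observations

# Denominator common to both estimators: (1/n) * sum(x_i^2) - xbar^2
m2 = sum_x2 / n - xbar ** 2

r_hat = xbar ** 2 / m2     # moment estimate of r
lam_hat = xbar / m2        # moment estimate of lambda

print(round(r_hat, 2), round(lam_hat, 4))  # 1.29 0.0598
```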
               7-4.2  Method of Maximum Likelihood
                                   One of the best methods of obtaining a point estimator of a parameter is the method of maxi-
                                   mum likelihood. This technique was developed in the 1920s by a famous British statistician,
                                   Sir R. A. Fisher. As the name implies, the estimator will be the value of the parameter that
                                   maximizes the likelihood function.

Maximum Likelihood Estimator   Suppose that $X$ is a random variable with probability distribution $f(x; \theta)$, where $\theta$ is a single unknown parameter. Let $x_1, x_2, \ldots, x_n$ be the observed values in a random sample of size $n$. Then the likelihood function of the sample is

$$L(\theta) = f(x_1; \theta) \cdot f(x_2; \theta) \cdots f(x_n; \theta) \qquad (7\text{-}10)$$

Note that the likelihood function is now a function of only the unknown parameter $\theta$. The maximum likelihood estimator (MLE) of $\theta$ is the value of $\theta$ that maximizes the likelihood function $L(\theta)$.
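To make Equation 7-10 concrete, the sketch below evaluates the likelihood of a small sample under an exponential model, $f(x; \lambda) = \lambda e^{-\lambda x}$. The data values and the candidate values of $\lambda$ are invented for illustration; note that the candidate $1/\bar{x}$ comes out on top, foreshadowing the maximization step:

```python
import math

def likelihood(lam, data):
    # L(lambda) = product of f(x_i; lambda), with f(x; lambda) = lambda * exp(-lambda * x)
    return math.prod(lam * math.exp(-lam * x) for x in data)

data = [3.1, 0.7, 1.8, 2.4, 1.2]   # hypothetical failure times
xbar = sum(data) / len(data)

# Evaluate L at a few candidate values of lambda; 1/xbar is among them.
candidates = [0.2, 0.4, 1 / xbar, 0.8, 1.0]
best = max(candidates, key=lambda lam: likelihood(lam, data))
print(best == 1 / xbar)  # True: the candidate 1/xbar gives the largest likelihood here
```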



In the case of a discrete random variable, the interpretation of the likelihood function is simple. The likelihood function of the sample $L(\theta)$ is just the probability

$$P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)$$

That is, $L(\theta)$ is just the probability of obtaining the sample values $x_1, x_2, \ldots, x_n$. Therefore, in the discrete case, the maximum likelihood estimator is an estimator that maximizes the probability of occurrence of the sample values. Maximum likelihood estimators are generally preferable to moment estimators because they possess good efficiency properties.
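This probability interpretation can be demonstrated directly in the discrete case. The sketch below uses an invented Bernoulli sample and searches a grid of $p$ values for the one that maximizes the probability of the observed sample; as expected, the maximizer is the sample proportion:

```python
# Bernoulli(p): P(X = x) = p^x * (1 - p)^(1 - x) for x in {0, 1}.
data = [1, 0, 1, 1, 0, 1, 0, 1]   # hypothetical 0/1 observations

def likelihood(p, data):
    # L(p) = P(X1 = x1, ..., Xn = xn) for an independent sample
    out = 1.0
    for x in data:
        out *= p if x == 1 else (1 - p)
    return out

# Grid search over p in (0, 1); the maximizer matches the sample proportion 5/8.
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: likelihood(p, data))
print(p_hat)  # 0.625, the sample proportion
```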