    The estimator Θ_ME = s(X_1, ..., X_n) is said to be a most efficient (or minimum-variance) unbiased
estimator of the parameter θ if
 1.  It is an unbiased estimator of θ.
 2.  Var(Θ_ME) ≤ Var(Θ) for every unbiased estimator Θ of θ.
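
    As an illustration of this definition (not part of the original text), consider a random sample
X_1, ..., X_n from a population with mean μ and variance σ². Both X_1 alone and the sample mean X̄ are
unbiased estimators of μ, but Var(X_1) = σ² while Var(X̄) = σ²/n, so X̄ is the more efficient of the two.
The Python sketch below checks this numerically; the normal population and the values of μ, σ, and n are
arbitrary choices for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n, trials = 5.0, 2.0, 25, 100000   # illustrative values, not from the text

    samples = rng.normal(mu, sigma, size=(trials, n))
    est_first = samples[:, 0]          # unbiased estimator: the first observation X_1
    est_mean = samples.mean(axis=1)    # unbiased estimator: the sample mean X_bar

    # Both estimators average to mu (unbiased), but the sample mean has far smaller variance.
    print("mean of X_1 estimates  :", est_first.mean(), " variance:", est_first.var())
    print("mean of X_bar estimates:", est_mean.mean(),  " variance:", est_mean.var())
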


          C.  Consistent Estimators:
    The estimator Θ_n of θ based on a random sample of size n is said to be consistent if for any small
 ε > 0,

                              lim_{n→∞} P(|Θ_n − θ| < ε) = 1                              (7.5)

 or equivalently,

                              lim_{n→∞} P(|Θ_n − θ| ≥ ε) = 0
 The following two conditions are sufficient for consistency (Prob. 7.5); a numerical sketch follows
 this list:

 1.  lim_{n→∞} E(Θ_n) = θ
 2.  lim_{n→∞} Var(Θ_n) = 0
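
    The following Python sketch (not part of the original text) illustrates these conditions for the
sample mean X̄_n as an estimator of a population mean μ: E(X̄_n) = μ, Var(X̄_n) = σ²/n → 0, and the
empirical probability P(|X̄_n − μ| < ε) approaches 1 as n grows. The normal population, the values of
μ, σ, and ε, and the sample sizes are assumptions made purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 2.0, 3.0          # assumed true mean and standard deviation
    eps = 0.1                     # the tolerance epsilon in the consistency definition
    trials = 2000                 # number of repeated samples per sample size

    for n in (10, 100, 1000, 10000):
        # Draw `trials` independent samples of size n and form the sample mean of each.
        samples = rng.normal(mu, sigma, size=(trials, n))
        means = samples.mean(axis=1)
        # Empirical estimate of P(|X_bar_n - mu| < eps); it should approach 1 as n grows.
        p_close = np.mean(np.abs(means - mu) < eps)
        # Var(X_bar_n) = sigma^2 / n -> 0, matching sufficient condition 2.
        print(f"n={n:6d}  P(|mean-mu|<eps) ~ {p_close:.3f}   Var(X_bar_n) = {sigma**2/n:.4f}")
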


          7.4  MAXIMUM-LIKELIHOOD ESTIMATION
    Let f(x; θ) = f(x_1, ..., x_n; θ) denote the joint pmf of the r.v.'s X_1, ..., X_n when they are discrete,
 and let it be their joint pdf when they are continuous. Let

                         L(θ) = f(x; θ) = f(x_1, ..., x_n; θ)                              (7.9)

 Now L(θ) represents the likelihood that the values x_1, ..., x_n will be observed when θ is the true value
 of the parameter. Thus L(θ) is often referred to as the likelihood function of the random sample. Let
 θ_ML = s(x_1, ..., x_n) be the maximizing value of L(θ); that is,

                              L(θ_ML) = max_θ L(θ)

 Then the maximum-likelihood estimator of θ is

                              Θ_ML = s(X_1, ..., X_n)

 and θ_ML is the maximum-likelihood estimate of θ.
    Since L(θ) is a product of either pmf's or pdf's, it is always positive (over the range of possible
 values of θ). Thus ln L(θ) can always be defined, and in determining the maximizing value of θ it is
 often useful to use the fact that L(θ) and ln L(θ) attain their maximum at the same value of θ. Hence,
 we may also obtain θ_ML by maximizing ln L(θ).
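
    As a hedged illustration of this procedure (not part of the original text), the Python sketch below
assumes a sample drawn from an exponential distribution with pdf f(x; θ) = θ e^{−θx}, so that
ln L(θ) = n ln θ − θ Σ x_i. It maximizes ln L(θ) numerically and compares the result with the closed-form
maximizer θ_ML = 1/x̄ obtained by setting d ln L(θ)/dθ = 0. The true rate and sample size are arbitrary
illustrative values.

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(2)
    theta_true = 1.5                                      # assumed true rate, for illustration only
    x = rng.exponential(scale=1/theta_true, size=200)     # observed sample x_1, ..., x_n

    def neg_log_likelihood(theta):
        # ln L(theta) = n*ln(theta) - theta*sum(x) for the exponential pdf theta*exp(-theta*x)
        return -(len(x) * np.log(theta) - theta * x.sum())

    # Maximize ln L(theta) by minimizing its negative over theta > 0.
    res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0), method="bounded")

    print("numerical MLE       :", res.x)
    print("closed form 1/x_bar :", 1 / x.mean())          # the two values should agree closely
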


          7.5  BAYES'  ESTIMATION
    Suppose that the unknown parameter θ is considered to be a r.v. having some fixed distribution
 or prior pdf f(θ). Then f(x; θ) is now viewed as a conditional pdf and written as f(x | θ), and we can
 express the joint pdf of the random sample (X_1, ..., X_n) and θ as