To see that this procedure is plausible, we observe that the quantity

$$
L(x_1, x_2, \ldots, x_n; \theta)\, dx_1\, dx_2 \cdots dx_n
$$

is the probability that sample $X_1, X_2, \ldots, X_n$ takes values in the region defined by $(x_1 + dx_1, x_2 + dx_2, \ldots, x_n + dx_n)$. Given the sample values, this probability gives a measure of likelihood that they are from the population. By choosing a value of $\theta$ that maximizes $L$, or $\ln L$, we in fact say that we prefer the value of $\theta$ that makes as probable as possible the event that the sample values indeed come from the population.
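As a simple illustration of this maximization (the exponential population is an example chosen here for concreteness, not taken from the text), let $f(x; \theta) = \theta e^{-\theta x}$, $x \geq 0$. Then

$$
\ln L = \sum_{i=1}^{n} \ln f(x_i; \theta) = n \ln \theta - \theta \sum_{i=1}^{n} x_i,
\qquad
\frac{\partial \ln L}{\partial \theta} = \frac{n}{\theta} - \sum_{i=1}^{n} x_i = 0
\;\Rightarrow\;
\hat{\theta} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{x}},
$$

so maximizing $\ln L$ selects the parameter value under which the observed sample is most probable.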
The extension to the case of several parameters is straightforward. In the case of $m$ parameters, the likelihood function becomes

$$
L(x_1, \ldots, x_n; \theta_1, \ldots, \theta_m),
$$

and the MLEs of $\theta_j$, $j = 1, \ldots, m$, are obtained by solving simultaneously the system of likelihood equations

$$
\frac{\partial \ln L}{\partial \hat{\theta}_j} = 0, \qquad j = 1, 2, \ldots, m. \tag{9.100}
$$
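To make the multiparameter case concrete, the sketch below (illustrative only; the normal population, the simulated sample, and all variable names are assumptions, not part of the text) maximizes $\ln L$ numerically for a population with parameters $\theta_1 = \mu$ and $\theta_2 = \sigma$, then checks the result against the closed-form solutions of the two likelihood equations, $\hat{\mu} = \bar{x}$ and $\hat{\sigma}^2 = n^{-1} \sum_i (x_i - \bar{x})^2$.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sample from a normal population (mu = 5, sigma = 2), for illustration only.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=200)

def neg_log_likelihood(params, data):
    """Negative of ln L(x_1, ..., x_n; mu, sigma) for a normal population."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf  # keep the optimizer inside the admissible region
    n = data.size
    return (n * np.log(sigma)
            + 0.5 * n * np.log(2.0 * np.pi)
            + np.sum((data - mu) ** 2) / (2.0 * sigma ** 2))

# Maximize ln L by minimizing its negative.
res = minimize(neg_log_likelihood, x0=[0.0, 1.0], args=(x,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x

# The numerical maximizer should agree with the analytic solutions of the
# likelihood equations: mu_hat = sample mean, sigma_hat^2 = n-divisor sample variance.
print(mu_hat, x.mean())
print(sigma_hat ** 2, ((x - x.mean()) ** 2).mean())
```

Note that the MLE of $\sigma^2$ uses the divisor $n$ rather than $n - 1$; the system of likelihood equations does not, in general, produce unbiased estimators.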
A discussion of some of the important properties associated with a maximum likelihood estimator is now in order. Let us represent the solution of the likelihood equation, Equation (9.99), by

$$
\hat{\theta} = h(x_1, x_2, \ldots, x_n). \tag{9.101}
$$

The maximum likelihood estimator $\hat{\Theta}$ for $\theta$ is then

$$
\hat{\Theta} = h(X_1, X_2, \ldots, X_n). \tag{9.102}
$$
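Continuing the exponential illustration introduced above (again not from the original text), Equation (9.101) gives the estimate $\hat{\theta} = 1/\bar{x}$, a number computed from the observed data, whereas Equation (9.102) gives the estimator $\hat{\Theta} = 1/\overline{X} = n/(X_1 + X_2 + \cdots + X_n)$, a random variable whose distribution describes how the estimate varies from sample to sample.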
The universal appeal enjoyed by maximum likelihood estimators stems from
the optimal properties they possess when the sample size becomes large. Under
mild regularity conditions imposed on the pdf or pmf of population X, two
notable properties are given below, without proof.
Property 9.1: consistency and asymptotic efficiency. Let $\hat{\Theta}$ be the maximum likelihood estimator for $\theta$ in pdf $f(x; \theta)$ on the basis of a sample of size $n$. Then, as $n \to \infty$,

$$
E\{\hat{\Theta}\} \to \theta, \tag{9.103}
$$

and

$$
\mathrm{var}\{\hat{\Theta}\} \to \left\{ n\, E\!\left[\left(\frac{\partial \ln f(X; \theta)}{\partial \theta}\right)^{2}\right] \right\}^{-1}. \tag{9.104}
$$
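A small simulation sketch (illustrative, using the exponential example assumed earlier; the true value $\theta = 2$, the sample sizes, and all names are arbitrary choices) shows the sense of Equations (9.103) and (9.104). For the exponential pdf, $\partial \ln f / \partial \theta = 1/\theta - x$, so $E[(\partial \ln f / \partial \theta)^2] = 1/\theta^2$ and the limit in Equation (9.104) is $\theta^2 / n$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0            # assumed true parameter of f(x; theta) = theta * exp(-theta * x)
replications = 20000   # number of independent samples per sample size

for n in (10, 100, 1000):
    # numpy parameterizes the exponential by its scale, which is 1/theta here
    samples = rng.exponential(scale=1.0 / theta, size=(replications, n))
    theta_hat = 1.0 / samples.mean(axis=1)   # MLE from each sample: reciprocal of the sample mean
    cramer_rao = theta ** 2 / n              # {n E[(d ln f / d theta)^2]}^{-1} = theta^2 / n
    print(f"n={n:5d}  mean(theta_hat)={theta_hat.mean():.4f}  "
          f"var(theta_hat)={theta_hat.var():.5f}  bound={cramer_rao:.5f}")
```

As $n$ grows, the sample average of $\hat{\theta}$ approaches $\theta = 2$ and its variance approaches $\theta^2/n$, in line with the consistency and asymptotic efficiency stated in Property 9.1.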