66 Computational Statistics Handbook with MATLAB
These are the sample moments about the sample mean, and it can be verified
that these solutions jointly maximize the likelihood function [Lindgren,
1993].
We know that $E[\overline{X}] = \mu$ [Mood, Graybill and Boes, 1974], so the sample
mean is an unbiased estimator for the population mean. However, that is
not the case for the maximum likelihood estimate for the variance. It can be
shown [Hogg and Craig, 1978] that
$$E[\hat{\sigma}^{2}] = \frac{(n-1)\sigma^{2}}{n} \, ,$$
so we know (from Equation 3.14) that the maximum likelihood estimate, $\hat{\sigma}^{2}$,
for the variance is biased. If we want to obtain an unbiased estimator for the
variance, we simply multiply our maximum likelihood estimator by
$n/(n-1)$. This yields the familiar statistic for the sample variance given by
$$s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_{i} - \overline{x}\right)^{2} .$$
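The bias factor $(n-1)/n$ can be checked numerically. The following sketch (written in Python for illustration; the book's own examples use MATLAB) averages the maximum likelihood variance estimate over many small samples and compares it with the true variance:

```python
import random

# Illustrates the bias of the maximum likelihood variance estimate
# relative to the unbiased sample variance s^2 (a Python sketch;
# the book works in MATLAB).

def mle_variance(x):
    # Maximum likelihood estimate: divides by n.
    n = len(x)
    xbar = sum(x) / n
    return sum((xi - xbar) ** 2 for xi in x) / n

def sample_variance(x):
    # Unbiased s^2: multiplies the MLE by n/(n-1), i.e. divides by n - 1.
    n = len(x)
    xbar = sum(x) / n
    return sum((xi - xbar) ** 2 for xi in x) / (n - 1)

random.seed(0)
n, sigma2 = 5, 4.0        # small n makes the bias factor (n-1)/n = 0.8 visible
trials = 20000
avg_mle = sum(
    mle_variance([random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)])
    for _ in range(trials)
) / trials
print(avg_mle)            # close to (n-1)/n * sigma2 = 3.2, not 4.0
```

With $n = 5$, the average of the maximum likelihood estimates settles near $0.8\,\sigma^2$ rather than $\sigma^2$, matching the expectation derived above.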
Method of Moments
In some cases, it is difficult to find the maximum of the likelihood function.
For example, the gamma distribution has an unknown parameter $t$ that
appears inside the gamma function, $\Gamma(t)$. This makes it hard to take derivatives and
solve the equations for the unknown parameters. The method of moments is
one way to approach this problem.
In general, we write the unknown population parameters in terms of the
population moments. We then replace the population moments with the cor-
responding sample moments. We illustrate these concepts in the next exam-
ple, where we find estimates for the parameters of the gamma distribution.
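This substitution of sample moments for population moments is easy to carry out numerically. The sketch below (in Python rather than the book's MATLAB) applies it to the gamma distribution: since $E[X] = t/\lambda$ and $V(X) = t/\lambda^{2}$, dividing the two moment equations gives $\lambda = E[X]/V(X)$ and $t = E[X]^{2}/V(X)$, and the sample moments are then plugged in:

```python
import random

# Method of moments for the gamma distribution with parameters t and
# lambda, where E[X] = t/lambda and V(X) = t/lambda^2 (a Python sketch;
# the book works in MATLAB).

def gamma_mom(x):
    n = len(x)
    xbar = sum(x) / n                               # first sample moment
    var = sum((xi - xbar) ** 2 for xi in x) / n     # sample moment about the mean
    lam_hat = xbar / var                            # lambda = E[X] / V(X)
    t_hat = xbar ** 2 / var                         # t = E[X]^2 / V(X)
    return t_hat, lam_hat

random.seed(1)
t_true, lam_true = 3.0, 2.0
# Python's gammavariate(alpha, beta) has mean alpha*beta, so beta = 1/lambda.
data = [random.gammavariate(t_true, 1.0 / lam_true) for _ in range(50000)]
t_hat, lam_hat = gamma_mom(data)
print(t_hat, lam_hat)   # should be near 3.0 and 2.0
```

Note that no derivatives of $\Gamma(t)$ are needed; the moment equations are solved algebraically.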
Example 3.4
The gamma distribution has two parameters, $t$ and $\lambda$. Recall that the mean
and variance are given by $t/\lambda$ and $t/\lambda^{2}$, respectively. Writing these in terms
of the population moments, we have
$$E[X] = \frac{t}{\lambda} \, , \qquad (3.29)$$
and
© 2002 by Chapman & Hall/CRC