This completes the proof. The theorem stated above is a special case of the Gauss–Markov theorem.

Another interesting comparison is that between the least-square estimators for $\alpha$ and $\beta$ and their maximum likelihood estimators with an assigned distribution for random variable $Y$. It is left as an exercise to show that the maximum likelihood estimators for $\alpha$ and $\beta$ are identical to their least-square counterparts under the added assumption that $Y$ is normally distributed.
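As an illustrative numerical check of this equivalence (a minimal sketch, not from the text): the script below generates data from $Y = \alpha + \beta x + E$ with $E$ normal, computes the least-square estimates, and recovers essentially the same values by numerically maximizing the normal likelihood. The parameter values, sample size, and the use of scipy's Nelder–Mead optimizer are assumptions made for the demonstration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data from the model Y = alpha + beta * x + E, with E ~ N(0, sigma^2)
alpha_true, beta_true, sigma_true = 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, 50)
y = alpha_true + beta_true * x + rng.normal(0.0, sigma_true, size=x.size)

# Least-square estimators A-hat and B-hat
b_ls = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_ls = y.mean() - b_ls * x.mean()

# Maximum likelihood: minimize the negative normal log-likelihood over (alpha, beta, sigma)
def neg_log_lik(theta):
    a, b, s = theta
    resid = y - a - b * x
    return 0.5 * y.size * np.log(2.0 * np.pi * s**2) + np.sum(resid**2) / (2.0 * s**2)

res = minimize(neg_log_lik, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
a_ml, b_ml, s_ml = res.x

print(a_ls, b_ls)  # least-square estimates
print(a_ml, b_ml)  # maximum likelihood estimates: agree to numerical precision
```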
11.1.3 UNBIASED ESTIMATOR FOR $\sigma^2$
As we have shown, the method of least squares does not lead to an estimator for the variance $\sigma^2$ of $Y$, which is in general also an unknown quantity in linear regression models. In order to propose an estimator for $\sigma^2$, an intuitive choice is

$$\widehat{\Sigma^2} = k \sum_{i=1}^{n} \left( Y_i - \hat{A} - \hat{B} x_i \right)^2, \qquad (11.28)$$

where coefficient $k$ is to be chosen so that $\widehat{\Sigma^2}$ is unbiased. In order to carry out the expectation of $\widehat{\Sigma^2}$, we note that [see Equation (11.7)]
$$Y_i - \hat{A} - \hat{B} x_i = Y_i - (\bar{Y} - \hat{B}\bar{x}) - \hat{B} x_i = (Y_i - \bar{Y}) - \hat{B}(x_i - \bar{x}). \qquad (11.29)$$
Hence, it follows that
$$\sum_{i=1}^{n} \left( Y_i - \hat{A} - \hat{B} x_i \right)^2 = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 - \hat{B}^2 \sum_{i=1}^{n} (x_i - \bar{x})^2, \qquad (11.30)$$
since [see Equation (11.8)]
$$\sum_{i=1}^{n} (x_i - \bar{x})(Y_i - \bar{Y}) = \hat{B} \sum_{i=1}^{n} (x_i - \bar{x})^2. \qquad (11.31)$$
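Identities (11.30) and (11.31) are purely algebraic and hold for any data set. The following minimal sketch (an illustration, not from the text; the sample values are arbitrary assumptions) evaluates both sides of each identity:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, size=20)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, size=20)  # hypothetical sample

# Least-square estimates B-hat and A-hat
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

# Left- and right-hand sides of Equation (11.30)
lhs = np.sum((y - a_hat - b_hat * x) ** 2)
rhs = np.sum((y - y.mean()) ** 2) - b_hat**2 * np.sum((x - x.mean()) ** 2)
print(np.isclose(lhs, rhs))  # True

# Equation (11.31): the cross term equals B-hat times the x sum of squares
print(np.isclose(np.sum((x - x.mean()) * (y - y.mean())),
                 b_hat * np.sum((x - x.mean()) ** 2)))  # True
```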
Upon taking expectations term by term, we can show that
$$E\{\widehat{\Sigma^2}\} = kE\left\{ \sum_{i=1}^{n} (Y_i - \bar{Y})^2 - \hat{B}^2 \sum_{i=1}^{n} (x_i - \bar{x})^2 \right\} = k(n-2)\sigma^2.$$
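Since $E\{\widehat{\Sigma^2}\} = k(n-2)\sigma^2$, unbiasedness requires $k = 1/(n-2)$. A brief Monte Carlo sketch (the design points, parameter values, and replication count are assumptions for illustration) confirms that the resulting estimator averages to $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2 = 15, 4.0
x = np.linspace(1.0, 15.0, n)  # fixed design points (assumed)
estimates = []
for _ in range(20000):
    y = 1.0 + 2.0 * x + rng.normal(0.0, np.sqrt(sigma2), size=n)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    sse = np.sum((y - a - b * x) ** 2)
    estimates.append(sse / (n - 2))  # the choice k = 1/(n - 2)
print(np.mean(estimates))  # close to sigma2 = 4.0
```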