and the marginal pdf of the sample is given by

$$f(x_1, \ldots, x_n) = \int_{R_\theta} f(x_1, \ldots, x_n \mid \theta) f(\theta) \, d\theta$$

where $R_\theta$ is the range of the possible values of $\theta$. The other conditional pdf,

$$f(\theta \mid x_1, \ldots, x_n) = \frac{f(x_1, \ldots, x_n \mid \theta) f(\theta)}{f(x_1, \ldots, x_n)}$$

is referred to as the posterior pdf of $\theta$. Thus the prior pdf $f(\theta)$ represents our information about $\theta$ prior to the observation of the outcomes of $X_1, \ldots, X_n$, and the posterior pdf $f(\theta \mid x_1, \ldots, x_n)$ represents our information about $\theta$ after having observed the sample.
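As a concrete illustration (not from the text), the following minimal Python sketch approximates these quantities on a grid of $\theta$ values, assuming a Bernoulli($\theta$) likelihood, a uniform prior, and a small made-up sample; the marginal pdf of the sample appears as the normalizing constant of the posterior.

```python
import numpy as np

# Hypothetical setup (not from the text): n Bernoulli(theta) observations
# with a uniform prior f(theta) = 1 on [0, 1].
x = np.array([1, 0, 1, 1, 0, 1, 1, 1])      # observed sample x1, ..., xn

theta = np.linspace(0.001, 0.999, 1000)     # grid over the range R_theta
prior = np.ones_like(theta)                 # uniform prior f(theta)

# Joint conditional pdf f(x1, ..., xn | theta) = prod_i theta^xi (1 - theta)^(1 - xi)
likelihood = theta ** x.sum() * (1 - theta) ** (len(x) - x.sum())

# Marginal pdf of the sample: integrate likelihood * prior over R_theta
marginal = np.trapz(likelihood * prior, theta)

# Posterior pdf f(theta | x1, ..., xn) by Bayes' rule
posterior = likelihood * prior / marginal
```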
The conditional mean of $\theta$, defined by

$$\theta_B = E(\theta \mid x_1, \ldots, x_n) = \int_{R_\theta} \theta f(\theta \mid x_1, \ldots, x_n) \, d\theta$$

is called the Bayes' estimate of $\theta$, and

$$\Theta_B = E(\theta \mid X_1, \ldots, X_n)$$

is called the Bayes' estimator of $\theta$.
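As a quick worked example (a standard one, not reproduced from this page): if $X_1, \ldots, X_n$ are independent Bernoulli($\theta$) observations and the prior is uniform on $[0, 1]$, then $f(\theta \mid x_1, \ldots, x_n) \propto \theta^{k}(1-\theta)^{n-k}$ with $k = \sum_i x_i$, which is a beta pdf, and its mean gives the Bayes estimate

$$\theta_B = \frac{k + 1}{n + 2}$$

For the sample used in the sketch above ($k = 6$, $n = 8$), this gives $\theta_B = 0.7$.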
7.6 MEAN SQUARE ESTIMATION
In this section, we deal with the second type of estimation problem, that is, estimating the value of an inaccessible r.v. Y in terms of the observation of an accessible r.v. X. In general, the estimator $\hat{Y}$ of Y is given by a function of X, g(X). Then $Y - \hat{Y} = Y - g(X)$ is called the estimation error, and there is a cost associated with this error, C[Y - g(X)]. We are interested in finding the function g(X) that minimizes this cost. When X and Y are continuous r.v.'s, the mean square (m.s.) error is often used as the cost function,

$$e = E\{[Y - g(X)]^2\}$$
It can be shown that the estimator of Y given by (Prob. 7.17)

$$\hat{Y} = g(X) = E(Y \mid X)$$

is the best estimator in the sense that the m.s. error defined above is a minimum.
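To see this claim numerically, here is a small simulation sketch of my own (assuming the model $Y = X^2 + W$ with W zero-mean noise independent of X, so that $E(Y \mid X) = X^2$); it compares the Monte Carlo m.s. error of the conditional mean against an arbitrary alternative estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model (not from the text): Y = X**2 + W with W independent
# zero-mean noise, so the conditional mean is E(Y | X) = X**2.
n = 200_000
x = rng.uniform(-1.0, 1.0, size=n)
w = rng.normal(0.0, 0.1, size=n)
y = x ** 2 + w

# Monte Carlo estimates of the m.s. error e = E{[Y - g(X)]**2}
mse_cond_mean = np.mean((y - x ** 2) ** 2)   # g(X) = E(Y | X)
mse_other     = np.mean((y - x) ** 2)        # some other choice, g(X) = X

print(mse_cond_mean, mse_other)              # the conditional mean gives the smaller m.s. error
```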
7.7 LINEAR MEAN SQUARE ESTIMATION
Now consider the estimator $\hat{Y}$ of Y given by

$$\hat{Y} = g(X) = aX + b$$

We would like to find the values of a and b such that the m.s. error defined by

$$e = E[(Y - \hat{Y})^2] = E\{[Y - (aX + b)]^2\}$$

is minimum. We maintain that a and b must be such that (Prob. 7.20)

$$E\{[Y - (aX + b)]X\} = 0$$
and a and b are given by

$$a = \frac{\sigma_{XY}}{\sigma_X^2} = \rho \frac{\sigma_Y}{\sigma_X} \qquad b = \mu_Y - a\mu_X$$

where $\sigma_{XY} = \mathrm{Cov}(X, Y)$, $\mu_X = E(X)$, $\mu_Y = E(Y)$, and $\rho$ is the correlation coefficient of X and Y.
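As a numerical check (again my own illustration, assuming a simple linear data model), the sketch below forms a and b from sample moments and verifies the orthogonality condition stated above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data (not from the text): X and Y correlated r.v.'s.
n = 200_000
x = rng.normal(2.0, 1.5, size=n)
y = 3.0 * x + rng.normal(0.0, 2.0, size=n)

# Linear m.s. estimate Y_hat = a*X + b with a = Cov(X, Y)/Var(X), b = mu_Y - a*mu_X
a = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
b = y.mean() - a * x.mean()

err = y - (a * x + b)
print(a, b)                # close to 3.0 and 0.0 for this model
print(np.mean(err * x))    # orthogonality: E{[Y - (aX + b)]X} is approximately 0
```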