


              Clearly the m.s. error e depends on c, and it is minimum if

              $$\frac{de}{dc} = -2\int_{-\infty}^{\infty} (y - c) f(y)\,dy = 0$$

              or

              $$\int_{-\infty}^{\infty} y f(y)\,dy = c \int_{-\infty}^{\infty} f(y)\,dy = c$$
              Thus, we conclude that the m.s. estimate c of Y is given by

              $$c = E(Y) = \int_{-\infty}^{\infty} y f(y)\,dy \tag{7.56}$$
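              As a quick numerical check of Eq. (7.56), the following minimal sketch (Python; the exponential distribution for Y is an arbitrary choice, not from the text) scans the m.s. error e = E[(Y - c)^2] over a grid of constants and confirms that the minimizer is approximately E(Y).

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=100_000)  # sample Y; any distribution works

# Evaluate the m.s. error e(c) = E[(Y - c)^2] on a grid of candidate constants c.
cs = np.linspace(0.0, 4.0, 401)
errors = [np.mean((y - c) ** 2) for c in cs]

c_best = cs[np.argmin(errors)]
print(c_best, y.mean())  # the minimizing c agrees with the sample mean E(Y), per Eq. (7.56)
```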

         7.17.  Find the m.s. estimator of a r.v. Y by a function g(X) of the r.v. X.
                  By Eq. (7.17), the m.s. error is

              $$e = E\{[Y - g(X)]^2\} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} [y - g(x)]^2 f(x, y)\,dx\,dy \tag{7.57}$$
              Since f(x, y) = f(y | x) f(x), we can write

              $$e = \int_{-\infty}^{\infty}\left\{\int_{-\infty}^{\infty} [y - g(x)]^2 f(y \mid x)\,dy\right\} f(x)\,dx$$
              Since the integrands above are positive, the m.s. error e is minimum if the inner integral,

              $$\int_{-\infty}^{\infty} [y - g(x)]^2 f(y \mid x)\,dy \tag{7.58}$$
              is minimum for every x. Comparing Eq. (7.58) with Eq. (7.55) (Prob. 7.16), we see that they are of the same
              form if c is changed to g(x) and f(y) is changed to f(y | x). Thus, by the result of Prob. 7.16 [Eq. (7.56)], we
              conclude that the m.s. estimate of Y is given by

              $$g(x) = E(Y \mid x) = \int_{-\infty}^{\infty} y f(y \mid x)\,dy \tag{7.59}$$
              Hence, the m.s. estimator of Y is

              $$\hat{Y} = g(X) = E(Y \mid X) \tag{7.60}$$
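              The sketch below illustrates Eq. (7.60) numerically. The model Y = X^2 + N, with zero-mean noise N independent of X, is a hypothetical choice (not from the text) for which E(Y | X) = X^2 in closed form; the conditional-mean estimator then attains a smaller m.s. error than, for instance, the best linear estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=n)
y = x**2 + rng.normal(scale=0.5, size=n)  # assumed model: E(Y | X = x) = x**2

# m.s. error of the conditional-mean estimator g(X) = E(Y | X) = X**2
e_cond = np.mean((y - x**2) ** 2)

# m.s. error of the best linear estimator a*X + b, fitted by least squares
a, b = np.polyfit(x, y, 1)
e_lin = np.mean((y - (a * x + b)) ** 2)

print(e_cond, e_lin)  # e_cond is about 0.25 (the noise variance); e_lin is larger
```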

         7.18.  Find the m.s. error if g(x) = E(Y | x) is the m.s. estimate of Y.
                  As we see from Eq. (3.58), the conditional mean E(Y | x) of Y, given that X = x, is a function of x, and
              by Eq. (4.39),

              $$E(Y) = E[E(Y \mid X)] \tag{7.61}$$
              Similarly, the conditional mean E[g(X, Y) | x] of g(X, Y), given that X = x, is a function of x. It defines,
              therefore, the function E[g(X, Y) | X] of the r.v. X. Then

              $$E[g(X, Y)] = E\{E[g(X, Y) \mid X]\} \tag{7.62}$$
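              As a simulation check of Eqs. (7.61) and (7.62), the sketch below (Python; the discrete X and the choice g(X, Y) = X Y^2 are arbitrary assumptions, not from the text) computes each conditional mean as a group average and confirms that averaging it over X reproduces the unconditional mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
x = rng.integers(0, 3, size=n)  # discrete X, so E(. | X = k) is a simple group mean
y = x + rng.normal(size=n)      # arbitrary joint model

# Eq. (7.61): E(Y) versus E[E(Y | X)] = sum over k of P(X = k) * E(Y | X = k)
lhs = y.mean()
rhs = sum((x == k).mean() * y[x == k].mean() for k in range(3))
print(lhs, rhs)  # agree up to simulation noise

# Eq. (7.62) with the (arbitrary) choice g(X, Y) = X * Y**2
g = x * y**2
print(g.mean(), sum((x == k).mean() * g[x == k].mean() for k in range(3)))
```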
              Note that Eq. (7.62) is the generalization of Eq. (7.61). Next, we note that