Page 88 - Classification Parameter Estimation & State Estimation An Engg Approach Using MATLAB


            Listing 3.3
            MATLAB code for polynomial regression.

            load levelsensor;                      % Load dataset (t,z)
            figure; clf; plot(t,z,'k.'); hold on;  % Plot it
            y = 0:0.2:30; M = [1 2 10]; plotstring = {'k--','k-','k:'};
            for m = 1:3
              p = polyfit(t,z,M(m)-1);             % Fit polynomial of degree M(m)-1
              z_hat = polyval(p,y);                % Calculate plot points
              plot(y,z_hat,plotstring{m});         % and plot them
            end
            axis([0 30 -0.3 0.2]);





            3.4   OVERVIEW OF THE FAMILY OF ESTIMATORS

            The chapter concludes with the overview shown in Figure 3.13. Two
            main approaches have been discussed, the Bayes estimators and the
            fitting techniques. Both approaches are based on the minimization of
            an objective function. The difference is that with Bayes, the objective
            function is defined in the parameter domain, whereas with fitting
            techniques, the objective function is defined in the measurement
            domain. Another difference is that the Bayes approach has a
            probabilistic context, whereas the approach of fitting lacks such a
            context.
              Within the family of Bayes estimators we have discussed two
            estimators derived from two cost functions. The quadratic cost
            function leads to MMSE (minimum variance) estimation. This cost
            function regards small errors as unimportant, while larger errors are
            considered more and more serious. The solution is found as the
            conditional mean, i.e. the expectation of the posterior probability
            density. The estimator is unbiased.
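              As a sketch, the conditional mean can be computed numerically for
            a scalar example. The model below (Gaussian prior, measurement
            z = x + v with additive Gaussian noise) and all variable names are
            illustrative, not taken from the text:

              mu_x = 1; sig_x = 0.5;                 % prior N(mu_x,sig_x^2)
              sig_v = 0.2; z = 1.3;                  % measurement z = x + v
              x = linspace(-2,4,2001);               % evaluation grid
              post = exp(-0.5*((x-mu_x)/sig_x).^2 - 0.5*((z-x)/sig_v).^2);
              post = post/trapz(x,post);             % normalized posterior
              x_mmse = trapz(x,x.*post);             % conditional mean = MMSE estimate

            The grid-based normalization avoids evaluating the evidence in
            closed form, which keeps the sketch applicable to non-Gaussian
            priors as well.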
              If the MMSE estimator is constrained to be linear, the solution can be
            expressed entirely in terms of first and second order moments. If, in
            addition, the sensory system is linear with additive, uncorrelated noise, a
            simple form of the estimator appears that is used in Kalman filtering.
            This form is sometimes referred to as the Kalman form.
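              The Kalman form can be sketched in a few lines for a linear
            sensor with additive, uncorrelated noise. The model x ~ N(mu_x,Cx),
            z = H*x + v with v ~ N(0,Cv), and all names here are assumed for
            illustration:

              mu_x = [0;0]; Cx = [1 0.5; 0.5 2];     % prior moments
              H = [1 0; 0 1; 1 1]; Cv = 0.1*eye(3);  % linear sensor, additive noise
              z = [0.2; -0.1; 0.15];                 % measurement vector
              S = H*Cx*H' + Cv;                      % innovation covariance
              K = Cx*H'/S;                           % gain matrix
              x_hat = mu_x + K*(z - H*mu_x);         % linear MMSE estimate
              Ce = Cx - K*H*Cx;                      % error covariance

            Note that the solution involves only first and second order
            moments, as stated above.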
              The other Bayes estimator is based on the uniform cost function.
            This cost function weights the damage of small and large errors
            equally. It leads to MAP estimation. The solution appears to be
            the mode of the posterior probability density. The estimator is not
            guaranteed to be unbiased.
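              As a sketch, the mode of the posterior can be located by a grid
            search over the log-posterior. The scalar Gaussian model and the
            names below are illustrative only (for a Gaussian posterior the
            mode coincides with the mean, so MAP and MMSE agree in this case):

              mu_x = 1; sig_x = 0.5; sig_v = 0.2; z = 1.3;
              x = linspace(-2,4,2001);               % evaluation grid
              logpost = -0.5*((x-mu_x)/sig_x).^2 - 0.5*((z-x)/sig_v).^2;
              [~,i] = max(logpost);                  % index of the posterior mode
              x_map = x(i);                          % MAP estimate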