
            hmmtrain:       Given a sequence of observations, estimate the
                            transition probabilities P_t(i|j) and the
                            observation probabilities P_z(z|j).
            hmmviterbi:     Given P_t(i|j), P_z(z|j) and a sequence of
                            observations, calculate the most likely state
                            sequence.

              The function hmmtrain() implements the so-called Baum–Welch
            algorithm.
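
              A minimal sketch of how these two functions can be called is given
            below. The transition and observation matrices are illustrative values
            only; they are not taken from an example in the text.

                % illustrative HMM: two states, three observation symbols
                Ptrans = [0.95 0.05; 0.10 0.90];           % Ptrans(i,j): from state i to state j
                Pobs   = [0.60 0.30 0.10; 0.10 0.30 0.60]; % Pobs(i,k): symbol k in state i

                % generate a sequence of observations (and true states) from this model
                [z,xtrue] = hmmgenerate(1000,Ptrans,Pobs);

                % hmmtrain: estimate both matrices from the observations alone
                % (Baum-Welch), starting from a rough initial guess
                Ptrans0 = [0.80 0.20; 0.20 0.80];
                Pobs0   = [0.50 0.30 0.20; 0.20 0.30 0.50];
                [Ptrans_est,Pobs_est] = hmmtrain(z,Ptrans0,Pobs0);

                % hmmviterbi: most likely state sequence given the estimated model
                xhat = hmmviterbi(z,Ptrans_est,Pobs_est);

            Note that hmmtrain() can recover the matrices only up to a relabelling
            of the states, since the observations alone do not reveal which state
            is which.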


            4.4   MIXED STATES AND THE PARTICLE FILTER


            Sections 4.2 and 4.3 focused on special cases of the general online estimation
            problem. The topic of Section 4.2 was continuous state estimation, and in
            particular the linear-Gaussian case or approximations of that. Section 4.3
            discussed discrete state estimation. We now return to the general scheme of
            Figure 4.2. The current section introduces the family of particle filters (PF),
            a group of estimators that try to implement this general case. These
            estimators use random samples to represent the probability densities, just
            as in the case of Parzen estimation; see Section 5.3.1. As such, particle filters
            are able to handle nonlinear, non-Gaussian systems, continuous states, dis-
            crete states and even combinations. In the sequel we use probability densities
            (which in the discrete case must be replaced by probability functions).



            4.4.1  Importance sampling

            A Monte Carlo simulation uses a set of random samples generated from
            a known distribution to estimate the expectation of any function of that
            distribution. More specifically, let x^{(k)}, k = 1, ..., K be samples drawn
            from a conditional probability density p(x|z). Then, the expectation of
            any function g(x) can be estimated by:

                      E[g(x) | z] \approx \frac{1}{K} \sum_{k=1}^{K} g(x^{(k)})          (4.71)
            Under mild conditions, the right-hand side asymptotically approximates
            the expectation as K increases. For instance, the conditional expectation
            and covariance matrix are found by substitution of g(x) = x and
            g(x) = (x - \hat{x})(x - \hat{x})^T, respectively.
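              As an illustration, the following MATLAB fragment evaluates (4.71) for
            these two choices of g(x), assuming, purely for the sake of the example,
            that p(x|z) is a two-dimensional Gaussian that can be sampled with
            mvnrnd():

                mu = [1; 2];                        % (assumed) conditional mean
                C  = [2.0 0.5; 0.5 1.0];            % (assumed) conditional covariance
                K  = 10000;                         % number of samples

                xk = mvnrnd(mu',C,K)';              % K samples x^(k), one per column

                % g(x) = x: Monte Carlo estimate of the conditional mean
                xhat = mean(xk,2);

                % g(x) = (x-xhat)(x-xhat)': Monte Carlo estimate of the covariance matrix
                Chat = (xk - repmat(xhat,1,K))*(xk - repmat(xhat,1,K))'/K;

            For increasing K, xhat and Chat approach mu and C.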
              In the particle filter, the set x^{(k)} depends on the time index i. It
            represents the posterior density p(x(i)|Z(i)). The samples are called the
            particles.