can be used iteratively where the probability at time $t_{n-1}$ becomes the prior and is combined with the current observation ($t_n$).
To see this, consider the following derivation. For $n$ multiple observations, $s_1, s_2, \ldots, s_n$, Bayes' rule becomes:

(11.5)  $P(H \mid s_1, s_2, \ldots, s_n) = \dfrac{P(s_1, s_2, \ldots, s_n \mid H)\,P(H)}{P(s_1, s_2, \ldots, s_n \mid H)\,P(H) + P(s_1, s_2, \ldots, s_n \mid \lnot H)\,P(\lnot H)}$
This introduces the problem of generating $P(s_1, s_2, \ldots, s_n \mid H)$. Ideally, this requires a sonar model of getting occupied and empty values for all grid[i][j] with $n$ combinations of sensor readings. Fortunately, if the reading from $s_1$ can be considered the result of a different experiment than $s_2$ and the others, $P(s_1, s_2, \ldots, s_n \mid H)$ simplifies to $P(s_1 \mid H)P(s_2 \mid H) \ldots P(s_n \mid H)$. Now the program only has to remember all previous $n-1$ readings. Since there is no way of predicting how many times a particular grid element will be sensed, this creates quite a programming problem. The occupancy grid goes from being a two-dimensional array with a single two-field structure at each element to being a two-dimensional array with each element a potentially very long linked list. Plus, whereas Eqn. 11.3 involved 3 multiplications, updating now takes $3(n-1)$ multiplications. The computational overhead begins to be considerable, since an element in a hallway may have over 100 observations.
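To make the bookkeeping concrete, here is a minimal sketch of this naive batch update, assuming a hypothetical sonar_model(s) helper that returns the pair ($P(s \mid H)$, $P(s \mid \lnot H)$) for a single reading; the 0.5 prior and the toy likelihood values are illustrative assumptions, not values fixed by the text. Every new reading forces a recomputation over the cell's entire stored list:

def batch_update(readings, sonar_model, prior=0.5):
    # Compute P(H | s_1, ..., s_n) via Eqn. 11.5 with the
    # independence simplification: the joint likelihoods become
    # products of the per-reading likelihoods.
    p_given_h = 1.0      # P(s_1|H) * ... * P(s_n|H)
    p_given_not_h = 1.0  # P(s_1|~H) * ... * P(s_n|~H)
    for s in readings:
        ps_h, ps_not_h = sonar_model(s)  # hypothetical helper
        p_given_h *= ps_h
        p_given_not_h *= ps_not_h
    numerator = p_given_h * prior
    return numerator / (numerator + p_given_not_h * (1.0 - prior))

def toy_model(s):
    return (0.62, 0.38)  # assumed constant likelihoods for illustration

print(batch_update(["ping1", "ping2"], toy_model))  # approx 0.727

Note that each grid element must retain its full list of readings, and every update walks all of them; this is exactly the overhead the recursive form below avoids.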
Fortunately, by clever use of $P(H \mid s)P(s) = P(s \mid H)P(H)$, a recursive version of Bayes' rule can be derived:

(11.6)  $P(H \mid s_n) = \dfrac{P(s_n \mid H)\,P(H \mid s_{n-1})}{P(s_n \mid H)\,P(H \mid s_{n-1}) + P(s_n \mid \lnot H)\,P(\lnot H \mid s_{n-1})}$
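As a quick numeric check, suppose the sonar model gives $P(s_n \mid H) = 0.62$ and $P(s_n \mid \lnot H) = 0.38$, with a prior of $P(H \mid s_{n-1}) = 0.5$ (illustrative values, not taken from this page). Then

$P(H \mid s_n) = \dfrac{0.62 \times 0.5}{0.62 \times 0.5 + 0.38 \times 0.5} = \dfrac{0.31}{0.50} = 0.62$

and a second identical reading raises the belief to $\dfrac{0.62 \times 0.62}{0.62 \times 0.62 + 0.38 \times 0.38} \approx 0.73$, showing how repeated evidence accumulates without storing any past readings.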
So each time a new observation is made, Eqn. 11.6 can be employed and the result stored at grid[i][j]. The rule is commutative, so it doesn't matter in what order two or more simultaneous readings are processed.
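A sketch of the recursive version follows, again with assumed illustrative likelihood values. Each grid element now stores a single probability, which becomes the prior for the next update, and the closing assertion illustrates the commutativity just described:

def recursive_update(p_prior, ps_h, ps_not_h):
    # One application of Eqn. 11.6 for a single grid element.
    # p_prior  is P(H | s_1, ..., s_{n-1}), the value stored in grid[i][j];
    # ps_h     is P(s_n | H) and ps_not_h is P(s_n | ~H) from the sonar model.
    numerator = ps_h * p_prior
    return numerator / (numerator + ps_not_h * (1.0 - p_prior))

# Two simultaneous readings processed in either order give the
# same posterior (likelihood values here are assumptions).
p = 0.5
a = recursive_update(recursive_update(p, 0.62, 0.38), 0.55, 0.45)
b = recursive_update(recursive_update(p, 0.55, 0.45), 0.62, 0.38)
assert abs(a - b) < 1e-9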
                              11.4   Dempster-Shafer Theory
An alternative theory of evidence is Dempster-Shafer theory, which produces results similar to Bayesian probabilities. It is a much newer theory, originating in the work of A.P. Dempster, a mathematician at Harvard, during the 1960s, with extensions by Glenn Shafer in 1987.[126] Whereas Bayes' rule relies on evidence being represented by probability functions, Dempster-Shafer theory represents evidence as a possibilistic belief function. Possibilistic means that the function represents partial evidence. For example, a reading