For Example 2:

$$m(\mathrm{Occupied}) = \frac{\frac{R-r}{R} + \frac{\beta-\alpha}{\beta}}{2} \times Max_{occupied}$$
$$= \frac{\frac{10-6}{10} + \frac{15-5}{15}}{2} \times 0.98 = 0.52$$
$$m(\mathrm{Empty}) = 0.0$$
$$m(\mathrm{dontknow}) = 1.0 - m(\mathrm{Occupied}) = 1 - 0.52 = 0.48$$

Resulting in:

$$Bel = m(\mathrm{Occupied}) = 0.52,\ m(\mathrm{Empty}) = 0.0,\ m(\mathrm{dontknow}) = 0.48$$
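As a quick check of the arithmetic above, the Region I mass assignment can be sketched in a few lines of Python; the function and variable names (region1_masses, MAX_OCCUPIED) are illustrative assumptions, not the book's algorithm:

```python
MAX_OCCUPIED = 0.98  # a sonar reading is never fully trusted as "occupied"

def region1_masses(r, alpha, R, beta):
    """Belief masses for a grid element at range r and angle alpha
    lying in Region I of a sonar with max range R and half-angle beta."""
    m_occupied = ((R - r) / R + (beta - alpha) / beta) / 2.0 * MAX_OCCUPIED
    m_empty = 0.0
    m_dontknow = 1.0 - m_occupied   # leftover belief mass stays uncommitted
    return m_occupied, m_empty, m_dontknow

# Example 2: r = 6, alpha = 5, R = 10, beta = 15
print(region1_masses(6, 5, 10, 15))   # approximately (0.52, 0.0, 0.48)
```

Note that whatever mass is not assigned to Occupied is assigned to dontknow rather than to Empty.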
The belief function produced by the sonar model is now ready to be combined with any other observations. It is the equivalent of $P(H|s)$. Recall that the sonar model generated the probability $P(s|H)$, and Bayes' rule had to be applied with an assumption of $P(H) = P(\neg H)$ to convert it to $P(H|s)$.
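For reference, with the uniform-prior assumption $P(H) = P(\neg H)$, the priors cancel and Bayes' rule reduces to

$$P(H|s) = \frac{P(s|H)\,P(H)}{P(s|H)\,P(H) + P(s|\neg H)\,P(\neg H)} = \frac{P(s|H)}{P(s|H) + P(s|\neg H)}.$$

The Dempster-Shafer mass assignment requires no such conversion step.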
11.4.3 Dempster’s rule of combination
In theory there are many possible ways to combine belief functions. The
most popular is the original rule, called Dempster’s rule of combination or the
orthogonal sum. Dempster's rule treats combining two belief functions, $Bel_1$ and $Bel_2$, as if they represented a physical intersection. Dempster's rule is very similar to Bayes' rule in that it can be applied to any two belief functions as long as they are independent. This means that the rule can be applied to observations about $grid[i][j]$ from overlapping sonars or readings made at two different times.
Dempster’s rule of combination is notationally dense, so a graphical ex-
ample will be given first, followed by the formal mathematical expression.
Consider the case where two observations about $grid[i][j]$ need to be combined. Both observations believe that $grid[i][j]$ is in Region I. Suppose the two belief functions are:

$$Bel_1 = m(\mathrm{Occupied}) = 0.4,\ m(\mathrm{Empty}) = 0.0,\ m(\mathrm{dontknow}) = 0.6$$
$$Bel_2 = m(\mathrm{Occupied}) = 0.6,\ m(\mathrm{Empty}) = 0.0,\ m(\mathrm{dontknow}) = 0.4$$
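Anticipating the formal expression given later, a minimal Python sketch of the orthogonal sum over these three focal elements (Occupied, Empty, and dontknow, the whole frame of discernment) is shown below; the function name and dictionary representation are illustrative assumptions, not the book's notation:

```python
def dempster_combine(bel1, bel2):
    """Orthogonal sum of two mass assignments with focal elements
    'occupied', 'empty', and 'dontknow' (the whole frame).
    'dontknow' intersects everything; occupied vs. empty is a conflict."""
    k = bel1['occupied'] * bel2['empty'] + bel1['empty'] * bel2['occupied']
    occ = (bel1['occupied'] * bel2['occupied']
           + bel1['occupied'] * bel2['dontknow']
           + bel1['dontknow'] * bel2['occupied'])
    emp = (bel1['empty'] * bel2['empty']
           + bel1['empty'] * bel2['dontknow']
           + bel1['dontknow'] * bel2['empty'])
    dk = bel1['dontknow'] * bel2['dontknow']
    norm = 1.0 - k                      # renormalize away conflicting mass
    return {'occupied': occ / norm, 'empty': emp / norm, 'dontknow': dk / norm}

bel1 = {'occupied': 0.4, 'empty': 0.0, 'dontknow': 0.6}
bel2 = {'occupied': 0.6, 'empty': 0.0, 'dontknow': 0.4}
print(dempster_combine(bel1, bel2))
# {'occupied': 0.76, 'empty': 0.0, 'dontknow': 0.24}
```

Since neither belief function assigns any mass to Empty, there is no conflicting mass to normalize away in this sketch; the two agreeing observations reinforce each other, raising the mass on Occupied.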
Fig. 11.6a shows that the two belief functions can be represented as a numberline of length 1.0 corresponding to the one quanta of belief mass. The