
6.4 Structural Matching


An improvement of the basic algorithm that achieves flexible structural matching, coping with changes of scale and orientation as well as with contour gaps caused by occluding objects, is described in the work of Gdalyahu and Weinshall (1999).
                             The basic ideas of  string matching (minimization of  a cost function dependent
                          on edit operations) can be generalized and applied to tree or graph matching.
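To make the cost-minimization idea concrete, the sketch below computes a simple edit distance between two primitive strings by dynamic programming; the character strings, the unit costs and the helper name edit_distance are illustrative choices, not taken from the text.

```python
def edit_distance(x, y, subst_cost=1.0, indel_cost=1.0):
    """Minimum total cost of transforming string x into string y
    using substitutions, insertions and deletions (illustrative costs)."""
    n, m = len(x), len(y)
    # D[i][j] = minimum cost of matching the first i primitives of x
    # with the first j primitives of y.
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * indel_cost                  # delete i primitives of x
    for j in range(1, m + 1):
        D[0][j] = j * indel_cost                  # insert j primitives of y
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = 0.0 if x[i - 1] == y[j - 1] else subst_cost
            D[i][j] = min(D[i - 1][j - 1] + match,    # match / substitution
                          D[i - 1][j] + indel_cost,   # deletion
                          D[i][j - 1] + indel_cost)   # insertion
    return D[n][m]

# Character strings stand in here for sequences of shape primitives.
print(edit_distance("abcb", "acbb"))   # 2.0 with unit costs
```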


                          6.4.2  Probabilistic Relaxation Matching

The string matching procedure described in the previous section performs a sequence of edit operations in order to obtain a match of a string x with a string y. In some cases, performing the matching sequentially may not be the best idea, since early decisions may completely impair the whole process. In order to see this, let us again consider matching the prototype shape to shapes (b) and (d) in Figure 6.18. The perfect matching of the first two segments of the prototype with shape (d) results in a better matching score than with shape (b) when the substitution cost is not too high. The situation may worsen when there are many primitives and complex relations among them. In such situations, a parallel approach that iteratively adjusts the matching at several points using local constraints may produce better results.
Relaxation labelling is a class of iterative algorithms that work in parallel, dealing with several constraints at the same time. In what follows we describe the two main methods of performing relaxation, probabilistic and discrete, and also present a neural net approach that implements relaxation.
Let us assume that there are n objects A_i that we want to label with m labels l_k. Suppose also that the label assignments of the objects are interdependent, i.e., the assignments A_i → l_j and A_h → l_k have some measure of compatibility c(i:j; h:k) that varies in the interval [-1, 1], with 1 meaning full compatibility, -1 full incompatibility and 0 a "don't care" situation. Our goal is to obtain a compatible assignment of labels to objects, measured by a probability estimate, P_ij, of assigning l_j to A_i. For this purpose, we have to consider how we can iteratively update our probability estimates in order to obtain a final assignment, starting from an initial situation where the probability of the assignment A_i → l_j is P_ij^(0).
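As a concrete illustration of this setup, the short Python/NumPy sketch below builds the compatibility coefficients and the initial probability estimates as arrays; the array sizes, the random compatibilities and the uniform initialization P_ij^(0) = 1/m are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

n, m = 4, 3                      # n objects A_i, m labels l_j (example sizes)

# c[i, j, h, k] = compatibility c(i:j; h:k) of the assignments
# A_i -> l_j and A_h -> l_k, with values in [-1, 1].
rng = np.random.default_rng(0)
c = rng.uniform(-1.0, 1.0, size=(n, m, n, m))

# Initial probability estimates P_ij^(0); here simply uniform, i.e.
# every label is initially considered equally likely for every object.
P0 = np.full((n, m), 1.0 / m)
```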
The probabilistic relaxation method updates the probabilities of the assignment A_i → l_j by noticing first that, given the assignment A_h → l_k, one would like to increment P_ij with c(i:j; h:k) P_hk, since one wants to increase P_ij proportionally to P_hk if the compatibility is high and, in the same way, decrease it if the compatibility is low. One way to combine, at each iteration r, the increments for all objects A_h and labels l_k is to add up the increments for all possible labels as:

$$q_{ij}^{(r)}(h) = \sum_{k=1}^{m} c(i\!:\!j;\,h\!:\!k)\, P_{hk}^{(r)}$$
and average these increments for all objects A_h ≠ A_i:

$$Q_{ij}^{(r)} = \frac{1}{n-1}\sum_{h \neq i} q_{ij}^{(r)}(h)$$
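Putting the two expressions together, a minimal sketch of one relaxation iteration, using the arrays c and P0 from the previous sketch, might look as follows; the final multiplicative update P_ij(1 + Q_ij) with row renormalization, which folds the averaged increments back into the probabilities, is the standard relaxation labelling rule and is assumed here rather than quoted from the text.

```python
import numpy as np

def relaxation_step(P, c):
    """One probabilistic relaxation iteration (a sketch).

    P : (n, m) array, P[i, j] = current estimate P_ij (each row sums to 1).
    c : (n, m, n, m) array of compatibilities c(i:j; h:k) in [-1, 1].
    """
    n, _ = P.shape
    # q[i, j, h] = sum_k c(i:j; h:k) * P[h, k]: increment contributed by A_h.
    q = np.einsum('ijhk,hk->ijh', c, P)
    # Q[i, j] = average of the increments over all objects A_h != A_i.
    mask = 1.0 - np.eye(n)                        # zero out the h == i terms
    Q = (q * mask[:, None, :]).sum(axis=2) / (n - 1)
    # Fold the averaged increments back in and renormalize each row,
    # so that the probabilities for each object again sum to one.
    P_new = P * (1.0 + Q)
    return P_new / P_new.sum(axis=1, keepdims=True)

# Iterate from the initial estimates P0 of the previous sketch until
# the label probabilities stabilize.
P = P0
for _ in range(20):
    P = relaxation_step(P, c)
print(P.round(3))                # final label probabilities per object
```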