7.9 Interleaving Deliberation and Reactive Control
mismatches in planning and reaction times are no longer a compelling reason to enforce a strict separation of deliberation and reaction. However, the software engineering reason for the partition remains: things which operate on symbols and global information should be in the deliberator; things which operate directly on sensors and actuators should be in the reactor.
The above example of Navigation illustrates a top-down interleaving of deliberation and reaction. The deliberative layer(s) decompose the mission into finer and finer steps until arriving at a set of behaviors capable of accomplishing the first subgoal. Another example of interleaving deliberation and reaction is the role of the deliberative layer(s) in supplying the reactor with expectations and virtual sensors. It is also important to note that deliberation can be triggered bottom-up as well.
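As a concrete illustration of the top-down flow, the sketch below shows a deliberator decomposing a mission into subgoals and handing the reactor only the behaviors needed for the current subgoal; control returns to deliberation when a subgoal completes or fails. The class names, behavior names, and control loop are hypothetical, chosen for illustration rather than taken from SFX or any other architecture discussed in this book.

# Minimal sketch of top-down interleaving (illustrative names only).
# The deliberator works on symbols and global state; the reactor runs
# behaviors directly on sensors and actuators.

class Subgoal:
    def __init__(self, name, behaviors):
        self.name, self.behaviors = name, behaviors
        self.satisfied = False

class Deliberator:
    def plan_mission(self, mission):
        # Decompose the mission into finer and finer steps until each
        # step maps onto a set of reactive behaviors.
        return [Subgoal("follow road to tree", ["follow-road", "avoid-obstacle"]),
                Subgoal("turn right at intersection", ["turn", "avoid-obstacle"])]

class Reactor:
    def run(self, subgoal):
        # Behaviors couple sensing directly to acting; no global model here.
        for step in range(3):                    # stand-in for the control loop
            print(f"  running {subgoal.behaviors} (step {step})")
        subgoal.satisfied = True                 # pretend the subgoal succeeded
        return subgoal.satisfied

deliberator, reactor = Deliberator(), Reactor()
for subgoal in deliberator.plan_mission("navigate to waypoint"):
    if not reactor.run(subgoal):
        print(f"{subgoal.name} failed; deliberation is triggered bottom-up")
        break
    print(f"{subgoal.name} completed; the deliberator advances to the next subgoal")

The point of the sketch is the division of labor: the deliberator never touches sensor data, and the behaviors never see the mission or the global world model.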
It is not always the case that the deliberator generates the set of behaviors, turns them on, and lets them execute until they either complete the subtask or fail. In the case of sensing, it might be desirable for the deliberator to make some of the global models being constructed available to the behaviors. For example, consider a robot vehicle following a road. Now suppose that it is looking for the large tree, since it is supposed to turn right at the intersection after the tree. In order to maintain the correct sequence of behaviors, the deliberator has a global world model where trees are noted. The reactive behaviors of following a road and avoiding an obstacle do not need an explicit representation of a tree. But what if the tree casts a shadow which might confuse the follow-path behavior and cause it to give the robot incorrect steering commands? In this case, it would be useful if the information about the presence of a tree could be absorbed by the behavior. One way to do this is to permit methods on the world model to act as virtual sensors or perceptual schemas. Then the follow-road behavior could use the virtual sensor, which tells the behavior when the road boundaries extracted by the vision sensor are probably distorted, to ignore affected regions in the image that would normally be observed by the vision sensor. This is an example of how the deliberative layer can aid with selective attention, or filtering of perception for a context.
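A minimal sketch of this idea, under the assumption that the world model is an object whose methods can be handed to behaviors, is given below; the class names, the shadow_region method, and the coarse left-half/right-half image regions are illustrative, not code from the book. A method on the deliberator's world model acts as a virtual sensor that predicts which part of the image is probably distorted by the tree's shadow, and the follow-road behavior simply skips that region.

# Minimal sketch of a world-model method acting as a virtual sensor
# (all names are illustrative assumptions, not from the book).

class WorldModel:
    def __init__(self, tree_location, sun_azimuth_deg):
        self.tree_location = tree_location        # (x, y) in world coordinates
        self.sun_azimuth_deg = sun_azimuth_deg

    def shadow_region(self, robot_pose):
        # Virtual sensor: predict which part of the camera image the
        # tree's shadow falls into, given the robot's current pose.
        dx = self.tree_location[0] - robot_pose[0]
        if abs(dx) > 10.0:
            return None                           # tree too far away to matter
        return "left-half" if self.sun_azimuth_deg < 180 else "right-half"

class FollowRoadBehavior:
    def __init__(self, virtual_shadow_sensor):
        self.shadow_sensor = virtual_shadow_sensor

    def extract_road_boundaries(self, image, robot_pose):
        ignore = self.shadow_sensor(robot_pose)   # region likely distorted
        regions = [r for r in ("left-half", "right-half") if r != ignore]
        # Only the trusted regions are passed on to boundary extraction.
        return [f"boundary from {r} of {image}" for r in regions]

world = WorldModel(tree_location=(5.0, 2.0), sun_azimuth_deg=120)
follow_road = FollowRoadBehavior(world.shadow_region)
print(follow_road.extract_road_boundaries("camera frame", robot_pose=(0.0, 0.0)))

Passing only the bound method, rather than the whole world model, keeps the behavior free of symbolic, global representations, which is consistent with the partition described at the start of this section.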
But the reactor may wish to trigger deliberation under many circumstances. Most of these cases involve failure or exception handling. Suppose that a sensor breaks and the perceptual schema cannot produce a percept. The schema cannot diagnose itself. Therefore, its failure would cause a deliberative function, such as the Sensing Manager in SFX, to be triggered. If the Sensing Manager cannot find a replacement perceptual schema or equivalent behavior, then it can no longer meet the constraints imposed by the Mission