
How much variety and how many algorithms are there in Classes 1 and 2? For Class 1, the answer is simple: At this time, algorithm Bug1 is the only representative of Class 1. The future will tell whether this reflects merely a lack of interest in such algorithms within the research community or something else. One can surmise that it is both: The underlying mechanism of this class of algorithms does not promise much richness or many unusual algorithms, and this gives little incentive for active research.
In contrast, lively innovation and variety have characterized the development of Class 2 algorithms. At least a dozen or so algorithms have appeared in the literature since the problem was first formulated and the basic algorithms were reported. Since some of these algorithms make use of types of sensing more elaborate than the basic tactile sensing used in this section, we defer a survey of this area until Section 3.8, after we discuss in the next section the effect of more complex sensing on sensor-based motion planning.


           3.6 VISION AND MOTION PLANNING

In the previous section we developed the framework for designing sensor-based path planning algorithms with proven convergence. We designed some algorithms and studied their properties and performance. For clarity, we limited the sensing that the robot possesses to the simplest kind, tactile sensing. While tactile sensing plays an important role in real-world robotics, in particular in short-range motion planning for object manipulation and for escaping from tight places, richer remote sensing such as computer vision or range sensing presents more promising options for general collision avoidance.
The term “range” here refers to devices that directly provide distance information, such as a laser ranger; a stereo vision device would be another option. In order to successfully negotiate a scene with obstacles, a mobile robot can make good use of distance information to the objects it is passing.
Here we are interested in exploring how path planning algorithms would be affected by sensing input that is richer and more complex than tactile sensing. In particular, can algorithms that operate with richer sensory data take advantage of the additional sensor information and deliver better path length performance (to put it simply, shorter paths) than when using tactile sensing? Does proximal or distant sensing really help in motion planning compared to tactile sensing, and, if so, in what way and under what conditions? Although this question is far from trivial and is important for both theory and practice (as is manifested by the continuous flow of recent experimental work with “seeing” robots), there have been few attempts to address it on the algorithmic level.
We are thus interested in algorithms that can make use of a range finder or stereo vision and that, on the one hand, are provably correct and, on the other hand, would let, say, a mobile robot deliver reasonable performance in nontrivial scenes. It turns out that the answers to the above question are not trivial either. First, yes, algorithms can be modified so as to take advantage of better sensing. Second, extensive modifications of “tactile” motion planning algorithms are