Page 423 - Sensing, Intelligence, Motion : How Robots and Humans Move in an Unstructured World

398    SENSITIVE SKIN—DESIGNING AN ALL-SENSITIVE ROBOT ARM MANIPULATOR

           alternative—move so slowly that those forces would be contained—is simply
           not realistic.
              Then, why not use vision instead? The short answer is that since an arm
           manipulator operates in a workspace that is comparable in size to the arm itself,
           vision will be less effective for motion planning than proximity sensing that
           covers the whole arm. By and large, humans and animals use whole-body (tactile)
           sensing rather than vision for motion planning at small distances—to sit down
           comfortably in a chair, to delicately avoid an overactive next-chair neighbor on
           an aircraft flight, and so on. See more on this below.
   This discussion suggests that, except for some specific tasks that require
physical contact between the robot and other objects (such as robotic assembly,
where the contact occurs at the robot wrist), tactile sensing is not a good
sensing medium for robot motion planning. Proximity sensing is a better
candidate for the robot sensitive skin.

Ability to Measure Distances. When the robot's proximity sensor detects an
obstacle that has to be dealt with to avoid collision, it is useful to know not
only which points of the robot body are in danger, but also how far from that
spot the obstacle is. In Figure 8.4, if, in addition to learning from sensor P
about a nearby obstacle, the arm also knew the obstacle's distance from it (for
example, that the obstacle is in position O and not O′), its collision-avoiding
maneuver could be much more precise. Similar to a higher sensor resolution, the
ability to measure distances to obstacles can improve the dexterity of robot
motion. In mobile robots this capability is common, with stereo vision and
laser range sensors being popular choices. For robot arms, given the
full-coverage requirement, realizing this ability is much harder.
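To make the benefit concrete, here is a minimal sketch (not from the book; the function, thresholds, and speeds are hypothetical) of how a measured distance, as opposed to a binary "obstacle near" flag, lets the arm modulate its motion instead of assuming the worst case:

```python
# Hypothetical illustration: with only binary detection, the arm must
# slow to a crawl whenever the sensor fires; with a distance reading it
# can scale its speed smoothly as the obstacle approaches.

def avoidance_speed(distance_m, max_speed=0.5, safe_dist=0.20, stop_dist=0.05):
    """Return an allowed arm speed (m/s) given the measured obstacle distance.

    Assumed parameters: full speed beyond safe_dist, full stop at or
    below stop_dist, linear scaling in between.
    """
    if distance_m >= safe_dist:
        return max_speed          # obstacle far enough: move at full speed
    if distance_m <= stop_dist:
        return 0.0                # obstacle too close: stop
    # linear interpolation between the stop and safe distances
    return max_speed * (distance_m - stop_dist) / (safe_dist - stop_dist)
```

A binary sensor collapses this whole range into a single "detected" state, forcing the conservative (slow) response everywhere inside `safe_dist`.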
              For example, at the robot-to-obstacle distances that we are interested in, 5
           to 20 cm, the time-of-flight techniques used in mobile robot sensors are hardly
           practical for infrared sensors: The light’s time of flight is too short to detect


           [Figure 8.4: a two-link arm (links l1, l2) with a skin sensor at
           point P; the detected obstacle may lie at the nearer position O or
           the farther position O′.]
           Figure 8.4  Knowing the distance between the robot and a potential obstacle translates
           into better dexterity of the arm’s motion.
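The time-of-flight argument above is easy to verify numerically: at 5 to 20 cm, an optical pulse's round trip lasts on the order of a nanosecond, which is why direct timing is so hard at skin-sensor ranges. A quick check (illustrative only):

```python
# Back-of-envelope check: round-trip time of light over skin-sensor ranges.

C = 299_792_458.0  # speed of light in vacuum, m/s

def round_trip_ns(distance_m):
    """Round-trip time of an optical pulse to an obstacle, in nanoseconds."""
    return 2.0 * distance_m / C * 1e9

for d in (0.05, 0.20):
    print(f"{d * 100:.0f} cm -> {round_trip_ns(d):.2f} ns")
    # 5 cm -> 0.33 ns, 20 cm -> 1.33 ns
```

Resolving such intervals, let alone sub-centimeter differences within them, demands electronics far beyond what a dense array of cheap skin sensors can carry.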