the top three regions to both the eye motor control system and the behavior and motivational systems.

The most salient region is the new visual target. The individual feature map scores of the target are passed on to higher-level perceptual stages, where these features are combined to form behaviorally meaningful percepts. Hence, the robot’s subsequent behavior is organized about this locus of attention.
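
The hand-off from the attention system to the rest of the architecture might be sketched as follows. This is a minimal Python sketch based only on the description above; the Region structure, the feature labels, and the function names are illustrative assumptions, not Kismet’s actual data structures or code.

from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Region:
    centroid: Tuple[int, int]          # (x, y) in retinotopic image coordinates
    activation: float                  # combined saliency from the activation map
    feature_scores: Dict[str, float]   # per-feature contributions, e.g. color, motion, skin tone

def top_regions(regions: List[Region], n: int = 3) -> List[Region]:
    # Rank candidate regions by activation and keep the top n.
    return sorted(regions, key=lambda r: r.activation, reverse=True)[:n]

def dispatch(regions: List[Region]) -> Region:
    winners = top_regions(regions)
    target = winners[0]                # most salient region becomes the new visual target
    # The winning regions would be handed to the eye motor control system and to
    # the behavior and motivational systems here; the target's feature scores
    # would be passed on to higher-level perceptual stages to form percepts.
    return target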

Attention Drives Eye Movement

Gaze direction is a powerful social cue that people use to determine what interests others. By directing the robot’s gaze to the visual target, the person interacting with the robot can accurately use the robot’s gaze as an indicator of what the robot is attending to. This greatly facilitates the interpretation and readability of the robot’s behavior, since the robot reacts specifically to the thing that it is looking at.

The eye-motor control system uses the centroid of the most salient region as the target of interest. The eye-motor control process acts on the data from the attention process to center the eyes on an object within the visual field. Using a data-driven mapping between image position and eye position, the retinotopic coordinates of the target’s centroid are used to compute where to look next (Scassellati, 1998). Each time the neck moves, the eye/neck motor process sends two signals. The first signal inhibits the motion detection system for approximately 600 ms, which prevents self-motion from appearing in the motion feature map. The second signal resets the habituation state, described in the next section. A detailed discussion of how the motor component of the attention system is integrated into the rest of Kismet’s visual behavior (such as smooth pursuit, looming, etc.) appears in chapter 12. Kismet’s visual behavior can be seen in the sixth CD-ROM demonstration, titled “Visual Behaviors.”
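
The bookkeeping around eye and neck movement described above might be sketched as follows. This is a rough Python sketch under stated assumptions, not Kismet’s actual implementation: the saccade_map callable stands in for the data-driven mapping from image position to eye position (Scassellati, 1998), and the MotionDetector and Habituation classes are hypothetical placeholders for the corresponding subsystems.

import time

MOTION_INHIBIT_SECONDS = 0.6   # roughly the 600 ms inhibition window described above

class MotionDetector:
    # Stand-in for the motion feature map; only the inhibition window is modeled.
    def __init__(self):
        self.inhibited_until = 0.0
    def inhibit_for(self, seconds: float) -> None:
        self.inhibited_until = time.time() + seconds
    def is_inhibited(self) -> bool:
        return time.time() < self.inhibited_until

class Habituation:
    # Stand-in for the habituation state described in the next section.
    def reset(self) -> None:
        pass

class EyeNeckMotorProcess:
    def __init__(self, saccade_map, motion_detector, habituation):
        self.saccade_map = saccade_map          # maps (x, y) retinotopic coords to (pan, tilt)
        self.motion_detector = motion_detector
        self.habituation = habituation

    def look_at(self, centroid, neck_moved: bool):
        pan, tilt = self.saccade_map(centroid)  # where to look next
        # ...command the eye motors toward (pan, tilt) here...
        if neck_moved:
            # Signal 1: suppress the motion feature map so the robot's own motion
            # does not show up as a salient moving stimulus.
            self.motion_detector.inhibit_for(MOTION_INHIBIT_SECONDS)
            # Signal 2: reset the habituation state.
            self.habituation.reset()
        return pan, tilt

The 600 ms window and the habituation reset are taken directly from the text; everything else in the sketch is scaffolding.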

Habituation Effects

To build a believable creature, the attention system must also implement habituation effects. Infants respond strongly to novel stimuli, but soon habituate and respond less as familiarity increases (Carey & Gelman, 1991). This acts both to keep the infant from being continually fascinated with any single object and to force the caregiver to continually engage the infant with slightly new and interesting interactions. For a robot, a habituation mechanism removes the effects of highly salient background objects that are not currently involved in direct interactions, and it also places requirements on the caregiver to maintain the interaction with different kinds of stimulation.

To implement habituation effects, a habituation filter is applied to the activation map over the location currently being attended to. The habituation filter effectively decays the