The Physical Robot

The motor outputs include vocalizations, facial expressions, and motor acts that adjust the gaze direction of the eyes and the orientation of the head. Note that these motor systems serve to steer the visual and auditory sensors toward the source of a stimulus and can also be used to display communicative cues. The choice of these input and output modalities is geared toward enabling the system to participate in social interactions with a human, as opposed to traditional robot tasks such as manipulating physical objects or navigating through a cluttered space. Kismet's configuration is most clearly illustrated by the introductory "What is Kismet?" section of the included CD-ROM. A schematic of the computational hardware is shown in figure 5.2.
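
To make the dual role of these actuators concrete, the following sketch illustrates how a single head/eye motor interface can both orient the cameras toward a stimulus and produce a communicative gesture such as a nod. This is not Kismet's actual code; all names and the 70/30 eye/neck split are hypothetical.

# Minimal sketch (illustrative only, not Kismet's implementation) of one
# head/eye motor interface serving both roles: steering the sensors toward
# a stimulus and producing a communicative display.

from dataclasses import dataclass
from typing import List

@dataclass
class GazeTarget:
    pan: float   # radians; positive is to the robot's left
    tilt: float  # radians; positive is up

@dataclass
class HeadEyeCommand:
    eye_pan: float
    eye_tilt: float
    neck_pan: float
    neck_tilt: float

def orient_to_stimulus(target: GazeTarget, eye_fraction: float = 0.7) -> HeadEyeCommand:
    """Split a desired gaze shift between the fast eyes and the slower neck.
    The 70/30 split is an assumed value for illustration."""
    return HeadEyeCommand(
        eye_pan=eye_fraction * target.pan,
        eye_tilt=eye_fraction * target.tilt,
        neck_pan=(1.0 - eye_fraction) * target.pan,
        neck_tilt=(1.0 - eye_fraction) * target.tilt,
    )

def communicative_nod() -> List[HeadEyeCommand]:
    """The same actuators double as a display channel, here a small nod."""
    return [HeadEyeCommand(0.0, 0.0, 0.0, tilt) for tilt in (0.15, -0.15, 0.0)]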


[Figure 5.2 appears here: a block diagram of Kismet's computational hardware. The cameras and the eye, neck, and jaw motors connect to the QNX network (motor control, attention system, eye finder, tracker, motion, skin, and color filters, distance to target); dual-port RAM links this network to the L processors (face control, perception and motor systems, drives, emotion, and behavior), which drive the ear, eyebrow, eyelid, and lip motors; CORBA connects these to the NT machine (speech synthesis to the speakers, vocal affect recognition) and to the Linux machine (speech recognition from the microphone).]
Figure 5.2
Kismet's hardware and software control architectures have been designed to meet the challenge of real-time processing of visual signals (approaching 30 Hz) and auditory signals (8 kHz sample rate and frame windows of 10 ms) with minimal latencies (less than 500 ms). The high-level perception system, the motivation system, the behavior system, the motor skills system, and the face motor system execute on four Motorola 68332 microprocessors running L, a multi-threaded Lisp developed in our lab. Vision processing, visual attention, and eye/neck control are performed by nine networked 400 MHz PCs running QNX (a real-time Unix-like operating system). Expressive speech synthesis and vocal affective intent recognition run on a dual 450 MHz PC running Windows NT, and the speech recognition system runs on a 500 MHz PC running Linux.
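
As a quick check of the timing figures quoted in the caption, the sketch below (illustrative only, not from Kismet's implementation) works out the video frame period, the number of audio samples in each 10 ms window at 8 kHz, and how many frames and windows fit inside the 500 ms latency budget.

# Back-of-the-envelope check of the real-time numbers in the caption:
# ~30 Hz video, 8 kHz audio in 10 ms windows, latency budget under 500 ms.

VIDEO_RATE_HZ = 30
AUDIO_SAMPLE_RATE_HZ = 8000
AUDIO_WINDOW_S = 0.010
LATENCY_BUDGET_S = 0.500

video_frame_period_s = 1.0 / VIDEO_RATE_HZ                             # ~33 ms per frame
audio_samples_per_window = round(AUDIO_SAMPLE_RATE_HZ * AUDIO_WINDOW_S)  # 80 samples

frames_in_budget = round(LATENCY_BUDGET_S / video_frame_period_s)      # 15 frames
windows_in_budget = round(LATENCY_BUDGET_S / AUDIO_WINDOW_S)           # 50 windows

print(f"video frame period: {video_frame_period_s * 1000:.1f} ms")
print(f"audio samples per 10 ms window: {audio_samples_per_window}")
print(f"frames / windows within the 500 ms budget: {frames_in_budget} / {windows_in_budget}")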