
inappropriate. A friendly robot usually prompted subjects to touch the robot,
mimic its motions and speak out loud to it. With the exception of older boys, a
sad, nervous or afraid robot generally provoked a compassionate response.
Our interactions with users showed a potential need for future (autonomous)
social robots to have a somewhat different sensory suite from that of current
devices. For instance, we found it very helpful in creating a rich interaction to
“sense” the location of users’ bodies, faces, and even individual eyes. We also
found it helpful to read basic facial expressions, such as smiles and frowns. This
argues for a more sophisticated vision system, one focused on dealing with people.
Additionally, it seemed essential to know where the robot was being touched; this
may require the development of a better artificial skin for robots. If possessed by
an autonomous robot, the types of sensing listed above would support many of the
behaviors that users found so compelling when interacting with a teleoperated Sparky.
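To make this concrete, the sketch below illustrates the kind of person-focused
vision we have in mind: locating faces and eyes in a camera frame and reading a
coarse expression such as a smile. It is purely illustrative and was not part of
Sparky, whose perception was supplied by a human teleoperator; it assumes an
off-the-shelf detector (OpenCV’s stock Haar cascades), and the thresholds and the
sense_people helper are our own hypothetical choices.

# Illustrative sketch only: person-focused sensing of the sort described above,
# using OpenCV's bundled Haar-cascade models. Thresholds are assumed values.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def sense_people(frame_bgr):
    """Return detected faces, each with eye positions and a coarse smile flag."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, 1.1, 10)
        # The smile cascade is noisy; a high neighbor count trims false positives.
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
        results.append({
            "face": (x, y, w, h),
            "eyes": [(x + ex, y + ey, ew, eh) for (ex, ey, ew, eh) in eyes],
            "smiling": len(smiles) > 0,
        })
    return results

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # default camera
    ok, frame = cap.read()
    if ok:
        print(sense_people(frame))
    cap.release()

Even this simple pipeline would let an autonomous robot orient toward a person and
notice a smile, though robust expression reading would take more than a stock detector.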
Fortunately, there are some traditional robotic skills that Sparky, if it were
autonomous, might not need. For instance, there was no particular need for advanced
mapping or navigation and no need, at least as a purely social creature, for detailed
planning. A robot that could pay attention to people in its field of view and had
enough navigation to avoid bumping into objects would probably do quite well in this
human sphere. Even if future robots did occasionally bump into things or get lost, it
shouldn’t be a problem: Sparky was often perceived as acting reasonably even when a
serious control malfunction left it behaving erratically. When the goal is to be
perceived as “intelligent”, there are usually many acceptable actions for a given
situation. Though it will be challenging to build these new social capabilities into
mobile robots, humans are perhaps a more forgiving environment than roboticists are
accustomed to.
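As a rough illustration of how little machinery this might take, the sketch below
pairs person-attention with crude obstacle avoidance in a single loop. The robot
interface (detect_people, range_to_obstacle, turn, drive) is entirely hypothetical
and stands in for whatever sensors and motors a future robot would expose; nothing
here reflects Sparky’s actual control software.

# Hypothetical sketch of a "socially sufficient" control loop: attend to the
# nearest person, avoid obstacles, and otherwise idle. No map, no planner.
import time

OBSTACLE_STOP_DISTANCE = 0.5   # meters; an assumed safety margin

def social_loop(robot):
    while True:
        people = robot.detect_people()          # e.g. output of a face detector
        if robot.range_to_obstacle() < OBSTACLE_STOP_DISTANCE:
            robot.turn(90)                      # crude avoidance, no map needed
        elif people:
            nearest = min(people, key=lambda p: p.distance)
            robot.turn(nearest.bearing)         # "pay attention": face the person
            robot.drive(0.1)                    # approach slowly
        else:
            robot.drive(0.0)                    # idle until someone appears
        time.sleep(0.1)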
We close on a speculative, and perhaps whimsical, note. Users interacted with Sparky
using their bodies and, in turn, received feedback in this same, nearly universal,
body language. This left us thinking not only of robots, but also of the general
question of communication in computer interfaces. What if these human-robot
interactions were abstracted and moved into other realms and into other devices? For
instance, the gestures of head motion and gaze direction could map readily to a
device’s success at paying attention to a user. Similarly, Sparky could intuitively
demonstrate a certain energy level using its posture and pace. Could another device
use this technique to show its battery state? Though our research didn’t focus on
these questions, we believe this could be fertile ground for future work.

Notes

∗ Contact author: mark@markscheeff.com.