that allow humans to correctly attribute beliefs, goals, perceptions, feelings, and desires to
the self and to others (Baron-Cohen, 1995; Leslie, 1994). Other sophisticated mechanisms
such as empathy are used to understand the emotional and subjective states of others. These
capabilities allow people to understand, explain, and predict the social behavior of others,
and to respond appropriately.
To emulate human social perception, a robot must be able to identify who the person
is (identification), what the person is doing (recognition), and how the person is doing it
(emotive expression). Such information could be used by the robot to treat the person as an
individual, to understand the person’s surface behavior, and to potentially infer something
about the person’s internal states (e.g., the intent or the emotive state). Currently, there are
vision-based systems capable of identifying faces, measuring head pose and gaze direction,
recognizing gestures, and reading facial expressions. In the auditory domain, speech recog-
nition and speaker identification are well-researched topics, and there is a growing interest
in perceiving emotion in speech. New techniques and sensing technologies continue to be
developed, becoming increasingly transparent to the user and perceiving a broader
repertoire of human communication behavior. Not surprisingly, much of Kismet’s perceptual
system is specialized for perceiving and responding to people.
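
To make these three questions concrete, the sketch below (in Python) shows one minimal way a perceptual result might be packaged so that identity, activity, and emotive expression can be handled together by a robot's social-perception system. The class and field names (PersonPercept, Affect, valence, arousal) are hypothetical, introduced purely for illustration; they are not the data structures actually used in Kismet.

    # Illustrative sketch only: a minimal container for the three kinds of
    # social-perceptual information discussed above. All names are hypothetical.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Affect:
        """How the person is doing it (emotive expression)."""
        valence: float   # displeased (-1.0) ... pleased (+1.0)
        arousal: float   # calm (-1.0) ... excited (+1.0)

    @dataclass
    class PersonPercept:
        person_id: Optional[str]   # who the person is (identification)
        activity: Optional[str]    # what the person is doing (recognition)
        affect: Optional[Affect]   # how the person is doing it (emotive expression)

    def describe(p: PersonPercept) -> str:
        """Summarize the three perceptual channels as one surface description."""
        who = p.person_id or "an unknown person"
        what = p.activity or "doing something unrecognized"
        tone = (f" (valence={p.affect.valence:+.1f}, arousal={p.affect.arousal:+.1f})"
                if p.affect else "")
        return f"{who} is {what}{tone}"

    # Example: a familiar caregiver waving in a cheerful, animated way.
    print(describe(PersonPercept("caregiver", "waving", Affect(valence=0.8, arousal=0.6))))
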
For robots to be human-aware, technologies for sensing and perceiving human behavior
must be complemented with social cognition capabilities for understanding this behavior in
social terms. As mentioned previously, humans employ theory-of-mind and empathy to infer
and to reflect upon the intents, beliefs, desires, and feelings of others. In the field of narrative
psychology, Bruner (1991) argues that stories are the most efficient and natural human way
to communicate about personal and social matters. Schank & Abelson (1977) hypothesize
that stories about one’s own experiences and those of others (in addition to how these stories
are constructed, interpreted, and interrelated) form the basic constituents of human memory,
knowledge, social communication, self-understanding, and the understanding of others. If
robots shared comparable abilities with people to represent, infer, and reason about social
behavior in familiar terms, then the communication and understanding of social behavior
between humans and robots could be facilitated.
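
As a rough illustration of what representing social behavior "in familiar terms" might look like computationally, the sketch below encodes a stereotyped social episode as a script-like structure in the spirit of Schank & Abelson (1977). The structure, names, and the toy greeting example are assumptions made only for illustration, not a representation implemented on Kismet or any other system cited here.

    # Illustrative sketch only: a toy, script-like representation of a routine
    # social episode. All names and the example content are hypothetical.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Script:
        """A stereotyped social episode, roughly in the spirit of Schank & Abelson (1977)."""
        name: str
        roles: List[str]     # who participates
        scenes: List[str]    # the expected sequence of events

    greeting = Script(
        name="greeting",
        roles=["greeter", "greeted"],
        scenes=[
            "greeter approaches greeted",
            "greeter makes eye contact and smiles",
            "greeter says hello",
            "greeted returns the greeting",
        ],
    )

    def explain(observed_event: str, script: Script) -> Optional[dict]:
        """Interpret an observed event by locating it within a familiar script."""
        for i, scene in enumerate(script.scenes):
            if observed_event == scene:
                next_scene = script.scenes[i + 1] if i + 1 < len(script.scenes) else None
                return {"script": script.name, "scene": i, "expect_next": next_scene}
        return None   # the event does not fit this script

    # Recognizing "greeter says hello" lets the robot both explain the behavior
    # ("part of a greeting") and predict what should happen next.
    print(explain("greeter says hello", greeting))
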
There are a variety of approaches to computationally understanding social behavior.
Scassellati (2000a) takes a developmental psychology approach, combining two popular
theories on the development of theory of mind in children (that of Baron-Cohen [1995]
and Leslie [1994]), and implementing the synthesized model on a humanoid robot. In
the tradition of AI reasoning systems, the belief-desire-intention (BDI) approach of Kinny et al. (1996) explicitly
and symbolically models social expertise where agents attribute beliefs, desires, intents,
abilities, and other mental states to others. In contrast, Schank & Abelson (1977) argue
in favor of a story-based approach for representing and understanding social knowledge,
communication, memory, and experience. Dautenhahn (1997) proposes a more embodied

