Page 143 - Designing Sociable Robots
breazeal-79017 book March 18, 2002 14:5
124 Chapter 8
with the expectation that its stimulation drive will be reduced. This would constitute a
simple act of meaning.
Extensions to emotions Kismet’s drives relate to a hardwired preference for certain
kinds of stimuli. The power of the emotion system is its ability to associate affective
qualities with different kinds of events and stimuli. As discussed in chapter 7, the robot could
have a learning mechanism by which it uses the caregiver’s affective assessment (praise
or prohibition) to affectively tag a particular object or action. This is of particular impor-
tance if the robot is to learn something novel—i.e., something for which it does not already
have an explicit evaluation function. Through a process of social referencing (discussed in
chapter 3) the robot could learn how to organize its behavior using the caregiver’s affective
assessment. Human infants continually encounter novel situations, and social referencing
plays an important role in their cognitive, behavioral, and social development.
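As a rough illustration, the affective-tagging idea sketched above could be implemented along the following lines. Everything here is invented for the example: the class name, the encoding of caregiver praise as +1 and prohibition as -1, the learning rate, and the behavior thresholds are assumptions, not Kismet’s actual mechanism.

```python
# Hypothetical sketch of affective tagging via social referencing.
# The caregiver's assessment (+1 praise, -1 prohibition) is blended
# into a stored valence tag for each stimulus; the tag then biases
# the robot toward approach or avoidance behavior.

class AffectiveTagger:
    def __init__(self, learning_rate=0.5):
        self.valence = {}              # stimulus -> learned affective tag
        self.learning_rate = learning_rate

    def reference(self, stimulus, caregiver_affect):
        """Move the stored tag toward the caregiver's assessment."""
        old = self.valence.get(stimulus, 0.0)
        self.valence[stimulus] = old + self.learning_rate * (caregiver_affect - old)

    def behavior_bias(self, stimulus):
        """Positive tags bias toward approach, negative toward avoidance."""
        tag = self.valence.get(stimulus, 0.0)
        if tag > 0.25:
            return "approach"
        if tag < -0.25:
            return "avoid"
        return "explore"   # novel or neutral: seek the caregiver's assessment

tagger = AffectiveTagger()
tagger.reference("scissors", -1.0)       # caregiver prohibits
tagger.reference("ball", +1.0)           # caregiver praises
print(tagger.behavior_bias("scissors"))  # avoid
print(tagger.behavior_bias("ball"))      # approach
```

The key property this sketch captures is that a novel stimulus, for which the robot has no explicit evaluation function, acquires an affective tag purely from the caregiver’s social feedback.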
Another aspect involves learning new emotions, termed secondary emotions
(Damasio, 1994). Many of these are socially constructed through interactions with
others.
Following Picard (1997), one might pose the question, “What would it take to give Kismet
genuine emotions?” Kismet’s emotion system addresses some of the aspects of emotions
in simple ways. For instance, the robot carries out some simple “cognitive” appraisals. The
robot expresses its “emotional” state. It also uses analogs of emotive responses to regulate
its interaction with the environment to promote its “well-being.” There are many aspects of
human emotions that the system does not address, however, nor does it address any at an
adult human level.
For instance, many of the appraisals proposed by Scherer (1994) are highly cognitive
and require substantial social knowledge and self-awareness. The robot does not have any
“feeling” states. It is unclear if consciousness is required for this, or what consciousness
would even mean for a robot. Kismet does not reason about the emotional state of others.
There have been a few systems that have been designed for this competence that employ
symbolic models (Ortony et al., 1988; Elliot, 1992; Reilly, 1996). The ability to recognize,
understand, and reason about another’s emotional state is an important ability for having
a theory of mind about other people, which is considered by many to be a requisite of
adult-level social intelligence (Dennett, 1987).
Another aspect I have not addressed is the relation between emotional behavior and
personality. Some systems tune their emotion parameters to produce synthetic
characters with different personalities—for instance, characters that are quick to anger,
timid, friendly, and so forth (Yoon et al., 2000). In a similar manner, Kismet has its
own version of a synthetic personality, but I have tuned it to this particular robot and have

