Index
Embodied systems (cont.)
  overview, 19
  personal robots, 24–25
Emergent scaffolding, 32
Emotion. See also specific type
  components of, 112–114
  processes, 165
Emotion arbitration subsystem, 96, 118–119
Emotion elicitor stage, 96, 116–117
Emotion system
  activation level and, 117–118
  affective appraisal and, 115–116
  affective subsystem of, 95
  appraisal system and, 106, 115–116
  auditory system and, integrating, 94–97
  basic emotions and, 106–107
  components of emotion and, 112–114
  Ekman six and, 96, 107, 111
  emotion arbitration subsystem of, 96, 118–119
  emotional elicitor stage and, 96, 116–117
  extension to, 124–125
  of Kismet, 94–97, 107, 124–125
  in living systems, 105–107
  negative emotions in, 106
  overview, 110–111
  personality and, 124–125
  positive emotions in, 106
  releasers and, 94–95, 114–115
  relevance-detection system and, 106
  response-preparation system and, 106
  responses and, 111–112
  secondary emotions and, 124
  somatic marker process and, 95–97, 115
Emotive facial expression
  generating, 165–170
  subsystem, 161–163
Empathy, 9, 237–238
Engage-people behavior, 66, 141
Engage-toys behavior, 66, 137–138, 140–141, 221
Engagement behaviors, 66, 137–138, 140–141, 221
Engagement, inferring level of, 225
Entrainment by infant, 35
Environment-regulation level of behavior, 140–141
Ethology
  behavior system and, 128–132
    behavior groups, 130–131, 165
    behavior hierarchies, 131–132
    behaviors, 129
    motivational contributions, 130
    overview, 128–129
    perceptual contributions, 129–130
  design issues for sociable robots and, 42–43
Evaluation criteria
  attention system, 72–77
    gain adjustment on looking preference, 74
    gain adjustment on saliency, 72–74
    socially manipulating attention, 74–77
  design issues for sociable robots, 48–49
  facial animation system, 180–182
  vision system, 72–77
    gain adjustment on looking preference, 74
    gain adjustment on saliency, 72–74
    socially manipulating attention, 74–77
Evaluation metrics, challenge of, 239
Exaggeration parameter, 188
Exploratory responses, 36
Expression threshold level, 118
Expressive feedback, 86
Expressive speech, 185
Expressive versatility, 158
Expressive vocalization system. See also Vocalization system
  design issues, 185–186
  emotions in human speech and, 186–187
  expressive voice synthesis and, 187–197
    articulation, 193
    overview, 187–190
    pitch parameters, 188, 190–192
    timing parameters, 188, 192
    voice-quality, 188, 192–193
  generating utterances, 194–198, 209
  implementation overview, 194
  of Kismet, 147, 185–186, 195–203
  limitations and extensions, 208–210
  mapping vocal affect parameters to synthesizer settings, 194–195
  real-time lip synchronization and facial animation and, 203–208
Expressive voice synthesis, 187–193
Extraction system, low-level, 44
Eye detection, 69–70
Eye movement
  attention system and, 68, 218–220
  human, 211–213
  oculo-motor system and, 218–220
  similar, 215
Face control, levels of
  facial function layer, 161–163
  motor demon layer, 158–159
  motor primitives layer, 159–160
  motor server layer, 160–161
  overview, 158
Face-to-face interaction, 144
Facial action coding system (FACS), 173–175

