

A difficulty in employing the cited models is that traits are defined through natural-language descriptions and are not easily formalised into the “mental state” of an agent. The first and most relevant contribution to a cognitive theory of personalities was due to Carbonell [4], who saw them as combinations of degrees of importance assigned to goals. A second example, to which we will refer in particular in this chapter, is Castelfranchi and Falcone’s theory of cooperation in multi-agent systems [5].
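
To make this idea concrete, the following sketch (in Python; not from the chapter, with purely illustrative goal names and weights) represents a personality, in Carbonell's sense, as a set of degrees of importance assigned to goals:

    # A minimal sketch of Carbonell's view of a personality as a combination of
    # degrees of importance assigned to goals. Goal names and weights are
    # purely illustrative.

    from dataclasses import dataclass

    @dataclass
    class Personality:
        """Maps each goal to a degree of importance in [0, 1]."""
        goal_importance: dict[str, float]

        def preferred_goal(self) -> str:
            """Return the goal the personality weighs most heavily."""
            return max(self.goal_importance, key=self.goal_importance.get)

    # Two hypothetical profiles that weigh the same goals differently.
    helpful = Personality({"assist_user": 0.9, "avoid_intrusion": 0.4})
    discreet = Personality({"assist_user": 0.5, "avoid_intrusion": 0.9})

    print(helpful.preferred_goal())   # -> assist_user
    print(discreet.preferred_goal())  # -> avoid_intrusion
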
Although affective expressions may contribute to increasing an interface agent’s friendliness, its acceptability is driven by the level of help it provides to the user, that is, by its “cooperation attitude”. This level of help should not be the same for all users but should be tailored to their attitudes towards computers in general, and towards the specific software to which the agent is applied in particular. These attitudes may be synthesised in the level of delegation of tasks that the user adopts towards the agent. To select the helping attitude that best suits the user’s needs, the agent has to be endowed with a reasoning capacity that enables it to observe the user, to model her expected abilities and needs, and to plan the “best” response in every context. We had already applied the theory of Castelfranchi and Falcone to formalise the mental state of agents and their reasoning capacities in our Project GOLEM [6]. With the project described in this chapter, we extend that research in the direction of embodied animated agents.
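
As a rough illustration of how such tailoring could work, the sketch below maps an observed delegation attitude to a helping attitude; the attitude labels and the mapping rule are assumptions made for this example, not the rules actually used by XDM-Agent or prescribed by the cited theory:

    # A hedged sketch of how an observed delegation attitude might be mapped to
    # a helping attitude. The labels and the mapping rule are illustrative
    # assumptions, not the agent's actual decision procedure.

    DELEGATION_TO_HELP = {
        "closed": "literal_help",   # the user delegates only the stated action
        "open": "overhelp",         # the user delegates the underlying goal
        "full": "critical_help",    # the user accepts initiative from the agent
    }

    def select_helping_attitude(delegation: str, expertise: str) -> str:
        """Choose a helping attitude from the delegation level observed in the
        user, toning it down for expert users who need less assistance."""
        attitude = DELEGATION_TO_HELP.get(delegation, "literal_help")
        if expertise == "expert" and attitude == "overhelp":
            attitude = "literal_help"
        return attitude

    print(select_helping_attitude("open", "novice"))  # -> overhelp
    print(select_helping_attitude("open", "expert"))  # -> literal_help
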
XDM-Agent is an embodied animated character that helps the user in performing the tasks of a given application; its cooperation attitude changes according to the user and the context. Although the agent is domain-independent, we will take electronic mail as a case study, to show some examples of how it behaves in helping to use Eudora. In widely used software like this, all procedures should be very natural and easy to perform. The first goal of XDM-Agent is then “to make sure that the user performs the main tasks without too much effort”. At the same time, the agent should avoid providing too much help when this is not needed; a second goal is therefore “to make sure that the user does not see the agent as too intrusive or annoying”. These general goals may be specialised into more specific ones, according to the “cooperation attitude” of the agent. In deciding the level and the type of help to provide, XDM-Agent should consider, at the same time, the user’s experience and her “delegation attitude”. The agent’s decision of whether and how to help the user relies on the following knowledge sources:

Own Mental State. This is the representation of the agent’s goals (Goal XDM (Tg)), its abilities (Bel XDM (CanDo XDM a)), and the actions it intends to perform (Bel XDM (IntToDo XDM a)).
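
As an illustration only, the fragment below encodes these three kinds of formulae as simple data and lets the agent adopt an intention only for actions it believes it can perform; apart from the quoted predicates, all names are hypothetical:

    # A minimal encoding of the three kinds of formulae listed above as plain
    # data. The comments quote the predicates from the text; the goals, actions
    # and the adoption rule are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class MentalState:
        agent: str = "XDM"
        goals: set[str] = field(default_factory=set)          # Goal XDM (Tg)
        can_do: set[str] = field(default_factory=set)         # Bel XDM (CanDo XDM a)
        intends_to_do: set[str] = field(default_factory=set)  # Bel XDM (IntToDo XDM a)

        def adopt_intention(self, action: str) -> bool:
            """Form the intention to perform an action only if the agent
            believes it is able to perform it."""
            if action in self.can_do:
                self.intends_to_do.add(action)
                return True
            return False

    xdm = MentalState(goals={"reduce_user_effort"}, can_do={"show_how_to_send_mail"})
    xdm.adopt_intention("show_how_to_send_mail")
    print(xdm.intends_to_do)  # -> {'show_how_to_send_mail'}
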