
BDIE: a BDI like Architecture with Emotional Capabilities

David J. Hernández and Oscar Déniz and Javier Lorenzo and Mario Hernández
Institute for Intelligent Systems and Numerical Applications in Engineering
Universidad de Las Palmas de Gran Canaria
Edificio Central del Parque Científico-Tecnológico
Campus Universitario de Tafira, 35017, Las Palmas, Spain
[email protected]
{odeniz, jlorenzo, mhernandez}@dis.ulpgc.es

Copyright © 2004, American Association for Artificial Intelligence (www.aaai.org). All rights reserved.

Abstract

In this paper, a first version of the BDIE architecture is presented. The architecture is an attempt to model affective phenomena and adds a new emotional component to a BDI agent. The new component affects the others, trying to take advantage of emotions for the development of artificial agents. The architecture has been implemented in an object-oriented framework that makes it easy to define new agents. As an example, initial testing has been done with a robotic head.

Introduction

In recent years emotions have been receiving increasing interest from the Artificial Intelligence research community. Several authors are turning their attention to the study of emotions, mainly for two reasons:

1. The discovered role of emotions in rational thinking (Damasio 1994): neurological experiments have demonstrated the importance of emotions in it.

2. The interest in affective computing (Picard 1997): the idea is that if computers can recognize, express and possibly have emotions, interaction with the user will improve, because the computer can analyze the affective state of the user and choose a different course of action depending on it.

Today it is widely accepted that emotion is important in rational human activity. As pointed out in (Alvarado, Adams, & Burbeck 2002), the question is no longer whether a designer of an architecture for artificial agents should or should not incorporate emotions, but how to do so.

We are interested in the role of emotions in the field of artificial agents. First, we have worked on human-robot interaction, and the introduction of emotions would improve that interaction (Déniz & others 2002) thanks to the robot's ability to express emotions. Second, we are also interested in the application of artificial agents to computer games, where emotional capabilities would allow a much better user experience because agents would have a more human decision-making process. Finally, the use of emotions to guide the decision-making process would improve its performance, as argued below.

This paper presents the BDIE architecture, an attempt to capture the effects of emotions in the human mind and reproduce them in artificial agents. The main goal is to develop and evolve an architecture that allows the creation of artificial emotional agents and to use them in several scenarios to test their suitability.

The Base: a BDI Architecture

The approach starts from the BDI architecture (Rao & Georgeff 1995; Kinny, Rao, & Georgeff 1996) and adds to it the components needed to endow it with emotional capabilities. This decision was based on the following criteria:

1. The architecture clearly separates what conceptually every agent must have, that is, perceptions, goals, and behaviors. This allows a clear distinction between data structures that in other architectures are mixed. Thus, it is possible to focus on each component for its design and implementation.

2. The planning capabilities allow the agent to produce plans as complex as desired. The result of the decision-making process can be just the next action or a complex sequence of actions to execute over time. This makes it possible to change the planning algorithm according to the needs. Generated plans may also be used to compute the effect produced by a given action sequence on the emotional system, as pointed out in (Gratch 1999).

3. The architecture is sufficiently powerful to achieve the design goals, that is, to develop an architecture that produces a more human decision-making process by adding the capabilities to have, recognize and express emotions.

Figure 1 shows a basic view of the BDI architecture's components. There are two types of information components:

[Figure 1: The BDI architecture components. The figure shows the perceptual processes, the desire generator and the planning and execution algorithm, together with the beliefs, desires and intentions containers and the proprioceptive feedback from the environment.]

• Data containers, which store the agent's information. Three main stores are found:

1. Beliefs. This container stores what the agent knows about its environment (perceptions) and, possibly, about itself (proprioception).
2. Desires. In this container, the agent's goals are found. Goals guide the decision-making process.

3. Intentions. The decision-making process produces a plan that will be executed over time. This plan and its related information are stored in this container.

• Data processing algorithms:

1. Perceptual Processes. These include the sensors that receive information from the environment and from the agent itself and update the beliefs.

2. Desire Generator. This process may create new desires given the current desires and intentions.

3. Planning and Execution Algorithm. In this module the planning algorithm generates a plan to achieve the current goals, while another algorithm executes the different components of this plan.
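As an illustration of how these containers and processes fit together, the following minimal sketch shows one possible way to express them in Java, the language used for the implementation described later; all class and method names here are illustrative assumptions, not the actual BDIE interfaces.

    // Illustrative sketch only: hypothetical names, not the actual BDIE classes.
    import java.util.ArrayList;
    import java.util.List;

    interface Belief {}                                   // entries of the Beliefs container
    interface Desire { boolean isSatisfied(); }           // entries of the Desires container
    interface Plan { void executeNextStep(); }            // content of the Intentions container

    class BdiAgent {
        // Data containers
        final List<Belief> beliefs = new ArrayList<>();
        final List<Desire> desires = new ArrayList<>();
        Plan currentPlan;                                 // the current intention

        // Data processing algorithms
        void perceive()        { /* sensors add new Belief objects to beliefs */ }
        void generateDesires() { /* may add new Desire objects given desires and currentPlan */ }
        void planAndExecute() {
            // a planner would rebuild currentPlan from beliefs and unsatisfied desires here,
            // then one component of the plan is executed
            if (currentPlan != null) currentPlan.executeNextStep();
        }
    }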
BDIE

Figure 2 shows a functional view of the architecture with the different modules, including the emotional one, and the relationships between them.

[Figure 2: BDIE Functional View. The figure shows the perceptual system (beliefs), the motivational system (desires) and the behavior/action selection system (intentions), plus the new emotional system, which takes input from the beliefs, activates desires (primary emotions), modifies the perceptual processes, tunes action selection and updates proprioceptive information.]

The new module contains both data containers and processes, which will be introduced below. As shown in the figure, it has dependencies on all the other components of the architecture:

• Relationships with the perceptual system:

– Its inputs are the beliefs which, in turn, come from the sensors.

– It modifies the perceptual processes, representing the fact that the same input stimuli may be perceived differently depending on the affective state. For example, if the active emotion has an associated high arousal state, a stressful situation, the sensors may stop working.

• Relationships with the motivational system:

– Current goals affect the current emotional state. For example, when a goal is satisfied, it is possible to have an emotional response of happiness.

– It activates goals when a primary emotion becomes active, in order to respond to a special situation. Such a goal will guide the decision-making process towards the execution of a behavior that tries to preserve the agent's welfare. Our decision to create this link came from two sides. On the one hand, as argued in (Canamero 2000), emotions should be connected to intentions through goals in order to increase the architecture's modularity. On the other hand, as pointed out in (Castelfranchi 2000), emotions may be sources of goals, as one of the roles in the complex relationship that exists between them. However, the fact that emotions do not communicate directly with the intentions module does not mean that fast responses, such as reflexes, cannot be implemented. In the BDIE architecture, whenever a primary emotion (see below) is activated, a high-priority desire becomes unsatisfied. This makes the planning algorithm turn all its resources to solving the current situation. Additionally, limits on the algorithm's processing time, needed to satisfy the real-time requirements of behavior execution, are planned to be introduced if necessary. For example, in a behavior network (Maes 1989) this could mean that the energy is not allowed to jump from one behavior to another by more than a certain amount, which produces a more reactive (vs. more deliberative) response of the net.

• Relationships with the behavior system:

– It tunes the action selection algorithm. In this case, the emotional system acts as a metaheuristic, a mechanism that guides the action selection algorithm through the vast space of available options to execute. Conceptually, it acts as a pruning method over the state space, removing those options that are not emotionally good. This allows the algorithm to concentrate on a smaller set of options, discarding the emotionally bad ones, as sketched after this list. Additionally, experiments have shown that the emotional state affects the ability of individuals to solve problems: for example, when an individual is sad it takes more time to solve a problem than when it is happy. Along this line, work is being carried out on a dynamically reconfigurable behavior network whose parameters, structure and update cycle will depend on the current affective state.

– It modifies the execution of a behavior, producing a more affective output. For example, it would be possible to speak with a different intonation or to show a facial expression.
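The pruning role just described can be pictured with a small, hypothetical sketch: candidate options whose emotional appraisal falls below a threshold are filtered out before the ordinary selection criterion is applied. The appraise method and the Option interface are assumptions introduced only for illustration, not part of the BDIE implementation.

    import java.util.Comparator;
    import java.util.List;
    import java.util.stream.Collectors;

    // Hedged sketch of emotional pruning of the option space.
    class EmotionalPruner {
        interface Option { double utility(); }

        /** Hypothetical appraisal: how "emotionally good" an option looks in the current state. */
        double appraise(Option option) {
            return 0.0; // e.g. predicted valence of the outcome; a stub here
        }

        Option select(List<Option> options, double threshold) {
            List<Option> kept = options.stream()
                    .filter(o -> appraise(o) >= threshold)        // discard emotionally bad options
                    .collect(Collectors.toList());
            List<Option> pool = kept.isEmpty() ? options : kept;  // never leave the agent without options
            return pool.stream()
                    .max(Comparator.comparingDouble(Option::utility))
                    .orElse(null);                                // ordinary selection over the survivors
        }
    }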
[Figure 3: BDIE Structural View. Sensors (S1, S2, ..., Sn) feed the three belief levels (1st, 2nd, 3rd); emotions are organized as primary, secondary and tertiary; desires are arranged in priority levels (0, 1, 2, 3) and, together with the beliefs, drive the decision-making process that maintains the current plan.]

Figure 3 shows a structural view of the architecture with all the components, which will be detailed in the following sections.

Emotional system

The emotional system keeps track of the agent's affective state. In BDIE, Damasio's (Damasio 1994) and Sloman's (Sloman 2000) views have been combined. There are three types of emotions:

• Primary emotions. Primary emotions are related to fundamental life tasks, so that the individual can survive. They are fast and prewired mechanisms that are activated whenever a very specific situation occurs. Primary emotions take control over the normal decision-making process in order to rapidly execute a behavior that guarantees the agent's survival. Examples of primary emotions are fear and surprise.

• Secondary emotions. These emotions are related to high level cognitive processes that interpret the perceptual information and modify the affective state according to that information. They are associated with other functions of emotions, such as interaction and communication with other individuals. Typical examples of this category are happiness, sadness and anger.

• Tertiary emotions. Currently not implemented, they are included for completeness in the design. They are related to reflective data and may include emotions such as shame or pride. They will be the next step of the BDIE architecture.

Although this classification exists in BDIE, emotions are not viewed as processes that evaluate the input stimuli. In contrast, emotions are the result of other processes which make such an evaluation. These other processes perform the affective appraisal and cognitive evaluation of input stimuli (Castelfranchi 2000). Affective appraisal is a fast and often unconscious evaluation of the input stimuli, usually implemented by reactive mechanisms trained to act quickly in certain situations. In contrast, cognitive evaluation is a more elaborate evaluation of the stimuli and is usually performed by high level processes.

These concepts are mapped into BDIE by introducing evaluators. There are first level evaluators, those which perform the affective appraisal, and second level evaluators, those that perform the cognitive evaluation. They are associated respectively with primary and secondary emotions. First level evaluators look at first level beliefs (see below) and detect special situations that fire a primary emotion. Usually, they are simple processes such as threshold-exceeding tests. On the other hand, second level evaluators are more complex processes that look at several first level beliefs and combine them into a second level one. They may also use information from other systems such as desires or long term memory.

As these processes are in charge of stimuli evaluation, emotions are just their output. This conceptually fits both the psychological and the software engineering fields (allowing a clearer design and implementation).
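As a rough illustration of this split (the interface and class names below are assumptions, not the actual BDIE code), a first level evaluator can be as simple as a threshold test that fires a primary emotion, while a second level evaluator combines several first level beliefs:

    // Hedged sketch of the two kinds of evaluators; names are illustrative.
    interface FirstLevelEvaluator {
        /** Affective appraisal: returns the name of a primary emotion to fire, or null. */
        String check(double stimulusValue);
    }

    interface SecondLevelEvaluator {
        /** Cognitive evaluation: combines several first level beliefs into a second level one. */
        double[] combine(double[]... firstLevelValenceArousal);
    }

    /** Example first level evaluator: a threshold-exceeding test on a single stimulus. */
    class ThresholdEvaluator implements FirstLevelEvaluator {
        private final double threshold;
        private final String primaryEmotion;
        ThresholdEvaluator(double threshold, String primaryEmotion) {
            this.threshold = threshold;
            this.primaryEmotion = primaryEmotion;
        }
        public String check(double stimulusValue) {
            return stimulusValue > threshold ? primaryEmotion : null;   // e.g. fires "surprise"
        }
    }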
The Emotional Space

Classically, there have been two representations for emotions in artificial agents:

• A set of discrete emotions where each one is represented separately. This representation comes mainly from the work of Paul Ekman (Ekman 1999), where there is a basic set of six emotions. This set is the base of a six-dimensional space where all the other emotions can be placed as combinations of the basic ones. Examples of this representation can be found in (Velásquez 1997; 1998b; 1998a).
• A continuous two-dimensional space with valence and arousal as the axes (Breazeal 2002). In this case, the space has only two dimensions and each affective phenomenon has an associated pair of valence and arousal values that moves the current emotional state, a point in the space, within it. The valence represents how favorable or unfavorable the stimulus is to the agent, while the arousal represents how arousing it is. Sometimes a third axis, associated with stance, is added to this scheme, representing how approachable the stimulus is.

Actually, the two representations are not exclusive, that is, they can be combined. (Breazeal 2002) is an example of this combined representation. In it, each emotion has an associated area of the space. If the current point, representing the current emotional state, is inside one of these associated areas, that emotion is said to be the current one. We have decided to use this type of representation because it allows us to easily obtain emotional information from the perceptual information. As each input stimulus has an associated pair of valence and arousal values, it can serve as an input to the emotional system, which moves the point representing the current emotional state according to the values of new stimuli. Primary emotions, due to their need to act quickly, provide an absolute position in the space, so that the affective state changes immediately. Secondary emotions provide movements relative to the current affective state. Thus, the state changes gradually from one emotion to another according to the input stimuli.

In BDIE, the emotional space is divided into emotional sectors. Each emotion delimits a sector in the space. Figure 4 shows an already divided emotional space.

[Figure 4: The emotional space. A valence (V) / arousal (A) plane divided into sectors (E1 to E5), each defined by a descriptive point and by maximum and minimum intensity arcs.]

Each sector is described by:

• A descriptive point. A point in the space whose module indicates the emotion's maximum intensity. The descriptive points are used to compute the angles that delimit each sector. The maximum intensity is the radius of the sector's outer arc.

• A minimum intensity, which indicates when the emotion is considered inactive. This value is the radius of the sector's inner arc.

• A decay factor. It is applied to the current point when no input stimulus has activated an emotion. In that case, the absence of activating stimuli makes the current emotional point tend towards the origin. The decay factor is an amount by which the current point's components are divided.

Additionally, each primary emotion has an extra configuration parameter that does not participate in the emotional space. This parameter is the name of the desire that becomes dissatisfied when the emotion is active.

When the system is loaded, the emotional space is computed from all the emotion descriptions. The neutral emotion is considered active when the current emotional point is in the area around the origin, where no emotion is defined. The current emotional point may be located anywhere in the space and is moved according to the input stimuli. As said, it may undergo absolute or relative movements depending on the output of the affective processes.

The emotion parameters allow the agent designer to define personalities. For example, a lower decay factor means that an emotion that is active will remain active longer than another emotion with a greater decay factor. Over time, this means the agent is likely to experience the emotion with the lower decay factor for longer. This is a way to model a mood. Likewise, the minimum and maximum intensities determine whether the agent is more or less sensitive to certain stimuli, producing an agent that tends to experience certain emotions.
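The following compact sketch (hypothetical class names, and with sector boundaries given directly as an angular half-width instead of being derived from the descriptive points, as the actual system does) illustrates how such a sectored valence/arousal space with decay and absolute or relative movements could be coded:

    import java.util.List;

    // Illustrative sketch only, not the actual BDIE classes.
    class EmotionSector {
        final String name;
        final double angle;          // angle of the descriptive point (radians)
        final double halfWidth;      // half of the sector's angular width (assumed explicit here)
        final double minIntensity;   // inner arc radius: below this the emotion is inactive
        final double maxIntensity;   // outer arc radius: module of the descriptive point
        final double decayFactor;    // the current point's components are divided by this value
        EmotionSector(String name, double angle, double halfWidth,
                      double minIntensity, double maxIntensity, double decayFactor) {
            this.name = name; this.angle = angle; this.halfWidth = halfWidth;
            this.minIntensity = minIntensity; this.maxIntensity = maxIntensity;
            this.decayFactor = decayFactor;
        }
    }

    class EmotionalSpace {
        private final List<EmotionSector> sectors;
        private double v = 0.0, a = 0.0;   // current emotional point (valence, arousal)
        EmotionalSpace(List<EmotionSector> sectors) { this.sectors = sectors; }

        /** Primary emotions jump to an absolute position in the space. */
        void moveAbsolute(double valence, double arousal) { v = valence; a = arousal; }

        /** Secondary emotions move the point relative to the current state. */
        void moveRelative(double deltaValence, double deltaArousal) { v += deltaValence; a += deltaArousal; }

        /** Without activating stimuli, the point decays towards the origin. */
        void decay(double factor) { v /= factor; a /= factor; }

        /** Returns the active emotion, or "neutral" when the point lies near the origin. */
        String currentEmotion() {
            double intensity = Math.hypot(v, a);
            double pointAngle = Math.atan2(a, v);
            for (EmotionSector s : sectors) {
                double diff = Math.atan2(Math.sin(pointAngle - s.angle), Math.cos(pointAngle - s.angle));
                if (Math.abs(diff) <= s.halfWidth && intensity >= s.minIntensity) return s.name;
            }
            return "neutral";
        }
    }

Under this representation, a personality is just a particular choice of decay factors and intensity limits for each sector.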
Perceptual processes

Perceptual processes include all the processes that accept input from sensors and generate first level beliefs. This covers any kind of perceptual processing, which makes them compatible with any kind of attention filter or habituation process. The designer is able to implement different sensors and add to them whatever functionality he or she wants. BDIE supports threaded sensors that run in their own threads, which is useful in environments such as robotics. As indicated below, retrieving the sensors' current values is the first step of the agent's update cycle.
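A threaded sensor can be sketched as follows (a generic illustration assuming a simple polling base class, not the actual BDIE sensor API):

    // Generic sketch of a sensor that polls its device in its own thread; the agent
    // simply reads the latest value during step 1 of the update cycle.
    class ThreadedSensor implements Runnable {
        private volatile double latestValue;
        private volatile boolean running = true;

        public void start() { new Thread(this, "sensor").start(); }
        public void stop()  { running = false; }

        public void run() {
            while (running) {
                latestValue = readDevice();                      // device-specific acquisition
                try { Thread.sleep(50); }                        // polling period (assumed)
                catch (InterruptedException e) { running = false; }
            }
        }

        protected double readDevice() { return 0.0; }            // stub; a concrete sensor overrides it
        public double currentValue()  { return latestValue; }    // read by the update cycle
    }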
Beliefs

As said before, beliefs represent the agent's vision of its environment and of itself. They are a set of elements, such as perceptual information arriving from the perceptual processes. In BDIE there are three conceptual levels of beliefs, each one with a higher semantic level:
• First level beliefs represent basic perceptual information, the basic stimuli coming from the perceptual processes. They have not gone through any post-processing algorithm that would try to infer more information about the input. Each belief contains the sensor information and, after passing through the emotional module, a valence and arousal evaluation. As an example, an image and an audio signal coming respectively from a camera and a microphone could be first level beliefs. The image belief could have a high arousal value if an object is detected close to the agent. Similarly, the audio belief could have a high arousal value if the volume is too loud.

• Second level beliefs represent more elaborate beliefs. They are the output of cognitive processes that combine several first level beliefs into one or more second level ones. Following the previous example, an "alert" second level belief could be generated from the two first level beliefs, image and audio. The alert belief would obtain a resultant valence and arousal from a combination of the two source values, as sketched after this list.

• Third level beliefs will be the next step of our architecture and will be related to tertiary emotions (Sloman 2000). They will work in a similar fashion to second level beliefs: there will be cognitive processes that combine different second level beliefs into one or more third level beliefs. Like tertiary emotions, they are not currently implemented.
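The camera/microphone example above can be made concrete with a minimal sketch of the two implemented belief levels; the field layout and the averaging rule are assumptions made only for illustration.

    // Minimal, hypothetical belief classes illustrating the levels described above.
    class FirstLevelBelief {
        final String sensor;       // e.g. "camera" or "microphone"
        final Object payload;      // raw sensor data (image, audio frame, ...)
        double valence, arousal;   // annotated later by the emotional module
        FirstLevelBelief(String sensor, Object payload) {
            this.sensor = sensor;
            this.payload = payload;
        }
    }

    class SecondLevelBelief {
        final String name;         // e.g. "alert"
        final double valence, arousal;
        SecondLevelBelief(String name, FirstLevelBelief a, FirstLevelBelief b) {
            this.name = name;
            // one possible combination rule: average the two sources
            this.valence = (a.valence + b.valence) / 2.0;
            this.arousal = (a.arousal + b.arousal) / 2.0;
        }
    }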
Desires

In BDIE, desires comprise goals and homeostatic variables. Goals are desires based on a predicate: whenever a goal's predicate returns true, the system considers the goal satisfied. Homeostatic variables are included as another element able to guide the decision-making algorithm because they are very useful in all kinds of agents; they can be found in several architectures to model concepts like thirst or hunger. Homeostatic variables act like any other goal: when a homeostatic variable is within its normal range, it is considered satisfied, so it does not try to guide the decision-making process. Whenever it goes over or under the normal range, it is considered unsatisfied and will push the decision-making algorithm to execute a behavior that satisfies it.

In several cognitive architectures, desires have priority values that rank them and guide the decision-making algorithm to execute behaviors that satisfy higher priority desires. In BDIE this concept is supported but raised to a higher level. Desires, that is, goals and homeostatic variables, have an importance value but, in addition, they belong to a given priority level. As shown in Figure 3, several priority levels may exist in a given agent. The decision-making algorithm will try to satisfy the highest-importance unsatisfied desire in the highest priority level. That means that if desire A has a higher importance than desire B but B is in a level with higher priority than A, then the decision-making algorithm will try to satisfy B first.

Whenever a desire of a given level is unsatisfied, all the goals from the levels with lower priority are deactivated and are not even allowed to guide the decision-making algorithm. This scheme allows the decision-making process to focus on the desires that really matter in the current situation. For example, in a dangerous situation the high level desires should be set aside in order to invest all the resources in solving that situation. That is what happens, for example, when a primary emotion related to a survival task is activated.

Desire priority levels are dynamic, meaning that a desire can move from one level to another. This covers cases where the urgency of a given desire, that is, the urgency with which the system has to attempt its satisfaction, changes. Thus, the agent is always focused on the most urgent desires, that is, the most important ones in the current situation. If a desire is important but not urgent, it will be in a lower priority level and will receive its opportunity when the urgent ones are satisfied.
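This selection rule can be summarized in a short, hypothetical sketch: among the unsatisfied desires, pick the one in the highest priority level and, within that level, the one with the highest importance. The interface names are assumptions, not the actual BDIE code.

    import java.util.Comparator;
    import java.util.List;
    import java.util.Optional;

    // Hedged sketch of the selection rule; not the actual BDIE decision-making code.
    class DesireSelection {
        interface Desire {
            boolean isSatisfied();
            double importance();
            int priorityLevel();            // 0 is the highest priority level
        }

        /** Homeostatic variables behave like any other desire: satisfied while in range. */
        static class HomeostaticVariable implements Desire {
            double value, min, max, importance;
            int level;
            public boolean isSatisfied() { return value >= min && value <= max; }
            public double importance()   { return importance; }
            public int priorityLevel()   { return level; }
        }

        /** Desires in levels below the chosen one are implicitly ignored by this rule. */
        static Optional<Desire> select(List<Desire> desires) {
            return desires.stream()
                    .filter(d -> !d.isSatisfied())
                    .min(Comparator.comparingInt(Desire::priorityLevel)                 // highest level first
                            .thenComparing(Comparator.comparingDouble(Desire::importance).reversed()));
        }
    }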
Intentions and the Behavior System

The intentions module contains the planning algorithm and the current plan. As said before, the BDI architecture nicely separates the different modules of which it is composed, so the planning algorithm is completely interchangeable. In our experiments, several algorithms, such as a rule system and a very simple reactive algorithm, have been used. Connecting emotions with intentions through the goals keeps this division. In the future, the dynamically reconfigurable behavior network mentioned before will be introduced.

BDIE's Update Cycle

Finally, the BDIE architecture's update cycle is presented. The following steps are performed in each cycle:

1. Sensors are updated, obtaining from them the current input stimuli.

2. The input stimuli, in the form of beliefs, are introduced into the first level of the belief container.

3. First level evaluators are invoked to check for special situations that may fire primary emotions.

4. If a primary emotion is activated, the emotional space is updated, its corresponding desire becomes unsatisfied and execution control passes to the planning algorithm.

5. If no primary emotion is activated, the second level evaluators are invoked. They compute new second level beliefs from the first level ones.

6. The emotional space is updated and the current emotion computed. In case it is a primary emotion, step four is executed.

7. Control is transferred to the planning algorithm.

8. The next behavior is executed.
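A compact sketch of this loop follows; the BdieAgent interface and its method names are assumptions used only to make the steps explicit, not the actual BDIE API.

    // Illustrative mapping of the eight steps above onto code; not the actual BDIE loop.
    class UpdateCycle {
        interface BdieAgent {
            void updateSensors();                       // step 1
            void storeFirstLevelBeliefs();              // step 2
            String runFirstLevelEvaluators();           // step 3: name of fired primary emotion, or null
            void activatePrimaryEmotion(String name);   // absolute move in the emotional space
            void unsatisfyDesireOf(String name);        // the emotion's associated desire
            void runSecondLevelEvaluators();            // step 5
            String updateEmotionalSpace();              // step 6: primary emotion if one became active, else null
            void plan();                                // step 7
            void executeNextBehavior();                 // step 8
        }

        void step(BdieAgent agent) {
            agent.updateSensors();
            agent.storeFirstLevelBeliefs();
            String primary = agent.runFirstLevelEvaluators();
            if (primary == null) {
                agent.runSecondLevelEvaluators();
                primary = agent.updateEmotionalSpace(); // if a primary emotion appears here,
            }                                           // step 4 below still applies
            if (primary != null) {                      // step 4
                agent.activatePrimaryEmotion(primary);
                agent.unsatisfyDesireOf(primary);
            }
            agent.plan();
            agent.executeNextBehavior();
        }
    }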
Experiments

The architecture has been implemented using the Java programming language. It is an object-oriented framework which, thanks to the facilities provided by the language, relies heavily on dynamic loading and unloading of modules and components. For example, it is possible to add a new module to the architecture without having to recompile it. The designer of an agent creates classes that extend classes or implement interfaces of the architecture; at load time, all the modules are loaded and configured.
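In Java this kind of dynamic loading is typically done through reflection. The following generic sketch (not the actual BDIE loader, and with an assumed configuration file format of one class name per line) shows the idea:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    // Generic sketch of reflection-based module loading; not the actual BDIE loader.
    interface Module { void configure(); }

    class ModuleLoader {
        /** Each line of the (assumed) configuration file names a class implementing Module. */
        List<Module> loadAll(String configFile) throws Exception {
            List<Module> modules = new ArrayList<>();
            for (String className : Files.readAllLines(Paths.get(configFile))) {
                Module m = (Module) Class.forName(className.trim())
                                          .getDeclaredConstructor()
                                          .newInstance();   // no recompilation of the framework needed
                m.configure();
                modules.add(m);
            }
            return modules;
        }
    }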
[Figure 5: The Casimiro robot head.]

To test the basic components that the architecture already has, we implemented a simple agent on a robotic head developed in our group. The tests have allowed us to identify design flaws and strengths. From them, changes have been proposed that will be introduced in the next versions of BDIE.

Casimiro (Déniz & others 2002) is a humanoid robotic head with vision and audio sensing capabilities and several mechanical degrees of freedom. For the experiments we use one of the cameras as a sensor and the head's motors as effectors. Figure 5 is a picture of the robotic head.

The goal of the experiment was to produce facial expressions for certain colors and to activate primary emotions given the image's luminance value. For this task, the robot's software provided us with four measures of the current image: the luminance level and the three basic color components red, green and blue.

In the emotional system there is one evaluator in each of the first and second levels, that is, there is one process in charge of the affective appraisal and another one in charge of the cognitive evaluation. There are five emotions: happiness, sadness, surprise, fear and anger. Surprise and fear are first level emotions, while the others are second level ones. Figure 6 shows the distribution of the emotions in the emotional space.

[Figure 6: An emotional space with five emotions (surprise, fear, sadness, happiness and anger) over the valence (V) and arousal (A) axes.]

The desires system has only one desire, representing the survival goal. It is in the highest priority level and is always satisfied unless a primary emotion is activated; in that case, the desire becomes unsatisfied. The example tries to focus on the emotional system, and homeostatic variables would have the same influence on it as goals, so they are not included.

Finally, the intentions system is based on a very simple algorithm that shows a facial expression given the current emotion.

Each update cycle, the four beliefs, each one holding one of the sensed values, are created. In the next step, the first level evaluator examines the belief corresponding to the luminance and determines whether a first level emotion should be activated. If that is the case, the second level evaluator is not invoked and the corresponding first level emotion's desire becomes unsatisfied. As an example of the effect of emotions on sensors, if a first level emotion is activated the sensors stop working until the emotion becomes inactive. Then, the planning algorithm computes the action it should execute to satisfy the desire, but in this case it does nothing. When the selected behavior must be executed, the facial expression is updated given the current emotion.

If no first level emotion is activated, the second level evaluator is invoked. It uses the three color beliefs to compute a position in the [r,g,b] space. Previously, a table with points in this space for the colors that will be shown to the robot has been created (see Table 1). Around each of these key points, a cube is defined. If the current point is inside any of these cubes, the robot considers that the corresponding color is being shown. Additionally, each color has an associated valence and arousal. The corresponding emotional change, the output for the three color inputs considered, will be this arousal and valence multiplied by a factor that decreases with the distance to the key point. Thus, if the current color matches the key point exactly, the valence and arousal will be the ones associated with the color; on the other hand, if the color is near a cube's border, the arousal and valence will be close to zero. Finally, a new second level belief is created with this computed arousal and valence.
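A hedged sketch of this computation follows; the cube half-size and the linear fall-off with distance are assumptions, and only the key points and their valence/arousal values come from Table 1.

    // Illustrative sketch; not the actual evaluator used on the robot.
    class ColorEvaluator {
        static class KeyColor {
            final String name;
            final double r, g, b;            // key point in [r,g,b] space
            final double valence, arousal;   // values associated with the color (Table 1)
            KeyColor(String name, double r, double g, double b, double valence, double arousal) {
                this.name = name; this.r = r; this.g = g; this.b = b;
                this.valence = valence; this.arousal = arousal;
            }
        }

        static final double HALF_SIZE = 30.0;   // half the side of the cube around each key point (assumed)

        /** Returns {valence, arousal} for the sensed color, or null if no cube contains it. */
        static double[] evaluate(double r, double g, double b, KeyColor[] table) {
            for (KeyColor k : table) {
                double dr = Math.abs(r - k.r), dg = Math.abs(g - k.g), db = Math.abs(b - k.b);
                if (dr <= HALF_SIZE && dg <= HALF_SIZE && db <= HALF_SIZE) {
                    // factor is 1 at the key point and falls towards 0 near the cube's border
                    double distance = Math.sqrt(dr * dr + dg * dg + db * db);
                    double maxDistance = Math.sqrt(3.0) * HALF_SIZE;
                    double factor = 1.0 - distance / maxDistance;
                    return new double[] { k.valence * factor, k.arousal * factor };
                }
            }
            return null;   // unknown color: no second level belief is created
        }
    }

For instance, the red key point of Table 1 would be declared as new KeyColor("Red", 124, 27, 28, -10, +10).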
After the evaluation process, the current emotional point is updated given the evaluation of the belief just created. As a result, the point navigates through the emotional space. Note that a first level emotion may also be activated at this stage.

Finally, the planning algorithm is invoked and the execution of the selected behavior is requested. As said before, in these experiments this simply sets the current facial expression according to the emotional state.
Table 1: The color space

Color     Red   Green   Blue   Valence   Arousal
Red       124    27      28      -10       +10
Green       5   105      85        0         0
Blue       11    80     125       +5        -5
Yellow    135   140       0       +5        +5
Orange    165    65      20       +7        +7
Black       7     5       5      -10       -10

Conclusions and Future Work

The initial version of the BDIE architecture has been presented. This work has allowed us to do research in the field of emotions and to produce a first model of an architecture that will evolve as the research advances. The architecture is illustrated with a simple example which allowed us to catch several design flaws and strengths and to observe the properties of the selected valence-arousal space. Thus, several improvements, design changes and alternative representations have emerged, and we will put effort into the analysis and implementation of those proposals. The experiments have shown that the architecture is valid in an application with a real robotic head, but it should be improved if more complex behavior is desired.

The main strength of the architecture is the fact that the emotional state affects all the other components of the architecture. This reflects the fact that, in minds, emotions are deeply related to other systems. The intention is not to create perfect agents but to create believable ones, and this may imply taking wrong decisions.

The chosen emotional representation is not enough for complex emotional phenomena. For example, visceral factors (Loewenstein 1996) could somehow be modeled with the current version of the architecture, but there should be a mechanism allowing an easy implementation of such phenomena, as they are so common. Such a simple representation was implemented in order to start from the ground and gain some experience; in the future, a more advanced one will substitute the current one.

As said before, the architecture is based on a dynamic module and component loading system. This will allow us to easily add new components to the architecture. Some of the planned modules are:

• A long term memory module. Currently, the only memory the agent has is the beliefs, and they are short term. It is clear that to produce complex emotional reactions there is a need for long term memory to remember past actions and events.

• The action selection algorithm, as pointed out, is one of our main interests. As noted, we are currently developing a reconfigurable behavior network whose parameters, structure and runtime algorithm will be dynamically established based on the current emotional state.

• Desires are planned to support hierarchical trees. The designer will then be allowed to define complex desires that depend on other ones.

• An attention filter is necessary to filter out unrelated sensory information. This module would be affected by the emotional state: experiments have shown that in high arousal situations attention tends to focus on fewer elements than in normal situations, so the input stimuli vary.

• A habituation module is also necessary. The agent should not always react in the same way to the same stimuli. If some stimuli are repeated, the agent should get habituated to them and stop reacting in the same way.

• Learning mechanisms also seem to be necessary. They will be related to the emotional module, with the somatic marker hypothesis (Damasio 1994) as a key point.

The goal of all these modules is to allow the designer to create agents with higher behavioral complexity, in an attempt to make them more believable.

Acknowledgments

This work has been partially funded by a personal grant from the University of Las Palmas de Gran Canaria to the first author. Additional funds come from research project UNI2002/16 of the same university and from research project PI2003/165 of the Canary Islands Government.

The authors would also like to thank the reviewer of the initially submitted long abstract for the insightful comments and very useful suggestions.

References

Alvarado, N.; Adams, S. S.; and Burbeck, S. 2002. The role of emotion in an architecture of mind.

Breazeal, C. 2002. Designing Sociable Robots. MIT Press.

Canamero, D. 2000. Designing emotions for activity selection. Technical report, University of Aarhus, Denmark.

Castelfranchi, C. 2000. Affective Appraisal vs Cognitive Evaluation in Social Emotions and Interactions. Springer. 76–106.

Damasio, A. R. 1994. Descartes' Error: Emotion, Reason, and the Human Brain. Putnam Pub Group.

Déniz, O., et al. 2002. Casimiro: A robot head for human-computer interaction. In Proceedings of the 11th IEEE International Workshop on Robot and Human Interactive Communication.

Ekman, P. 1999. Handbook of Cognition and Emotion. Sussex, U.K.: John Wiley and Sons, Ltd. Chapter 3, 45–60.

Gratch, J. 1999. Why you should buy an emotional planner. In Proceedings of the Agents'99 Workshop on Emotion-based Agent Architectures (EBAA'99).

Kinny, D.; Rao, A. S.; and Georgeff, M. P. 1996. A methodology and modeling technique for systems of BDI agents. In Agents Breaking Away: Proceedings of the Seventh European Workshop on Modeling Autonomous Agents in a Multi-Agent World, MAAMAW'96.

Loewenstein, G. 1996. Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes 65(3):272–292.
Maes, P. 1989. How to do the right thing. Connection Science Journal 1(3):291–323.

Picard, R. W. 1997. Affective Computing. MIT Press.

Rao, A. S., and Georgeff, M. P. 1995. BDI agents: from theory to practice. In Proceedings of the First Intl. Conference on Multiagent Systems.

Sloman, A. 2000. How many separately evolved emotional beasties live within us? Technical report, The University of Birmingham. Revised version of invited talk at the workshop on Emotions in Humans and Artifacts, Vienna, August 1999.

Velásquez, J. D. 1997. Modeling emotions and other motivations in synthetic agents. In Proceedings of AAAI-97. 545 Technology Square, NE43-812, Cambridge, MA 02139: MIT AI Lab.

Velásquez, J. D. 1998a. Modeling emotion-based decision-making. In Emotional and Intelligent: The Tangled Knot of Cognition, 164–169. AAAI.

Velásquez, J. D. 1998b. When robots weep: Emotional memories and decision-making. In Proceedings of AAAI-98, 70–75. 545 Technology Square, NE43-935, Cambridge, MA 02139: MIT AI Lab.
