AI Unit-4 Software Agents Communication
• Key interaction concepts: coordination, cooperation vs. competition, collaboration, communication
• Non-/Quasi-communicative interaction:
• Shared environment (interaction via resource/capability sharing)
• “Pheromone” communication (ant algorithms)
• Communication:
• Information exchange: sharing knowledge, exchanging views
• Collaboration, distributed planning: optimising use of resources and capabilities
Speech act theory
• Example:
• Performative: request/inform/enquire
• Propositional content: “the window is open”
• Searle (1972) identified the following categories of performatives:
• assertives/representatives (informing, making a claim)
• directives (requesting, commanding)
• commissives (promising, refusing)
• declaratives (effecting change to state of the world)
• expressives (expressing mental states)
• Ambiguity problems:
• “Please open the window!”
• “The window is open.”
• “I will open the window.”
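The separation of performative and propositional content can be illustrated with a small sketch (the `SpeechAct` class and its field names are illustrative, not part of any agent framework):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SpeechAct:
    """A speech act: a performative applied to propositional content."""
    performative: str   # e.g. "request", "inform", "commit"
    content: str        # e.g. "the window is open"

# The same propositional content under three different performatives:
acts = [
    SpeechAct("request", "the window is open"),  # "Please open the window!"
    SpeechAct("inform",  "the window is open"),  # "The window is open."
    SpeechAct("commit",  "the window is open"),  # "I will open the window."
]

# Identical content, distinct performatives:
assert len({a.content for a in acts}) == 1
assert len({a.performative for a in acts}) == 3
```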
• Austin's conditions for the success of a performative:
1. There must be an accepted conventional procedure for the performative
2. The procedure must be executed correctly and completely
3. The act must be sincere, and any uptake must be completed as far as possible
• Searle’s properties for success of (e.g.) a request:
1. I/O conditions (ability to hear request, normal situation)
2. Preparatory conditions must hold (requested action can be
performed, speaker must believe this, hearer will not perform action
anyway)
3. Sincerity conditions (wanting the action to be performed)
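Searle's three conditions for a successful request can be sketched as a predicate over boolean inputs (all parameter names are illustrative):

```python
def request_succeeds(hearer_can_hear, situation_normal,
                     action_possible, speaker_believes_possible,
                     hearer_would_act_anyway, speaker_wants_action):
    """Sketch of Searle's success conditions for a request."""
    io_ok = hearer_can_hear and situation_normal               # 1. I/O conditions
    prep_ok = (action_possible and speaker_believes_possible
               and not hearer_would_act_anyway)                # 2. preparatory conditions
    sincere = speaker_wants_action                             # 3. sincerity condition
    return io_ok and prep_ok and sincere

assert request_succeeds(True, True, True, True, False, True)
# Fails if the hearer would perform the action anyway:
assert not request_succeeds(True, True, True, True, True, True)
```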
KQML/KIF
• KQML – Knowledge Query and Manipulation Language
• An “outer” language, defines various acceptable performatives
• Example performatives:
• ask-if(‘is it true that...’)
• perform(‘please perform the following action...’)
• tell(‘it is true that...’)
• reply(‘the answer is ...’)
• Message format:
(performative
:sender <word> :receiver <word>
:in-reply-to <word> :reply-with <word>
:language <word> :ontology <word>
:content <expression>)
Example
(advertise
:sender Agent1
:receiver Agent2
:in-reply-to ID1
:reply-with ID2
:language KQML
:ontology kqml-ontology
:content (ask
:sender Agent1
:receiver Agent3
:language Prolog
:ontology blocks-world
:content "on(X,Y)"))
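A minimal Python sketch of building such messages as s-expression strings (the `kqml` helper is hypothetical; real KQML implementations handle quoting and parsing far more carefully):

```python
def kqml(performative, **params):
    """Render a KQML message as an s-expression string (minimal sketch).
    Underscores in keyword names become hyphens, e.g. in_reply_to -> :in-reply-to."""
    fields = " ".join(f":{k.replace('_', '-')} {v}" for k, v in params.items())
    return f"({performative} {fields})"

# Reconstructing the advertise/ask example from the slide:
inner = kqml("ask", sender="Agent1", receiver="Agent3",
             language="Prolog", ontology="blocks-world",
             content='"on(X,Y)"')
outer = kqml("advertise", sender="Agent1", receiver="Agent2",
             in_reply_to="ID1", reply_with="ID2",
             language="KQML", ontology="kqml-ontology",
             content=inner)
```

Note how the `:content` field nests a complete KQML message, each level declaring its own `:language` and `:ontology`.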
rational effect of inform: Bⱼφ (the hearer j comes to believe φ)
PROBLEM
• Here, Agent(α, j) means that agent j can perform action α, and Done(α) means that the action α has been done
• Impossible for the speaker to enforce those beliefs on the hearer!
• More generally: there is no way to verify the mental state of an agent on the basis of its (communicative) behaviour
• Alternative approaches use notion of social commitments
• “A debtor a is indebted to a creditor b to perform action c (before d )”
• Often public commitment stores are used to track the status of generated commitments
• At least (non)fulfilment of commitments can be verified
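A public commitment store can be sketched as a simple mapping from commitments to their status (class and method names are illustrative):

```python
from enum import Enum

class Status(Enum):
    ACTIVE = "active"
    FULFILLED = "fulfilled"
    VIOLATED = "violated"

class CommitmentStore:
    """Public store of social commitments: debtor owes creditor an action.
    Fulfilment or violation is publicly observable, unlike mental states."""
    def __init__(self):
        self._store = {}  # (debtor, creditor, action) -> Status

    def create(self, debtor, creditor, action):
        self._store[(debtor, creditor, action)] = Status.ACTIVE

    def fulfil(self, debtor, creditor, action):
        self._store[(debtor, creditor, action)] = Status.FULFILLED

    def violate(self, debtor, creditor, action):
        self._store[(debtor, creditor, action)] = Status.VIOLATED

    def status(self, debtor, creditor, action):
        return self._store[(debtor, creditor, action)]

cs = CommitmentStore()
cs.create("a", "b", "open-window")   # debtor a owes creditor b the action
cs.fulfil("a", "b", "open-window")
assert cs.status("a", "b", "open-window") is Status.FULFILLED
```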
Ontologies
• One aspect we have not discussed so far: how can agents ensure the
terminology they use is commonly understood?
• What are ontologies?
• philosophically speaking: a theory of the nature of being or existence
• practically speaking: a formal specification of a shared
conceptualisation
• Ontologies have become a prominent area of research, in particular with the
rise of the Semantic Web
• Many interesting problems: ontology matching and mapping, ontology
negotiation, ontology learning etc.
• For our purposes it is sufficient to know that agreement on terminology is a
prerequisite for meaningful communication
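Why a shared conceptualisation matters can be shown with a toy sketch: two agents use different local terms, and communication only works by mapping through shared ontology identifiers (all vocabularies and the `bw:` prefix here are hypothetical):

```python
# Hypothetical local vocabularies, each mapped to shared ontology terms:
agent1_terms = {"block-on": "bw:on", "table": "bw:table"}
agent2_terms = {"on-top-of": "bw:on", "surface": "bw:table"}

def translate(term, sender_map, receiver_map):
    """Map a sender's local term through the shared ontology
    to the receiver's local term."""
    shared = sender_map[term]
    inverse = {v: k for k, v in receiver_map.items()}
    return inverse[shared]

# Agent1's "block-on" is understood by Agent2 as "on-top-of":
assert translate("block-on", agent1_terms, agent2_terms) == "on-top-of"
```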
Interaction protocols
• ACLs define the syntax and semantics of individual utterances
• But they don’t specify what agent conversations look like
• This is done by interaction protocols for different types of agent
dialogues
• Interaction protocols govern the exchange of a series of messages among
agents
• Restrict the range and ordering of possible messages (effectively define
patterns of admissible sequences of messages)
• Often formalised using finite-state diagrams or “interaction diagrams” in
FIPA-AgentUML
• Define agent roles, message patterns, semantic constraints
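The finite-state view of a protocol can be sketched as a transition table that admits only certain message orderings (the states and messages below loosely echo a request-style dialogue and are illustrative, not a FIPA specification):

```python
# (current_state, message) -> next_state; anything else is inadmissible
PROTOCOL = {
    ("start",     "request"): "requested",
    ("requested", "agree"):   "agreed",
    ("requested", "refuse"):  "done",
    ("agreed",    "inform"):  "done",
}

def run(messages):
    """Check a message sequence against the protocol.
    Returns the final state, or None if an inadmissible message occurs."""
    state = "start"
    for m in messages:
        if (state, m) not in PROTOCOL:
            return None  # message not allowed in this state
        state = PROTOCOL[(state, m)]
    return state

assert run(["request", "agree", "inform"]) == "done"
assert run(["inform"]) is None  # inform before request is not admissible
```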
Contract-net Protocol