
19CSCN1602 – Machine Intelligence

Unit –II
Prabhu K, AP(SS)/CSE
Unit II : Logical Agents

Topic : Propositional Logic


Unit II

Logical Agents

Logical agents – Propositional logic – First order logic – syntax and semantics
of FOL–Using first order logic – knowledge engineering in FOL– Inference in
FOL– Unification and Lifting – Forward and backward chaining – Resolution.
CO • Apply Inference rules to the given knowledge base for theorem proving

LO • Explain Knowledge based Agent

SO • Design generic KB Agent

Topic • Logical Agent


Logical Agent
Knowledge representation

• Knowledge: Facts, information and skills acquired through experience or


education

• Knowledge representation: Representing information about the real


world in some formal language

• It also enables an intelligent machine to learn from that knowledge and


experiences so that it can behave intelligently like a human
Logical Agent

• An agent can represent knowledge of its world, its goals and the current
situation

• Logical agent has a collection of sentences in logic

• By using these logical sentences, the agent decides what to do by


inferring knowledge(conclusion)

• The conclusions determine which action or set of actions is

appropriate to achieve its goals

• Knowledge and reasoning are important to logical agents because they


enable successful behaviors to achieve a desired goal
Knowledge-based agents

• Central component of a knowledge based agent is a


knowledge base(KB)
• KB contains a set of sentences in a formal language
• Sentences are expressed using a knowledge representation
language
• Two generic functions
• TELL - add new sentences (facts) to the KB “Tell it what it
needs to know”
• ASK - query what is known from the KB “Ask what to do
next”
Example

• TELL: Father of John is Bob


• TELL: Jane is John’s sister
• TELL: John’s Father is the same as John’s sister’s
father
• ASK: Who is Jane’s Father?
Knowledge bases

• The agent must be able to:

• Represent states, actions, etc.

• Incorporate new percepts

• Update internal representations of the world

• Deduce hidden properties of the world

• Deduce appropriate actions


A simple knowledge-based agent

• Construct a sentence asserting what the agent has perceived

• Construct a sentence asking what action to do next
• Construct a sentence asserting that the chosen action was executed
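A minimal Python sketch of this loop is shown below. The class and helper names (KnowledgeBase, make_percept_sentence, and so on) are assumptions made only for illustration, not a particular library API, and ASK is stubbed out instead of running a real inference procedure.

```python
class KnowledgeBase:
    """Placeholder KB: TELL stores sentences; ASK should run inference,
    but here it only returns a canned action so the loop can be exercised."""
    def __init__(self):
        self.sentences = []

    def tell(self, sentence):
        self.sentences.append(sentence)

    def ask(self, query):
        return "Forward"   # stub: a real agent would infer the best action here

def make_percept_sentence(percept, t):
    return ("Percept", percept, t)      # assertion about what was perceived at time t

def make_action_query(t):
    return ("BestAction", t)            # "which action should I do at time t?"

def make_action_sentence(action, t):
    return ("Did", action, t)           # assertion that the action was executed

def kb_agent(kb, percept, t):
    """One step of the generic knowledge-based agent:
    TELL the percept, ASK for the best action, TELL that the action was taken."""
    kb.tell(make_percept_sentence(percept, t))
    action = kb.ask(make_action_query(t))
    kb.tell(make_action_sentence(action, t))
    return action

kb = KnowledgeBase()
print(kb_agent(kb, ["None", "None", "None", "None", "None"], 0))   # Forward
```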
Example for Knowledge Based Agent –
Wumpus World
• The Wumpus World is a cave consisting of rooms (4 X 4) connected by pathways.

• The Wumpus hides somewhere in the cave and eats anyone who enters its room.

• The Wumpus can be shot by an agent, but the agent has only one arrow.

• Some rooms contain bottomless pits that will trap anyone who enters into these rooms.

• The only goal in this environment is to find a bundle of gold
Wumpus World PEAS description
• Environment: 4 x 4 grid of rooms
• Squares adjacent to wumpus are smelly
• Squares adjacent to pit are breezy
• Glitter iff gold is in the same square
• Shooting kills wumpus if you are facing it
• Shooting uses up the only arrow
• Grabbing picks up gold if in same square
• Releasing drops the gold in same square
• Performance measure
• gold +1000, death -1000
• -1 per step, -10 for using the arrow
• Sensors: Stench, Breeze, Glitter, Bump, Scream (shot Wumpus)
• Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Wumpus world characterization

• Observable: Partial – only local perception

• Deterministic: Yes – outcomes exactly specified

• Episodic: No – sequential at the level of actions

• Static: Yes – Wumpus and pits do not move

• Discrete: Yes

• Single-agent: Yes – Wumpus is essentially a natural feature


Wumpus World

• Percepts given to the agent


1. Stench
2. Breeze
3. Glitter
4. Bump (ran into a wall)
5. Scream (wumpus has been hit by arrow)

Principal difficulty: the agent is initially ignorant of the configuration of the
environment – it has to reason to figure out where the gold is without
getting killed!
Exploring a wumpus world
Percepts given to the agent
1. Stench
2. Breeze
3. Glitter
4. Bump (ran into a wall)
5. Scream (wumpus has been hit by arrow)

Sensors: {[Stench, Breeze, Glitter, Bump, Scream ]}

The first percept is [None,None,None,None,None]


[Figure: two 4×4 grids. Initial state: the agent A is in [1,1], which is OK; since the first percept is all None, the adjacent squares [1,2] and [2,1] are also marked OK. After moving to [2,1]: [1,1] is visited (V), the agent perceives a Breeze (B) in [2,1], so [2,2] and [3,1] are possible pits (P?).]

The second percept is: [None,Breeze,None,None,None]


[Figure: two 4×4 grids. After the breeze in [2,1], the agent returns to [1,1] and moves up to [1,2], where it perceives a Stench (S). Combining the percepts, the wumpus must be in [1,3] (W?), [2,2] is OK, and the pit must be in [3,1] (P?). Visited squares are marked V.]

The Third percept is: [Stench,None,None,None,None]


[Figure: four 4×4 grids summarizing the exploration so far. With [2,2] known to be OK, the agent moves to [2,2] and then to [2,3], where it perceives Stench, Breeze and Glitter (G): the gold is in [2,3]; [1,3] remains the suspected wumpus square (W?) and [3,1] the suspected pit (P?).]

The Fourth percept is: [Stench,Breeze,Glitter,None,None]


Logic in general
• Logics are formal languages for representing information such that conclusions
can be drawn

• Syntax defines how symbols can be put together to form the sentences in the
language Ex: x+4=6 is correct but 4x=6+ is wrong

• Semantics define the "meaning" of sentences;

• i.e., define truth of a sentence in a world (given an interpretation)

• E.g., the language of arithmetic

• x+2 ≥ y is a sentence; x2+y > {} is not a sentence

• x+2 ≥ y is true iff the number x+2 is no less than the number y

• x+2 ≥ y is true in a world where x = 7, y = 1

• x+2 ≥ y is false in a world where x = 0, y = 6


• Ontological Commitment – What exists in the world
• Epistemological Commitment- What an agent believes about facts
Entailment
• Entailment is the relationship between two sentences where the truth of
one (A) requires the truth of the other (B)

• Ex: the sentence (A) “The president was assassinated” entails (B) “The
president is dead”: A ╞ B

• Entailment means that one thing follows logically from another:

KB ╞ α

• Knowledge base KB entails sentence α if and only if α is true in all worlds


where KB is true
• E.g., the KB containing “the Phillies won” and “the Reds won” entails
“Either the Phillies won or the Reds won”
• E.g., x+y = 4 entails 4 = x+y
Entailment Example

• (x=0) ╞(xy=0)
• (p=True) ╞(p∨q)
• (p∧q) ╞ (p∨q)
• ((p⇔q) ∨ r) ╞ (q⇒p)
• (q⇒p) ╞ ((p⇔q) ∨ r)?
Models
• Logicians typically think in terms of models, which are formally structured
worlds with respect to which truth can be evaluated

• m is a model of a sentence α if α is true in m. Ex: m is {x=0, y=12}, α is xy=0

• M(α) is the set of all models of α

• Then KB ╞ α iff M(KB) ⊆ M(α)

E.g.

KB = Phillies won and Yankees won

α = Phillies won
Entailment in the wumpus world

• Situation after detecting nothing in [1,1], moving right, breeze in [2,1]

• Consider possible models for KB assuming only pits

• 3 Boolean choices ⇒ 8 possible models


Wumpus possible models
Wumpus models

KB = wumpus-world rules + observations


Wumpus models

KB = wumpus-world rules + observations


α1 = “there is no pit in [1,2]", KB ╞ α1, proved by model checking
Wumpus models

KB = wumpus-world rules + observations


α2 = “there is no pit in [2,2]", KB ⊭ α2
The agent cannot conclude that there is no pit in [2,2]
Inference and Entailment
• Inference is a procedure that allows new sentences to be derived from a
knowledge base.

• Understanding inference and entailment: think of

• Set of all consequences of a KB as a haystack

• α as the needle

• Entailment is like the needle being in the haystack

• Inference is like finding it


Inference algorithm

• KB ├i α = sentence α can be derived from KB by inference procedure i

• Soundness: An inference algorithm that derives only entailed sentence

• Completeness: An inference algorithm can derive any sentence that is


entailed

• If KB is true in the real world ,then any sentence α derived from KB by a


sound inference procedure is also true in the real world
Inference

• Sentences are physical configurations of the agent and reasoning is a


process of constructing new physical configurations from old ones
• Logical reasoning should ensure that the new configurations represent
aspects of the world that actually follow from the aspects that the old
configurations represent
Propositional logic
Propositional logic
• Represent facts which are either true or false
• The propositional logic is also called as Boolean Logic.
• The sentence / statement is declarative, which is either true or false, but can
not be both.
• Questions, opinions, and commands are not allowed in propositional logic.
• E.g.
• Students are studying in college (True Proposition)
• 5 + 3 = 8 (True Proposition)
• print(“Hello World”) (not a proposition – it is a command, not a declarative sentence)
• 4 + 2 = 5 (False Proposition)
• What is your name? (not accepted / syntax error)
• Some students are intelligent (not accepted – it may be true for some students and false for
others, so it cannot be assigned a single truth value)
Question

a) There are 7 days in a week

b) The sun rises in the west

c) 3+3=7

d) How are you?

e) What is your name?


Translation of English sentence to Propositional
logic
A Formal Grammar of Propositional Logic

A BNF (Backus-Naur Form) grammar of sentences in propositional logic


Truth tables for connectives
Propositional logic: Syntax

• Syntax - defines the allowable sentences


• The atomic sentences- the individual syntactic elements-
consist of a single proposition symbol.
• Each such symbol stands for a proposition that can be true
or false.
• We will use uppercase names for symbols: P, Q, R, and so on.
• For example,
• W1,3 stand for the proposition that the Wumpus is in [1,3].
Propositional logic: Syntax
• Complex sentences : constructed from simpler sentences using logical
connectives (logical opeators)
• ¬ (not) : called the negation, A literal is either an atomic sentence (a
positive literal) or a negated atomic sentence (a negative literal).
• ˄ (and) : A sentence whose main connective is called a conjunction; its
parts are the conjuncts.
• V (or) : a disjunction of the disjuncts

• ⇒ (implies) : is called an implication (or conditional).

• ⇔ (if and only if) : The sentence is a biconditional.


Example

• The proposition symbols P1, P2 etc. are (atomic) sentences

• If S is a sentence, (¬S) is a sentence (negation)

• If S1 and S2 are sentences, (S1 ∧ S2) is a sentence (conjunction)

• If S1 and S2 are sentences, (S1 ∨ S2) is a sentence (disjunction)

• If S1 and S2 are sentences, (S1 ⇒ S2) is a sentence (implication)

• If S1 and S2 are sentences, (S1 ⇔ S2) is a sentence (biconditional)


Propositional logic: Semantics
• The semantics defines the rules for determining the truth of a sentence with
respect to a particular model.
• Each model specifies true/false for each proposition symbol
E.g.   P1,2    P2,2    P3,1
       false   true    false

Rules for evaluating truth with respect to a model m:

¬S is true iff S is false
S1 ∧ S2 is true iff S1 is true and S2 is true (conjunction)
S1 ∨ S2 is true iff S1 is true or S2 is true (disjunction)
S1 ⇒ S2 is true iff S1 is false or S2 is true (implication)
        i.e., it is false iff S1 is true and S2 is false
S1 ⇔ S2 is true iff S1 ⇒ S2 is true and S2 ⇒ S1 is true (biconditional)

Simple recursive process evaluates an arbitrary sentence, e.g.,

¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (true ∨ false) = true ∧ true = true
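A minimal sketch of this recursive evaluation in Python, assuming sentences are written as nested tuples with connective names chosen just for this illustration:

```python
def pl_true(sentence, model):
    """Evaluate a propositional sentence in a model (dict: symbol -> bool)."""
    if isinstance(sentence, str):                    # proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return all(pl_true(a, model) for a in args)
    if op == "or":
        return any(pl_true(a, model) for a in args)
    if op == "implies":
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError("unknown connective: " + op)

model = {"P12": False, "P22": True, "P31": False}
sentence = ("and", ("not", "P12"), ("or", "P22", "P31"))   # ¬P1,2 ∧ (P2,2 ∨ P3,1)
print(pl_true(sentence, model))                            # True
```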
Example

• A-It is hot
• B-It is humid
• C-It is raining
• Conditions:
• If it is humid, then it is hot
• B -> A
• If it is hot and humid, then it is not raining
• A ∧ B -> ¬C
A simple KB Rules :Wumpus world
sentences
Px,y is true if there is a pit in [x,y]

Wx,y is true if there is a wumpus in [x,y],dead or alive

Bx,y is true if the agent perceives a breeze in [x,y]

Sx,y is true if the agent perceives a stench in [x,y]

There is no pit in [1,1]

A square is breezy if and only if there is a pit in a neighboring square

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)

Now after visiting [1,1], [1,2] and [2,1]

R4: ¬B1,1
R5: B2,1

KB = R1 ∧ R2 ∧ R3 ∧ R4 ∧ R5
Inference Goal

• Want to find whether KB says there is no pit in [1,2]

• Does KB ╞ ¬P1,2?

• We say ¬P1,2 is a sentence α

• Main Goal: decide whether KB ╞ α


• α can be a much more complex query

• In the Wumpus world there are 7 relevant symbols:

• B1,1, B2,1, P1,1, P1,2, P2,1, P2,2, P3,1

• 2^7 = 128 models.
• For each model, check that if KB is true in it, then α is also true in it
Inference :All possible models

Does KB ╞ ¬P1,2?
KB is true in only three of the 128 models
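A small model-checking sketch in the spirit of TT-ENTAILS: it enumerates all 2^7 = 128 assignments to the seven symbols, keeps the models in which R1–R5 hold, and checks that ¬P1,2 holds in all of them. The symbol names and encoding are assumptions made for this illustration.

```python
from itertools import product

symbols = ["B11", "B21", "P11", "P12", "P21", "P22", "P31"]

def kb_true(m):
    """R1 ... R5 from the wumpus-world KB above."""
    r1 = not m["P11"]                                    # R1: ¬P1,1
    r2 = m["B11"] == (m["P12"] or m["P21"])              # R2: B1,1 ⇔ (P1,2 ∨ P2,1)
    r3 = m["B21"] == (m["P11"] or m["P22"] or m["P31"])  # R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
    r4 = not m["B11"]                                    # R4: ¬B1,1
    r5 = m["B21"]                                        # R5: B2,1
    return r1 and r2 and r3 and r4 and r5

models = [dict(zip(symbols, values)) for values in product([True, False], repeat=7)]
kb_models = [m for m in models if kb_true(m)]
print(len(kb_models))                              # 3 models satisfy the KB
print(all(not m["P12"] for m in kb_models))        # True, so KB ╞ ¬P1,2
```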
First order logic
Drawbacks of propositional logic
• Propositional logic is very simple and declarative

• Propositional logic has lack of data structures in programming

• Propositional logic is compositional:

• meaning of B1,1 ∧P1,2 is derived from meaning of B1,1 and of P1,2

• Meaning in propositional logic is context-independent

• Propositional logic is not sufficient for complex or natural language sentences, i.e.,
we cannot represent quantified statements such as all, some, or none

• E.g. Some students in MCET are intelligent

• Propositional logic has very limited expressive power

• E.g., cannot say "pits cause breezes in adjacent squares“

• except by writing one sentence for each square


First-order logic

• First order logic is also known as predicate logic or first order predicate
logic

• First-order logic, like natural language has well defined syntax and
semantics.

• First order logic (like natural language) assumes the world contains

• Objects : people, houses, numbers, colors, baseball games, wars

• Relations: red, round, prime, brother of, bigger than ,part of, comes
between…
• Functions: father of, best friend, one more than, plus, …
Properties of FOL

• It has ability to represent facts about some or all of the objects and
relations in the universe

• Represent law and rules extracted from real world

• Useful language for Maths ,Philosophy and AI

• Represent facts in realistic manner rather than just true or false

• Makes Ontological commitment


Syntax of FOL: Basic elements

• Constants KingJohn, 2, NUS,...


• Predicates Brother, >,...
• Functions Sqrt, LeftLegOf,...
• Variables x, y, a, b,...
• Connectives ¬, ∧, ∨, ⇒, ⇔
• Equality =
• Quantifiers ∀, ∃
• Sentences Atomic / complex sentences
• Atomic sentence True / False / AP (Atomic Proposition)
• Complex sentence ¬(sentence) / sentence connective sentence / quantifier variable sentence
Term

• A term is a logical expression that refers to an object.

• Constant symbols are therefore terms, but it is not always convenient to


have a distinct symbol to name every object.

• For example, in English we might use the expression "King John's left leg"
rather than giving a name to his leg.

• This is what function symbols are for: instead of using a constant symbol,
we use LeftLeg(John).
Atomic sentences

Atomic sentence =predicate (term1,...,termn) or term1 = term2

term =function (term1,...,termn) or constant or variable

• E.g.,
• Brother(KingJohn,RichardTheLionheart)
• > (Length(LeftLegOf(Richard)), Length(LeftLegOf(KingJohn)))

• Atomic sentence with complex terms

• Ex:Married (Father( Richard), Mother( John))

• An atomic sentence predicate(term1,...,termn) is true iff the objects referred to

by term1,...,termn are in the relation referred to by predicate


Complex sentences

• Complex sentences are made from atomic sentences using connectives

¬S, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2
• Examples
• Sibling(KingJohn,Richard) ⇒ Sibling(Richard,KingJohn)
• >(1,2) ∨ ≤(1,2)
• >(1,2) ∧ ¬>(1,2)
• Brother(LeftLeg(Richard),John)
• Brother(Richard,John) ∧ Brother(John,Richard)
• King(Richard) ∨ King(John)
• ¬King(Richard) ⇒ King(John)
Truth in first-order logic
• Syntactic elements of first order logic are the symbols that stand for objects,
relations and functions

• Symbols three kinds

• constant symbols → objects

• predicate symbols → relations

• function symbols → functional relations

• Sentences are true with respect to a model and an interpretation

• Model contains objects (domain elements) and then relations among them
and functions

• Every model provide information required to determine if any given


sentence is true or false
Symbols and interpretations

• Each model includes an interpretation that specifies exactly which objects,


relations and functions are referred to by the constant, predicate and
function symbols

• Intended interpretation

• Richard refers to Richard the Lionheart and John refers to the evil King John
• Brother refers to the brotherhood relation

• LeftLeg refers to the left leg function

• There are five objects in the model so there are 25 possible interpretations
for the constant symbols Richard and John
Models for FOL: Example

Richard the Lionheart was a King of


England from 1189 to1199;
his younger brother was the evil King
John, who ruled from 1199 to 1215;
The left legs of Richard and John were
different;
And John had a crown(Because he was
a king).
Models for FOL
• Objects
• Person king John
• Person Richard
• Crown
• Left leg of John
• Left leg of Richard
• Relation
• onhead<the crown,king john>
• brother<John,Richard>
• Person<Richard>
• King<John>
• Function
• [no other person wears crown
except the king]
• <John the king>-onhead(crown)
• <John the king>-shoe(left-leg)
Semantics of FOL

• The semantics of FOL assigns a denotation to all symbols

• It also determines a domain that specifies the range of the


quantifiers

• Each term is assigned an object

• Each predicate is assigned a property of objects

• Each sentence is assigned a truth value

• In this way FOL provides meaning to the terms , the predicates and
formulas of the language
Quantifiers

• A quantifier is a language element which generates quantification

• These are the symbols that permit to determine or identify the range
and scope of the variable in the logic expression

• There are two types of quantifiers

• Universal quantifier(for all,everyone,everything)

• Existential quantifier(for some,atleast one)


Universal quantification
• It is a symbol of logical representation which specifies that the statement
within its range is true for everything or every instance of particular thing
• It is represented by the symbol ∀
• Syntax: ∀<variables> <sentence>
• “Everyone at MCET is smart”: ∀ x At(x,MCET) ⇒ Smart(x)
• "All kings are persons” : ∀ x King(x) ⇒ Person(x)
• ∀x P is true in a model m iff P is true with x being each possible object in the
model
• ∀x P is equivalent to the conjunction of instantiations of P
• Ex: (At(KingJohn,MCET) ⇒ Smart(KingJohn))
   ∧ (At(Richard,MCET) ⇒ Smart(Richard))
   ∧ (At(Jane,MCET) ⇒ Smart(Jane))
   ∧ ...
A common mistake to avoid

• Typically, ⇒ is the main connective with ∀

• Common mistake: using ∧ as the main connective with ∀

• ∀ x At(x,MCET) ∧ Smart(x)

• means “Everyone is at MCET and everyone is smart”, which is a mistake

because we want “Everyone at MCET is smart”
Existential quantification
• It is a type of quantifier which express that the statement within its
scope is true for atleast one instance of something
• It is denoted by ∃
• Syntax: ∃<variables> <sentence>
• Someone at MCET is smart: ∃x At(x, MCET) ∧ Smart(x)
• ∃ x P is true in a model m iff P is true with x being some possible object in
the model
• “King John has a crown on his head” : ∃ x Crown(x) ∧ OnHead(x, John)
• ∃x P is equivalent to the disjunction of instantiations of P
• Ex:
   (At(KingJohn,MCET) ∧ Smart(KingJohn))
   ∨ (At(Richard,MCET) ∧ Smart(Richard))
   ∨ (At(MCET,MCET) ∧ Smart(MCET))
   ∨ ...
Another common mistake to avoid

• Typically, ∧ is the main connective with ∃

• Common mistake: using ⇒ as the main connective with ∃

• ∃x At(x, MCET) ⇒ Smart(x)

• is true as soon as there is anyone who is not at MCET, which is a mistake

because we want “Someone at MCET is smart”
Properties of quantifiers
• ∀x ∀ y is the same as ∀ y ∀ x

• ∃x ∃ y is the same as ∃ y ∃ x

• ∃ x ∀ y is not the same as ∀ y ∃ x

• ∃ x ∀ y Loves(x,y)

• “There is a person who loves everyone in the world”

• ∀ y ∃ x Loves(x,y)

• “Everyone in the world is loved by at least one person”

• Quantifier duality: each can be expressed using the other

• ∀x Likes(x,IceCream)  ≡  ¬∃x ¬Likes(x,IceCream)

• ∃x Likes(x,Broccoli)  ≡  ¬∀x ¬Likes(x,Broccoli)
Equality

• Equality symbol signifies that two terms refer to the same object

• term1 = term2 is true under a given interpretation if and only if term1


and term2 refer to the same object
• Father(John)=Henry

• To say that Richard has at least two brothers

• ∃x,y Brother(x, Richard) ∧ Brother(y, Richard) ∧ ¬(x = y)


Examples
• All birds fly

• Predicate is fly(bird)

• ∀x bird(x) ⇒ fly(x)

• Every man respects his parent

• Predicate is respect(x, y)

where

x = man

y = parent
• ∀x man(x) ⇒ respects(x, parent)

• Some boys are intelligent

• ∃x boy(x) ∧ intelligent(x)
Nested Quantifiers

• Same type / Different type Quantifiers


• Brothers are siblings
∀x ∀y Brother(x,y) ⇒Sibling(x,y) .

• Everybody Loves somebody


∀ x ∃ y Loves(x, y) .
• There is someone who is loved by everyone
∃ y ∀ x Loves(x, y) .
• Quantifier Duality
• “ Everyone likes Icecream “ ⇔“There is no one who dislikes icecream”
USING FIRST-ORDER LOGIC
Assertions and queries in first-order logic
• Sentences are added to a knowledge base using TELL. Such sentences are called
assertions
• TELL(KB, King(John))
• TELL(KB, Person(Richard))
• TELL(KB, ∀x King(x)⇒ Person(x))
• ask questions of the knowledge base using ASK
• ASK ( KB, King( John))
• returns true
• Questions asked with ASK are called queries or goals
• quantified queries
• ASK(KB, ∃ x Person(x))
• ASKVARS(KB, Person(x))
• Returns stream of answers like {x/ John} and {x/Richard} and the answer is
called substitution or binding list
The kinship domain
• domain of family relationships, or kinship
• This domain includes facts such as
• "Elizabeth is the mother of Charles"
• "Charles is the father of William"
• rules such as
• "One's grandmother is the mother of one's parent.“
• two unary predicates, Male and Female
• Kinship relations-parenthood, brotherhood, marriage, and so on are
represented by binary predicates:
• Parent, Sibling, Brother, Sister, Child, Daughter, Son, Spouse, Wife,
Husband, Grandparent, Grandchild, Cousin, Aunt, and Uncle.
• We use functions for Mother and Father,
The kinship domain
• one's mother is one's female parent
• ∀ m, c Mother( c)= m ⇔ Female(m) ∧ Parent(m, c)
• One's husband is one's male spouse:
• ∀ w,h Husband(h,w) ⇔ Male(h) ∧ Spouse (h, w)
• Male and female are disjoint categories:
• ∀x Male(x) ⇔ ¬Female(x)
• Parent and child are inverse relations:
• ∀p,c Parent(p,c) ⇔ Child(c,p)
• A grandparent is a parent of one's parent:
• ∀ g , c Grandparent(g,c) ⇔ ∃ p Parent(g,p) ∧ Parent(p,c)
• A sibling is another child of one's parents:
• ∀x,y Sibling(x, y) ⇔ x ≠ y ∧ ∃p Parent(p, x) ∧ Parent(p, y)
The wumpus world
• First-order sentence stored in the knowledge base must include both the
percept and the time at which it occurred
• Percept([stench, breeze, glitter, none, none], 5) .

• Actions in the wumpus world can be represented by logical terms

• Turn(right), turn(left), forward, shoot, grab, climb .

• To determine which is best, the agent program executes the query

• AskVars(∃a BestAction(a,5))

• Which returns a binding list such as { a/ grab}

• The agent program can then return grab as the action to take
The wumpus world
• Can encode raw percept

• ∀t,s,g,m,c Percept([s,Breeze,g,m,c],t) ⇒ Breeze(t)

• ∀ t,s,b,m,c Percept([s,b, Glitter,m,c],t) ⇒ Glitter(t)

• Reflex Actions: ∀ t Glitter(t) ⇒ BestAction(Grab,t) .

• Instead of encoding stuff like

• Adjacent( square1,2,square1,1)

• Adjacent(square3,4,square4,4)

• Encode

• ∀x,y,a,b Adjacent([x,y], [a,b]) ⇔ (x = a ∧ (y = b − 1 ∨ y = b + 1)) ∨ (y = b ∧ (x = a − 1 ∨ x = a + 1))
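For illustration only (it is not part of the logical KB), the same adjacency condition can be written as an ordinary Python predicate over [x, y] squares:

```python
def adjacent(sq1, sq2):
    """True iff the two squares differ by exactly one step horizontally or vertically."""
    (x, y), (a, b) = sq1, sq2
    return (x == a and abs(y - b) == 1) or (y == b and abs(x - a) == 1)

print(adjacent((1, 2), (1, 1)))   # True
print(adjacent((3, 4), (4, 4)))   # True
print(adjacent((1, 1), (2, 2)))   # False – diagonal squares are not adjacent
```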
KNOWLEDGE ENGINEERING IN FIRST-ORDER LOGIC
KNOWLEDGE ENGINEERING IN FIRST-ORDER
LOGIC

• A knowledge engineer is someone who investigates a particular domain,


learns what concepts are important in that domain, and creates a formal
representation of the objects and relations in the domain
• General purpose knowledge base
• Support queries about full range of human knowledge.
• In this we can expect any kind of query, which knowledge base will have
to infer.
• Special purpose knowledge base
• Which has restricted domain (problem specific), here expected queries
are known in advance.
Steps in knowledge-engineering process

1. Identify the task


2. Assemble the relevant knowledge
3. Decide on a vocabulary of predicates, functions, and constants
4. Encode general knowledge about the domain
5. Encode a description of the specific problem instance
6. Pose queries to the inference procedure and get answers
7. Debug the knowledge base
The electronic circuits domain

One bit- full adder


1.Identify the task

• Identify the task similar to PEAS design.

• Knowledge engineer must describe the range of question that the KB will
support

• Find the facts that available for each specific problem instance

• Does the circuit actually add properly? (circuit verification)


2.Assemble the relevant knowledge

• Composed of wires and gates;

• Types of gates (AND, OR, XOR, NOT)

• Irrelevant: size, shape, color, cost of gates


3.Decide on a vocabulary
• Translate the important domain level concepts into logic level names.
• Once the choice among predicates, functions and constants have been made, the
result is vocabulary, which is Ontology of domain.
• Gate X1: Type(X1) = XOR, or Type(X1, XOR), or XOR(X1)
• Gate X2: Type(X2) = XOR, or Type(X2, XOR), or XOR(X2)
• Gate A1: Type(A1) = AND, or Type(A1, AND), or AND(A1)
• Gate A2: Type(A2) = AND, or Type(A2, AND), or AND(A2)
• Gate O1: Type(O1) = OR, or Type(O1, OR), or OR(O1)
4.Encode general knowledge of the domain
• ∀t1,t2 Connected(t1, t2) ⇒ Signal(t1) = Signal(t2) (t=terminal, g=gate)
• ∀t Signal(t) = 1 ∨ Signal(t) = 0, 1 ≠ 0
• ∀t1,t2 Connected(t1, t2) ⇒ Connected(t2, t1)
• ∀g Type(g) = OR ⇒ Signal(Out(1,g)) = 1 ⇔∃n Signal(In(n,g)) = 1
• ∀g Type(g) = AND ⇒ Signal(Out(1,g)) = 0 ⇔∃ n Signal(In(n,g)) = 0
• ∀g Type(g) = XOR ⇒ Signal(Out(1,g)) = 1 ⇔ Signal(In(1,g)) ≠ Signal(In(2,g))
• ∀g Type(g) = NOT ⇒ Signal(Out(1,g)) ≠ Signal(In(1,g))
5.Encode the specific problem instance
• Type (X1) = XOR Type(X2) = XOR
• Type(A1) = AND Type(A2) = AND
• Type(O1) = OR Type(C1) = Circuit
• Connected(Out(1,X1),In(1,X2)) Connected(In(1,C1),In(1,X1))
• Connected(Out(1,X1),In(2,A2)) Connected(In(1,C1),In(1,A1))
• Connected(Out(1,A2),In(1,O1)) Connected(In(2,C1),In(2,X1))
• Connected(Out(1,A1),In(2,O1)) Connected(In(2,C1),In(2,A1))
• Connected(Out(1,X2),Out(1,C1)) Connected(In(3,C1),In(2,X2))
• Connected(Out(1,O1),Out(2,C1)) Connected(In(3,C1),In(1,A2))
6.Pose queries to the inference procedure
• What are the possible sets of values of all the terminals for the adder circuit?
• ∃i1,i2,i3,o1,o2 Signal(In(1,C1)) = i1 ∧ Signal(In(2,C1)) = i2 ∧ Signal(In(3,C1)) =
i3 ∧ Signal(Out(1,C1)) = o1 ∧ Signal(Out(2,C1)) = o2
• The variables i1, i2, i3, o1, o2 are substituted with values (1/0).
• The query returns the complete set of input and output values for the device.
• It can be used to check that the circuit adds its inputs correctly.
• This is called circuit verification.
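For comparison, the same check can be mimicked by directly simulating the five gates in Python and enumerating all input combinations. This is a plain simulation sketch, not the logical inference that the KB and ASK would perform:

```python
from itertools import product

def full_adder(i1, i2, i3):
    """One-bit full adder built from the gates X1, X2, A1, A2, O1 above."""
    x1 = i1 ^ i2          # XOR gate X1
    o1 = x1 ^ i3          # XOR gate X2 -> sum output Out(1,C1)
    a1 = i1 & i2          # AND gate A1
    a2 = x1 & i3          # AND gate A2
    o2 = a2 | a1          # OR gate O1  -> carry output Out(2,C1)
    return o1, o2

for i1, i2, i3 in product([0, 1], repeat=3):
    s, c = full_adder(i1, i2, i3)
    assert s + 2 * c == i1 + i2 + i3     # the circuit really adds its three inputs
    print(i1, i2, i3, "->", s, c)
```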
7. Debug the knowledge base

• We have to see the knowledge base in different ways


• System unable to give output in no signals
• If all inputs are 000, then the output also 00,and etc.
• May have omitted the assertions like 1 ≠ 0
Unit II

Logical Agents

Logical agents – Propositional logic – First order logic – syntax and semantics
of FOL–Using first order logic – knowledge engineering in FOL– Inference in
FOL– Unification and Lifting – Forward and backward chaining – Resolution.
Inference in FOL
Inference in FOL

• Inference in FOL is used to generate new sentences from existing


sentences

• Definition

• An expression X logically follows from a set S, if every interpretation


that satisfies S also satisfies X
• The function of logical inference is to produce new sentence that
logically follow a given set of FOL sentence
Universal Instantiation(UI)/ Universal
Elimination(UE)

• UI says that we can infer (produce) any sentence obtained by


substituting a ground term for the variable

• We use the notion of Substitutions for those instantiations

• Let SUBST(Θ,α) denote the result of applying the substitution Θ to the


sentence α

∀v α
SUBST({v/g}, α)

for any variable v and ground term g


Substitutions

• Eg.KB contains “all greedy kings are evil”

• ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields (eliminating ∀):

• SUBST({x/John})

• King(John) ∧ Greedy(John) ⇒ Evil(John)

• SUBST({x/Richard})

• King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)

• SUBST({x/Father(John)})

• King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))
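A tiny sketch of SUBST, assuming (purely for illustration) that sentences are encoded as nested tuples and substitutions as Python dictionaries:

```python
def subst(theta, sentence):
    """Apply substitution theta (dict: variable -> ground term) to a sentence
    represented as nested tuples of strings."""
    if isinstance(sentence, str):
        return theta.get(sentence, sentence)
    return tuple(subst(theta, part) for part in sentence)

# ∀x King(x) ∧ Greedy(x) ⇒ Evil(x), with the quantifier already dropped
rule = ("implies", ("and", ("King", "x"), ("Greedy", "x")), ("Evil", "x"))
print(subst({"x": "John"}, rule))
# ('implies', ('and', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
```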


Existential instantiation(EI)/ Existential
Elimination
• For any sentence α, variable v, and constant symbol k that does not
appear elsewhere in the knowledge base:

∃v α
SUBST({v/k}, α)

• Eg: ∃x Crown(x) ∧ OnHead(x, John) yields (eliminating ∃)

• Crown(C1) ∧ OnHead(C1, John)

• Skolem constant
• provided C1 is a new constant symbol which is not in KB but satisfy
all properties of ‘x’ called a Skolem constant
• Skolemization-replacing variables with ground terms
Reduction to propositional inference
Suppose the KB contains just the following:

∀x King(x) ∧ Greedy(x) ⇒ Evil(x)

King(John)

Greedy(John)

Brother(Richard,John)
• Instantiating the universal sentence in all possible ways, we have:

King(John) ∧ Greedy(John) ⇒ Evil(John)

King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)

King(John)

Greedy(John)

Brother(Richard,John)
• The new KB is propositionalized: proposition symbols are
King(John), Greedy(John), Evil(John), King(Richard), etc.
Problem with propositionalization

• Propositionalization seems to generate lots of irrelevant sentences

• King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)

• With function symbols ,there are infinite ground terms

• Father(Father(Father(John)))
Unit II

Logical Agents

Logical agents – Propositional logic – First order logic – syntax and semantics
of FOL–Using first order logic – knowledge engineering in FOL– Inference in
FOL– Unification and Lifting – Forward and backward chaining – Resolution.
Unification and Lifting
Unification
• Unification is a kind of binding logic between two or more variables

• In propositional logic it is easy to determine that two literals can not both be
true at the same time

• Eg: Man(Sunil) and ¬Man(Sunil) is a contradiction

• While Man(Sunil) and ¬Man(Arun) is not

• In predicate logic this matching process is more complicated, since bindings


of variables must be considered

• In predicate logic in order to determine contradiction we need a matching


procedure that compares two literals and discovers whether there exist a set
of substitutions that make them identical
Unification
• The goal of unification is to make two expressions look like identical by using
substitution.
• It means the meaning of the sentence should not be changed ,but it should be
expressed in multiple ways
• Unification is an algorithm for determining the substitutions needed to make
two FOL expressions match
• The UNIFY algorithm in unification takes two sentences as input and then
returns a unifier if one exist
• Substitution means replacing one variable with another term
• It takes two literals as input and makes them identical using substitution
• It returns fail if the expressions do not match with each other
UNIFY(p,q)=Θ where SUBST(Θ,p)=SUBST(Θ,q)
Example

UNIFY(p,q)=Θ where
SUBST(Θ,p)=SUBST(Θ,q)
• We ask ASKVARS(knows(john,x)) whom does John know?

• UNIFY(knows(John,x),Knows(John,Jane))={x/Jane}

• UNIFY(Knows(john,x),Knows(y,Bill))={y/John,x/Bill}

• UNIFY(Knows(John,x),Knows(y,Mother(y)))={y/
John,x/Mother(John))}
Example
• Let's say there are two different expressions

• P(x,y) and P(a,f(z))

• We need to make both above statements identical to each other

• Perform substitution

• P(x,y)---- ( i)

• P(a,f(z))….. (ii)

• Substitute x with ‘a’ and y with ‘f(z)’ in the first expression and it will be
represented as a/x and f(z)/y

• With both the substitutions the first expression will be identical to the
second expression and the substitution set will be [a/x, f(z)/y].
Example 1

• Given Knows(Ram,x) is a predicate

• Whom does Ram knows?

• The UNIFY algorithm will search all the related sentences in the knowledge base which

could unify with Knows(Ram,x)


• UNIFY(Knows(Ram,x),Knows(Ram,Shyam))={x/Shyam}

• UNIFY(Knows(Ram,x),Knows(y,Aakash))={x/Aakash,y/Ram}

• UNIFY(Knows(Ram,x),Knows(x,Raman)=fails

• The last one failed because we have used the same variable for two persons or the two

sentences happen to use the same variable name x


• Unifications are attempted only with sentences that have some chance of unifying

• For example there is no point in trying to

• UNIFY knows(Ram , x) with Brother(Laxman, Ram)


Example 2

P q θ
Knows(John,x) Knows(John,Jane) {x/Jane}

Knows(John,x) Knows(y,Jack) {x/Jack,y/John}

Knows(John,x) Knows(y,Mother(y)) {y/John,x/Mother(John)}

Knows(John,x) Knows(x,Jack) Fail


Conditions for Unification
• Predicate symbol must be same, atoms or expression with different
predicate symbol can never be unified
• Tryassassinate(Marcus, Caesar)

• Hate(Marcus,Caesar)

• Number of arguments in both expressions must be identical

• Hate(Marcus)

• Hate(Marcus,Caesar)

• Unification will fail if there are two similar variables present in the same
expression

• UNIFY(Knows(Ram,x),Knows(x,Ram)=fails
Unification algorithm
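Since the UNIFY pseudocode figure is not reproduced here, the following is a compact Python sketch in the same spirit. The encoding is an assumption made for illustration: variables are lowercase strings, constants and predicate names start with an uppercase letter, compound expressions are tuples, and the occur check is omitted for brevity.

```python
def is_variable(t):
    return isinstance(t, str) and t[:1].islower()

def unify(x, y, theta):
    """Return a most general unifier extending theta (a dict), or None on failure."""
    if theta is None:                       # an earlier step already failed
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        return unify(x[1:], y[1:], unify(x[0], y[0], theta))
    return None                             # mismatched predicates or arities

def unify_var(var, x, theta):
    if var in theta:
        return unify(theta[var], x, theta)
    if is_variable(x) and x in theta:
        return unify(var, theta[x], theta)
    return {**theta, var: x}                # bind var to x (no occur check)

print(unify(("Knows", "John", "x"), ("Knows", "John", "Jane"), {}))
# {'x': 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "y", ("Mother", "y")), {}))
# {'y': 'John', 'x': ('Mother', 'y')} – x is bound to Mother(y), y to John
print(unify(("Knows", "John", "x"), ("Knows", "x", "Jack"), {}))
# None – the same variable x cannot stand for both John and Jack
```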
A first order inference rule

• Suppose KB

• ∀x King(x) ∧ Greedy(x) ⇒Evil(x)

• King( John)

• Greedy (John)

• Apply UI using {x/John} and {y/John}


Inference
Generalized Modus Ponens

• For atomic sentences pi, pi', and q, where there is a substitution Θ such
that SUBST(Θ, pi') = SUBST(Θ, pi) for all i,

p1', p2', ..., pn', (p1 ∧ p2 ∧ ... ∧ pn ⇒ q)

SUBST(Θ, q)

p1' is King(John)          p1 is King(x)

p2' is Greedy(y)           p2 is Greedy(x)

Θ is {x/John, y/John}      q is Evil(x)

SUBST(Θ, q) is Evil(John)
• p ╞ SUBST(Θ,p)
Unit II

Logical Agents

Logical agents – Propositional logic – First order logic – syntax and semantics
of FOL–Using first order logic – knowledge engineering in FOL– Inference in
FOL– Unification and Lifting – Forward and backward chaining – Resolution.
Forward chaining
Inference Engine

• The inference engine is the component of the intelligent system in


artificial intelligence, which applies logical rules to the Knowledge Base
to infer new information from known facts.

• The first inference engine was part of the expert system.

• Inference engine commonly proceeds in two modes, which are:

• 1. Forward chaining 2. Backward chaining


Horn Clause and Definite clause

• Horn clause and definite clause are the forms of sentences, which enables
knowledge base to use a more restricted and efficient inference algorithm.

• Logical inference algorithms use forward and backward chaining approaches,


which require KB in the form of the first-order definite clause.

• Definite clause: A clause which is a disjunction of literals with exactly one


positive literal is known as a definite clause or strict horn clause.

• Horn clause: A clause which is a disjunction of literals with at most one positive
literal is known as horn clause. Hence all the definite clauses are horn clauses.

• Example: (¬ p V ¬ q V k). It has only one positive literal k.

• It is equivalent to p ∧ q → k.
Forward chaining
• It is also known as a forward deduction or forward reasoning method
when using an inference engine.

• It starts with the available data and uses inference rules to extract more
data until a goal is reached

• What will happen next?

• It is a bottom up approach , as it moves from bottom to top.

• It is data driven because the reasoning starts from a set of data


Forward chaining-Example

• Example
• It is raining --A
• If it is raining, the street is wet --A⇒B
• The street is wet ---B
• Conclude from “A” and “A implies B” to “B”
A
A⇒ B
B
What do you mean by forward chaining algorithm?

• It is a process of making a conclusion based on known facts or data, by


starting from the initial state and reaches the goal state.

• The Forward-chaining algorithm start from the known facts

• it triggers all the rules whose premises are satisfied

• adding their conclusions to the known facts

• The process repeats until the query is answered or no new facts can be added.
Example

• "As per the law, it is a crime for an American to sell weapons to


hostile nations. Country Nono, an enemy of America, has some
missiles, and all the missiles were sold to it by Colonel West, who is
an American citizen.“
• Prove that " Colonel West is criminal.“
• To solve the above problem, first, we will convert all the above facts into
first-order definite clauses.
Facts Conversion into FOL
• It is a crime for an American to sell weapons to hostile nations (Let's
say x, y, and z are variables)

• American(x) ∧ weapon(y) ∧ sells(x, y, z) ∧ hostile(z) ⇒ Criminal(x) ......(1)

• Country Nono has some missiles

• ∃x Owns(Nono, x) ∧ Missile(x)

• It can be written in two definite clauses by using Existential Instantiation,


introducing new Constant M1.
• Owns(Nono, M1)---- 2

• Missile(M1)---- 3
Facts Conversion into FOL
All of the missiles were sold to country Nono by Colonel West
• ∀x Missiles(x) ∧ Owns (Nono , x) ⇒Sells (West, x, Nono)

• Can be written as
Missiles(M1) ∧ Owns (Nono , M1) ⇒Sells (West, M1, Nono) .......(4)

Missiles are weapons.


• Missile(x) ⇒ Weapons (x) ........(5)
Enemy of America is known as hostile.
• Enemy(x, America) ⇒ Hostile(x) ........(6)
Country Nono is an enemy of America.
• Enemy (Nono, America) .......(7)
West is American
• American(West). ........(8)
Forward chaining algorithm
Forward Chaining Proof
• Step-1-Facts
• In the first step , start with the known facts and choose the sentences which
do not have implications, such as:
• American(West), Enemy(Nono, America), Owns(Nono, M1), and Missile(M1).

1. American(x) ∧ weapon(y) ∧ sells(x, y, z) ∧ hostile(z) ⇒ Criminal(x)
2. Owns(Nono, M1)
3. Missile(M1)
4. Missiles(x) ∧ Owns (Nono , x) ⇒ Sells (West, x, Nono)
5. Missile(x) ⇒ Weapons (x)
6. Enemy(x, America) ⇒ Hostile(x)
7. Enemy (Nono, America)
8. American(West)
Step-2-Rules
• At the second step, see those facts which infer from available facts
and with satisfied premises.
• Rule-(1) does not satisfy premises, so it will not be added in the first iteration.
• Rule-(2) and (3) are already added.
• Rule-(4) satisfy with the substitution {x/M1}, so Sells (West, M1, Nono) is added,
which infers from the conjunction of Rule (2) and (3).
• Rule-(6) is satisfied with the substitution(x/Nono), so Hostile(Nono) is added and
which infers from Rule-(7).
Step-3 - Conclusion
• At step-3, Rule-(1) is satisfied with the substitution {x/West, y/M1, z/Nono},
so we can add Criminal(West), which is inferred from all the available facts.
Hence we have reached our goal statement.
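The same derivation can be reproduced with a minimal forward-chaining sketch. To keep it short, the rules below are written already instantiated with the constants discovered in the proof (x/West, y/M1, z/Nono), so plain set operations suffice; a full FOL-FC-ASK would find these bindings by unification.

```python
facts = {"American(West)", "Missile(M1)", "Owns(Nono,M1)", "Enemy(Nono,America)"}

rules = [  # (set of premises, conclusion), already ground for this illustration
    ({"Missile(M1)"}, "Weapon(M1)"),
    ({"Missile(M1)", "Owns(Nono,M1)"}, "Sells(West,M1,Nono)"),
    ({"Enemy(Nono,America)"}, "Hostile(Nono)"),
    ({"American(West)", "Weapon(M1)", "Sells(West,M1,Nono)", "Hostile(Nono)"},
     "Criminal(West)"),
]

def forward_chain(facts, rules, query):
    """Fire every rule whose premises are all known, add its conclusion,
    and repeat until the query is derived or nothing new can be added."""
    facts = set(facts)
    while True:
        new = {concl for prem, concl in rules
               if prem <= facts and concl not in facts}
        if not new:
            return query in facts
        facts |= new

print(forward_chain(facts, rules, "Criminal(West)"))   # True
```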
Backward chaining
Backward chaining

• It is also known as a backward deduction or backward reasoning method


using an inference engine
• It starts from goal and proceeds backward to determine the set of rules
that match the goal
• Why did this happen?
• It is top down approach
• It is goal driven
• Example: diagnose bacterial infections
Backward chaining Example
• Example
• The street is wet --B
• If it is raining, the street is wet --A⇒B
• It is raining ---A
• Conclude from “B” and “A implies B” to “A”
B
A⇒ B
A
Define backward chaining algorithm
• A backward chaining algorithm is a form of reasoning, which starts with
the goal and works backward, chaining through rules to find known facts
that support the goal.

• It is known as a top-down approach.

• Backward-chaining is based on modus ponens inference rule.

• In backward chaining, the goal is broken into sub-goal or sub-goals to


prove the facts true.

• It is also called as goal-driven approach, as a list of goals decides which


rules are selected and used.
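A minimal propositional backward-chaining sketch over definite clauses (premises → conclusion), shown on the rain/street-wet example; a full FOL-BC-ASK would in addition unify variables and compose substitutions. The encoding is an assumption made for illustration.

```python
def backward_chain(goal, rules, facts, seen=None):
    """True if the goal is a known fact, or some rule concludes it and all of
    that rule's premises can themselves be proved (working backwards)."""
    seen = set() if seen is None else seen
    if goal in facts:
        return True
    if goal in seen:                  # avoid looping on the same sub-goal
        return False
    seen = seen | {goal}
    return any(all(backward_chain(p, rules, facts, seen) for p in prem)
               for prem, concl in rules if concl == goal)

facts = {"A"}                                   # A: it is raining
rules = [({"A"}, "B")]                          # A ⇒ B: if it is raining, the street is wet
print(backward_chain("B", rules, facts))        # True – the street is wet
```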
Backward chaining Algorithm
Backward Chaining Proof
• In Backward chaining we will start with our goal predicate which is
Criminal(West) and from the goal fact we will infer other facts and at last we will
prove those facts true
• So our goal fact is “West is Criminal”
• Step 1
• At the first step we take the goal fact Criminal(West). The knowledge base is:
1. American(x) ∧ weapon(y) ∧ sells(x, y, z) ∧ hostile(z) → Criminal(x)
2. Owns(Nono, M1)
3. Missile(M1)
4. Missiles(x) ∧ Owns(Nono, x) → Sells(West, x, Nono)
5. Missile(x) → Weapons(x)
6. Enemy(x, America) → Hostile(x)
7. Enemy(Nono, America)
8. American(West)
Step 2
• At the second step we infer other facts from the goal fact using the rules that
match it. In Rule-1 the goal predicate Criminal(West) is present with the
substitution {West/x}, so we add all the conjunctive premises of Rule-1 below the
first level and replace x with West. Here American(West) is a known fact, so it is
proved.
Step 3
• At step-3 we extract the further fact Missile(y), which is inferred from Weapon(y)
as it satisfies Rule-5. Weapon(y) is also true with the substitution of the constant
M1 for y.
Step 4
• At step-4 we can infer the facts Missile(M1) and Owns(Nono, M1) from
Sells(West, M1, z), which satisfies Rule-4 with the substitution of Nono in place
of z. So these two statements are proved here.
Step 5
• At step-5 we can infer the fact Enemy(Nono, America) from Hostile(Nono), which
satisfies Rule-6. With this, every sub-goal of Criminal(West) is proved, so the goal
holds.
Forward vs. backward chaining

• FC is data-driven, automatic, unconscious processing,

• e.g., object recognition, routine decisions

• May do lots of work that is irrelevant to the goal

• BC is goal-driven, appropriate for problem-solving,

• e.g., Where are my keys? How do I get into a PhD program?

• Complexity of BC can be much less than linear in size of KB


Unit II

Logical Agents

Logical agents – Propositional logic – First order logic – syntax and semantics
of FOL–Using first order logic – knowledge engineering in FOL– Inference in
FOL– Unification and Lifting – Forward and backward chaining – Resolution.
Resolution
Resolution

• It is a theorem-proving technique that proves statements by contradiction or

refutation

• It is used when several statements are given and we need to prove
a conclusion from those statements

• Unification is a key concept in proof by resolution

• Resolution is a single inference rule which can efficiently operate on


conjunctive normal form (CNF)

• Clause: Disjunction of literals is called clause

• CNF:A sentence represented as a conjunction of clauses


Steps for Resolution

• Convert facts in to FOL

• Convert FOL statement in to CNF

• Negate the statement which need to be proved and add the result to the
knowledge base

• Draw resolution graph

• If empty clause is produced ,stop and report that original theorem is true
Example1

a. If it is sunny and warm day you will enjoy


b. If it is raining you will get wet
c. It is warm day
d. It is raining
e. It is sunny
Goal: You will enjoy
Prove: Enjoy
Example1: Step1 Conversion to FOL

If it is sunny and warm day you will enjoy


Sunny ∧warm⇒enjoy
If it is raining you will get wet
raining ⇒wet
It is warm day
warm
It is raining
raining
It is sunny
sunny
Step 2: Conversion to CNF
• Sunny ∧ warm ⇒ enjoy
Eliminate implication
¬(Sunny ∧ warm) ∨ enjoy
Move negation inside
¬Sunny ∨ ¬warm ∨ enjoy
• raining ⇒ wet
Eliminate implication
¬raining ∨ wet
• warm
• raining
• sunny
Step3&4 Resolution graph

• Negate the sentence to be proved: ¬enjoy

1. ¬Sunny ∨ ¬warm ∨ enjoy
2. ¬raining ∨ wet
3. warm
4. raining
5. sunny
Example 2: Step 1 Conversion to FOL

• The humidity is high or the sky is cloudy
  Let P: the humidity is high, Q: the sky is cloudy
  P ∨ Q
• If the sky is cloudy, then it will rain
  Let R: it will rain
  Q ⇒ R
• If the humidity is high, then it is hot
  Let S: it is hot
  P ⇒ S
• It is not hot
  ¬S
• Goal: It will rain
  R
Step 2: Conversion to CNF

• P ∨ Q     →   P ∨ Q

• Q ⇒ R     →   ¬Q ∨ R

• P ⇒ S     →   ¬P ∨ S

• ¬S        →   ¬S

Negation of the goal (¬R): it will not rain
Step 3:Resolution Graph

• P ∨ Q

• ¬Q ∨ R

• ¬P ∨ S

• ¬S
• ¬R (negated goal)
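The refutation above can be checked mechanically with a small propositional resolution sketch. Clauses are encoded as frozensets of literal strings with a leading "~" marking negation; this encoding is an assumption chosen just for the illustration.

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (each clause is a frozenset of literals)."""
    return [frozenset((c1 - {lit}) | (c2 - {negate(lit)}))
            for lit in c1 if negate(lit) in c2]

def pl_resolution(clauses):
    """True iff the clause set is unsatisfiable, i.e. the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:               # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:              # nothing new: the clause set is satisfiable
            return False
        clauses |= new

# KB clauses for Example 2 plus the negated goal ~R ("it will not rain")
kb = [frozenset({"P", "Q"}), frozenset({"~Q", "R"}),
      frozenset({"~P", "S"}), frozenset({"~S"}), frozenset({"~R"})]
print(pl_resolution(kb))   # True, so the KB entails R: it will rain
```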
Example 3: Step 1: Conversion of facts into FOL

1. Sunil likes all kinds of food:                 ∀x food(x) ⇒ likes(Sunil, x)
2. Apple and vegetable are food:                  Food(Apple) ∧ Food(Vegetable)
3. Anything anyone eats and is not killed is food:   ∀x∀y eats(x, y) ∧ ¬killed(x) ⇒ Food(y)
4. Anil eats peanuts and is still alive:          Eats(Anil, Peanuts) ∧ alive(Anil)
5. Harry eats everything that Anil eats:          ∀x eats(Anil, x) ⇒ eats(Harry, x)
6. Sunil likes peanuts:                           Likes(Sunil, Peanuts)

Prove by resolution that Sunil likes peanuts.

Added predicates:
7. ∀x ¬killed(x) ⇒ alive(x)
8. ∀x alive(x) ⇒ ¬killed(x)
Step 2: Conversion to CNF – eliminate implication (a ⇒ b becomes ¬a ∨ b)

1. ∀x ¬food(x) ∨ likes(Sunil, x)
2. Food(Apple) ∧ Food(Vegetable)
3. ∀x∀y ¬[eats(x, y) ∧ ¬killed(x)] ∨ Food(y)
4. Eats(Anil, Peanuts) ∧ alive(Anil)
5. ∀x ¬eats(Anil, x) ∨ eats(Harry, x)
6. Likes(Sunil, Peanuts)
7. ∀x ¬[¬killed(x)] ∨ alive(x)
8. ∀x ¬alive(x) ∨ ¬killed(x)
Step 2: Move negation (¬) inwards

1. ∀x ¬food(x) ∨ likes(Sunil, x)
2. Food(Apple) ∧ Food(Vegetable)
3. ∀x∀y ¬eats(x, y) ∨ killed(x) ∨ Food(y)
4. Eats(Anil, Peanuts) ∧ alive(Anil)
5. ∀x ¬eats(Anil, x) ∨ eats(Harry, x)
6. Likes(Sunil, Peanuts)
7. ∀x killed(x) ∨ alive(x)
8. ∀x ¬alive(x) ∨ ¬killed(x)
Step 2: Rename (standardize) variables

1. ∀x ¬food(x) ∨ likes(Sunil, x)
2. Food(Apple) ∧ Food(Vegetable)
3. ∀y∀z ¬eats(y, z) ∨ killed(y) ∨ Food(z)
4. Eats(Anil, Peanuts) ∧ alive(Anil)
5. ∀w ¬eats(Anil, w) ∨ eats(Harry, w)
6. Likes(Sunil, Peanuts)
7. ∀g killed(g) ∨ alive(g)
8. ∀k ¬alive(k) ∨ ¬killed(k)
Step 2: Drop universal quantifiers

1. ¬food(x) ∨ likes(Sunil, x)
2. Food(Apple)
3. Food(Vegetable)
4. ¬eats(y, z) ∨ killed(y) ∨ Food(z)
5. Eats(Anil, Peanuts)
6. alive(Anil)
7. ¬eats(Anil, w) ∨ eats(Harry, w)
8. Likes(Sunil, Peanuts)
9. killed(g) ∨ alive(g)
10. ¬alive(k) ∨ ¬killed(k)
Step 3:Negate the statement to be proved

• Prove by resolution that: Sunil likes peanuts

• Goal: Likes(Sunil, Peanuts)
• Negated goal: ¬Likes(Sunil, Peanuts)
Step4:Resolution Graph
1. ¬food(x) ∨ likes(Sunil, x)
2. Food(Apple)
3. Food(Vegetable)
4. ¬eats(y, z) ∨ killed(y) ∨ Food(z)
5. Eats(Anil, Peanuts)
6. alive(Anil)
7. ¬eats(Anil, w) ∨ eats(Harry, w)
8. Likes(Sunil, Peanuts)
9. killed(g) ∨ alive(g)
10. ¬alive(k) ∨ ¬killed(k)
Example 4

The law says that it is a crime for an American to sell weapons to


hostile nations. The country Nono, an enemy of America, has some
missiles, and all of its missiles were sold to it by Colonel West, who
is American.
Prove that Col. West is a criminal
Convert Natural language sentence to FOL

It is a crime for an American to sell weapons to hostile nations

∀x,y,z American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) ⇒ Criminal(x)
Nono has some missiles
∃x Owns(Nono, x) ∧ Missile(x)
All of its missiles were sold to it by Colonel West
∀x Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
Missiles are weapons
∀x Missile(x) ⇒ Weapon(x)
An enemy of America counts as hostile
∀x Enemy(x, America) ⇒ Hostile(x)
West, who is American
American(West)
The country Nono, an enemy of America
Enemy(Nono, America)
Convert FOL sentence in CNF

• ¬American(x) ∨ ¬Weapon(y) ∨ ¬Sells(x, y, z) ∨ ¬Hostile(z) ∨ Criminal(x)

• ¬Missile(x) ∨ ¬Owns(Nono, x) ∨ Sells(West, x, Nono)

• ¬Enemy(x,America) ∨ Hostile(x)

• ¬Missile(x) ∨ Weapon(x)

• Owns(Nono,M1)

• American(West)

• Missile(M1)

• Enemy(Nono, America)
Resolution: brief summary
• Algorithm works using proof by contradiction

• To show KB ╞ α we show that KB ∧ ¬α is not satisfiable

• Apply resolution to KB ∧ ¬α in CNF and resolve pairs of clauses with

complementary literals

l1 ∨ ··· ∨ lk,   m1 ∨ ··· ∨ mn
(l1 ∨ ··· ∨ li−1 ∨ li+1 ∨ ··· ∨ lk ∨ m1 ∨ ··· ∨ mj−1 ∨ mj+1 ∨ ··· ∨ mn)

• where li and mj are complementary literals; keep adding new clauses until

• there are no new clauses to be added, or
• two clauses resolve to the empty clause, which means KB ╞ α
Reference(s):

T1. Stuart Russell, Peter Norvig, “Artificial Intelligence – A Modern Approach”, 3rd Edition, Pearson Education, 2014.
