
Topic for the class:

Module-V
KNOWLEDGE and REASONING
Chapter 9: Inference in First-Order Logic
Dr. K. Srinivasa Rao
Associate Professor
Department of CSE
GITAM Institute of Technology (GIT)
Visakhapatnam – 530045
Email: [email protected]
Mobile: 98486 73655
Department of CSE, GIT ECS302: AI 1
9.1 Propositional vs. First-Order Inference
Inference Rules for Quantifiers:
Suppose we have: 'All greedy kings are evil'.
∀x King(x) Ʌ Greedy(x) ⇒ Evil(x)

Then we can infer any of the following sentences:

King(John) Ʌ Greedy(John) ⇒ Evil(John)
King(Richard) Ʌ Greedy(Richard) ⇒ Evil(Richard)
King(Father(John)) Ʌ Greedy(Father(John)) ⇒ Evil(Father(John))
Universal Instantiation (UI):

This rule says that we can infer any sentence obtained by substituting a ground term (a term without variables) for
the variable.

Let SUBST (θ, α) denote the result of applying the substitution θ to the sentence α.



Continued...
Then the rule is written:

    ∀v α
    ─────────────────
    SUBST({v / g}, α)

for any variable v and ground term g.

Ex: The three sentences given earlier (in slide 2) are obtained with the substitutions:
{ x / John }, { x / Richard }, { x / Father(John) }
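Substitution is mechanical, so Universal Instantiation can be sketched in a few lines of Python (an illustration, not from the slides): terms are represented as nested tuples, variables as lowercase strings, and SUBST simply walks the term tree.

```python
def subst(theta, term):
    """Apply substitution theta (a dict mapping variables to terms) to a term."""
    if isinstance(term, tuple):                   # compound term or atom
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)                  # replace if bound, else keep

# 'All greedy kings are evil' instantiated with { x / John }:
rule = ('implies', ('and', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
print(subst({'x': 'John'}, rule))
# -> ('implies', ('and', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
```

Instantiating with { x / Richard } or { x / Father(John) } (encoded as ('Father', 'John')) works the same way.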

Existential Instantiation:
For any sentence α, variable v, and constant symbol k that does not appear elsewhere in the KB:

    Ǝv α
    ─────────────────
    SUBST({v / k}, α)
For example, from the sentence Ǝ x Crown(x) Ʌ OnHead(x, John)


we can infer the sentence Crown(C1) Ʌ OnHead(C1, John)
as long as C1 does not appear elsewhere in the KB.



Continued...
Reduction to Propositional Inference:
• It is possible to reduce first-order inference to propositional inference.
• Just as an existentially quantified sentence can be replaced by one instantiation, a universally quantified sentence can be replaced by the set of all possible instantiations.
Ex: Suppose the KB is:

∀x King(x) Ʌ Greedy(x) ⇒ Evil(x)
King(John)
Greedy(John)
Brother(Richard, John)
We apply UI to the first sentence using all possible ground term substitutions from the vocabulary of the KB,
in this case, { x / John } and { x / Richard }.
We obtain:
King(John) Ʌ Greedy(John) ⇒ Evil(John)
King(Richard) Ʌ Greedy(Richard) ⇒ Evil(Richard)

and we discard the universally quantified sentence.



9.2 Unification and Lifting
Now the KB is essentially propositional if we view the ground atomic sentences -- King(John), Greedy(John) and so on
as propositional symbols.

Unification and Lifting:


• The inference of Evil(John) from the sentences
∀x King(x) Ʌ Greedy(x) ⇒ Evil(x)
King(John)
Greedy(John)
seems completely obvious to a human being.
• How do we make it equally obvious to a computer?

We need a first-order inference rule that applies the implication under a suitable substitution.
For this to work, we find some x such that x is a king and x is greedy.
In this case, the substitution { x / John } achieves that aim.



Continued...
Suppose that instead of knowing Greedy(John), we know that everyone is greedy.
∀y Greedy(y)
Then, we would still be able to conclude Evil(John),
because we know that John is a king (given) and that John is greedy (because everyone is greedy);
i.e., the substitution is { x / John, y / John }.
This inference process can be captured as a single inference rule that we call Generalized Modus Ponens.
Generalized Modus Ponens:
For atomic sentences pi, pi′ and q, where there is a substitution θ such that
SUBST(θ, pi′) = SUBST(θ, pi) for all i:

    p1′, p2′, …, pn′, (p1 Ʌ p2 Ʌ … Ʌ pn ⇒ q)
    ──────────────────────────────────────────
    SUBST(θ, q)

There are n + 1 premises to this rule: the n atomic sentences pi′ and the one implication.



Continued...
The conclusion is the result of applying the substitution θ to the consequent q.
For our example:
p1′ is King(John)    p1 is King(x)
p2′ is Greedy(y)     p2 is Greedy(x)
θ is { x / John, y / John }    q is Evil(x)
SUBST(θ, q) is Evil(John)
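Using the same hypothetical tuple encoding sketched earlier, this worked example can be checked mechanically: θ must collapse each pi′ onto the matching pi, and the conclusion is θ applied to q.

```python
def subst(theta, term):
    # Apply a substitution (dict) to a nested-tuple term.
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

p_prime = [('King', 'John'), ('Greedy', 'y')]    # the known facts p1', p2'
p       = [('King', 'x'),    ('Greedy', 'x')]    # the rule premises p1, p2
q       = ('Evil', 'x')                          # the rule conclusion
theta   = {'x': 'John', 'y': 'John'}

# GMP applies only if theta makes each pi' identical to the matching pi:
assert all(subst(theta, a) == subst(theta, b) for a, b in zip(p_prime, p))
print(subst(theta, q))    # -> ('Evil', 'John')
```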

Generalized Modus Ponens is a lifted version of Modus Ponens.

Unification:
• It is the process of finding substitutions that make different logical expressions look identical.
• It takes two sentences and returns a unifier for them, if one exists:

UNIFY (p, q) = θ where SUBST (θ, p) = SUBST (θ, q).



Continued...
Exs:
UNIFY (Knows(John, x), Knows(John, Jane)) = { x / Jane }
UNIFY (Knows(John, x), Knows(y, Bill)) = { x / Bill, y / John}
UNIFY (Knows(John, x), Knows(y, Mother(y))) = { y / John, x / Mother(John) }
UNIFY (Knows(John, x), Knows(x, Elizabeth)) = FAIL

In the last one, the problem arises because the same variable x is used in both sentences.

If we standardize apart one of the two sentences (that is, renaming its variables to avoid clashes),

then we have: UNIFY (Knows(John, x), Knows(z, Elizabeth)).

So, the substitution is { x / Elizabeth, z / John }.



Continued...
Most General Unifier (MGU):
There could be more than one unifier in some cases.
Ex: UNIFY (Knows(John, x), Knows(y, z))
could return { y / John, x / z } or { y / John, x / John, z / John }.

• The first unifier gives Knows(John, z).

• The second unifier gives Knows(John, John);
  it could be obtained from the first by the additional substitution { z / John }.

• We say that the first unifier is more general than the second,
  because it places fewer restrictions on the values of the variables.

• For every unifiable pair of expressions, there is a single Most General Unifier (MGU).
  In this case, it is { y / John, x / z }.

An algorithm for computing MGUs follows.


Continued...
Algorithm:

function UNIFY(x, y, θ) returns a substitution to make x and y identical
    if θ = failure then return failure
    else if x = y then return θ
    else if VARIABLE?(x) then return UNIFY-VAR(x, y, θ)
    else if VARIABLE?(y) then return UNIFY-VAR(y, x, θ)
    else if COMPOUND?(x) and COMPOUND?(y) then
        return UNIFY(ARGS[x], ARGS[y], UNIFY(OP[x], OP[y], θ))
    else if LIST?(x) and LIST?(y) then
        return UNIFY(REST[x], REST[y], UNIFY(FIRST[x], FIRST[y], θ))
    else return failure

function UNIFY-VAR(var, x, θ) returns a substitution
    if {var/val} ∈ θ then return UNIFY(val, x, θ)
    else if {x/val} ∈ θ then return UNIFY(var, val, θ)
    else if OCCUR-CHECK?(var, x) then return failure
    else return add {var/x} to θ


Continued...
Explanation:
1. If x and y are identical (both variables or constants), then return NIL:
θ is unchanged (no new binding is added).
Ex: King(John), King(John) or King(x), King(x)

2. If x is a variable and y is a constant, then return the constant for the variable.

Ex: King(x), King(John) returns { x / John }

3. If x is a constant and y is a variable, then return the constant for the variable.

Ex: Evil(Richard), Evil(x) returns { x / Richard }

4. If x and y are COMPOUND expressions, call UNIFY with


first arguments of x and y,
second arguments of x and y,
……………….
nth arguments of x and y.
Continued...
Ex: Father(m, David, Bill)    (m is the father of David and Bill)
    (args[x] : m, David and Bill)
    Father(Taylor, p, Bill)   (Taylor is the father of p and Bill)
    (args[y] : Taylor, p and Bill)
θ is { m / Taylor, p / David } (Bill and Bill unify without any new binding)

5. If x is a list and y is a list, then


UNIFY First elements of x and y
and then Rest of the elements of x and y.
Ex: [m, David, Bill] [Taylor, p, Bill]
first rest first rest
θ is { m / Taylor, p / David } (the remaining elements unify without any new binding)

6. OCCUR-CHECK (a variable must not occur inside the complex term it is matched against)


Ex: F(x, x), F(G(x), G(x))
If G(x) is substituted for x, we get: F(G(x), G(x)), F(G(G(x)), G(G(x))), ... -- it is never possible to eliminate x.
Continued...
Algorithm: Unify(L1, L2) (From: AI – Elaine Rich & Kevin Knight)
1. If L1 or L2 are both variables or constants, then
(a) If L1 and L2 are identical, then return NIL.
(b) Else if L1 is a variable, then if L1 occurs in L2 then return { FAIL }, else return (L2 / L1) (return L2 for L1)
(c) Else if L2 is a variable, then if L2 occurs in L1 then return { FAIL }, else return (L1 / L2) (return L1 for L2)
(d) Else return FAIL.
2. If the initial predicate symbols in L1 and L2 are not identical, then return { FAIL }.
3. If L1 and L2 have a different number of arguments, then return { FAIL }.
4. Set SUBST to NIL. (At the end of the procedure, SUBST will contain all the substitutions used to unify L1 and L2.)
5. For i 1 to number of arguments in L1:
(a) Call Unify with the ith argument of L1 and ith argument of L2, putting result in S.
(b) If S contains FAIL, then return { FAIL }.
(c) If S is not equal to NIL, then
(i) Apply S to the remainder of both L1 and L2.
(ii) SUBST := APPEND(S, SUBST)
6. Return SUBST.
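The steps above translate almost line for line into Python. This is a sketch under assumed conventions (variables are lowercase strings, constants are capitalized strings, predicates and compound terms are tuples); it returns a binding dict instead of a substitution list, and None for FAIL.

```python
def is_variable(t):
    # Assumed convention: variables are lowercase strings ('x', 'y', ...),
    # constants are capitalized ('John'), compound terms are tuples.
    return isinstance(t, str) and t[0].islower()

def occurs(var, term):
    """OCCUR-CHECK: does var appear anywhere inside term?"""
    if term == var:
        return True
    return isinstance(term, tuple) and any(occurs(var, a) for a in term)

def subst(theta, term):
    """Apply theta to term, following chains of bindings."""
    while is_variable(term) and term in theta:
        term = theta[term]
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return term

def unify(x, y, theta=None):
    """Return an MGU as a dict, or None for FAIL."""
    theta = {} if theta is None else theta
    x, y = subst(theta, x), subst(theta, y)
    if x == y:                                   # step 1(a): identical
        return theta
    if is_variable(x):                           # step 1(b), with occur check
        return None if occurs(x, y) else {**theta, x: y}
    if is_variable(y):                           # step 1(c)
        return None if occurs(y, x) else {**theta, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):                 # step 5: unify arguments pairwise
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None                                  # steps 1(d), 2, 3: FAIL

print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane')))   # {'x': 'Jane'}
print(unify(('Knows', 'John', 'x'), ('Knows', 'x', 'Elizabeth'))) # None (clash on x)
print(unify(('F', 'x'), ('F', ('G', 'x'))))                       # None (occur check)
```

Standardizing apart the second query, unify(('Knows', 'John', 'x'), ('Knows', 'z', 'Elizabeth')) returns { x / Elizabeth, z / John } as on slide 8.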
Continued...
Explanation:
Step Algorithm returns
1.(a) Teacher(x), Teacher(x) or    NIL
      Teacher(Pradeep), Teacher(Pradeep)

1.(b) f(x, x), f(g(x), g(x)) FAIL


(if we write g(x) for x, then it will be: f(g(x), g(x)), f(g(g(x)), g(g(x))),
it will never be possible to eliminate x (OCCUR_CHECK))
f(x), f(John) { x / John }
f(x), f(g(y)) { x / g(y) }

1.(c) f(g(x)), f(x) FAIL (OCCUR_CHECK, same as above)


f(John), f(z) { z / John }

1.(d) Return FAIL.



Continued...

Step Algorithm returns


2. Man(John), Teacher(Bob) FAIL
3. StudentOf(Lalitha, Btech), StudentOf(Lalitha) FAIL
4. SUBST = NIL
5. FounderOf(BillGates, Microsoft), FounderOf(x, y) S = { x / BillGates }
S = { x / BillGates, y / Microsoft }
6. Return SUBST.

Storage and Retrieval:


(primitive functions underlying TELL and ASK)

STORE (s) -- stores a sentence s into the KB


FETCH(q) -- returns all unifiers such that the query q unifies with some sentence in the KB
Ex: Knows(John, x) -- a query, an instance of fetching
(finds all facts that unify with Knows(John, x))



9.3 Forward Chaining

(i) First-Order Definite Clauses:


• A definite clause either is atomic or is an implication whose antecedent is a conjunction of positive literals and whose consequent is a single positive literal.
Ex: ∀x King(x) Ʌ Greedy(x) ⇒ Evil(x)
King(John)
Greedy(y)
• Unlike propositional literals, first-order literals can include variables, in which case those variables are assumed to be universally quantified (usually, we omit the quantifiers).
• Many KBs can be converted into a set of definite clauses.

Example Problem:

‘The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of
America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.’

We will prove that ‘West is a Criminal’.



Continued...

Sentences in FOL:
1. It is a crime for an American to sell weapons to hostile nations.
American(x) Ʌ Weapon(y) Ʌ Sells(x, y, z) Ʌ Hostile(z) ⇒ Criminal(x)

Nono has some missiles.

Ǝx Owns(Nono, x) Ʌ Missile(x) is transformed into two definite clauses:
2. Owns(Nono, M1)
3. Missile(M1)

4. All of its missiles were sold to it by Colonel West.


Missile(x) Ʌ Owns(Nono, x) ⇒ Sells(West, x, Nono)

5. Missiles are weapons.


Missile(x)  Weapon(x)



Continued...

6. An enemy of America counts as “hostile”.


Enemy(x, America) ⇒ Hostile(x)

7. West is an American.
American(West)

8. The country Nono is an enemy of America.


Enemy(Nono, America)

Forward chaining considers atomic sentences and tries to satisfy the premises of rules.
This leads to inference of new sentences and thereby proof of a goal.

To prove ‘West is Criminal’, all of the premises of 1 have to be satisfied.



Continued...

Use 2 and 3 in 4 and infer:

9. Sells(West, M1, Nono)

Use 3 in 5 and infer:


10. Weapon(M1)

Use 8 in 6 and infer:


11. Hostile(Nono)

Now use 7, 10, 9 and 11 in 1 and infer:


12. Criminal(West)

Hence, ‘West is Criminal’ is proved.


The substitutions are: { x / West, y / M1 , z / Nono }



Continued...

(ii) A simple Forward Chaining Algorithm (FOL-FC-ASK):

function FOL-FC-ASK(KB, α) returns a substitution or false
    repeat until new is empty
        new ← { }
        for each rule in KB do
            (p1 Ʌ … Ʌ pn ⇒ q) ← STANDARDIZE-VARIABLES(rule)
            for each θ such that SUBST(θ, p1 Ʌ … Ʌ pn) = SUBST(θ, p1′ Ʌ … Ʌ pn′)
                    for some p1′, …, pn′ in KB
                q′ ← SUBST(θ, q)
                if q′ does not unify with some sentence already in KB or new then
                    add q′ to new
                    φ ← UNIFY(q′, α)
                    if φ is not fail then return φ
        add new to KB
    return false


Continued...

Explanation:
• Starting from the known facts, it triggers all the rules whose premises are satisfied, adding their conclusions to the known facts.
• The process repeats until the query is answered or no new facts are added.

A fact is not new if it is just a renaming of a known fact.


Ex: Likes(x, IceCream) and Likes(y, IceCream) are renamings of each other;
their meanings are identical (everyone likes ice cream).

FOL-FC-ASK is:
• Sound, because every inference is just an application of Generalized Modus Ponens, which is sound.

• Complete for definite-clause KBs; i.e., it answers every query whose answers are entailed by any KB of definite clauses.
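The crime example can be run end to end with a small pattern-matching forward chainer. This is a sketch, not the FOL-FC-ASK figure itself: facts are ground tuples, rule variables are lowercase strings, and each iteration fires every rule whose premises match, until no new facts appear.

```python
def subst(theta, term):
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

def match(pattern, fact, theta):
    """Extend theta so that pattern equals fact, or return None."""
    pattern = subst(theta, pattern)
    if pattern == fact:
        return theta
    if isinstance(pattern, str) and pattern[0].islower():     # unbound variable
        return {**theta, pattern: fact}
    if isinstance(pattern, tuple) and isinstance(fact, tuple) \
            and len(pattern) == len(fact):
        for p, f in zip(pattern, fact):
            theta = match(p, f, theta)
            if theta is None:
                return None
        return theta
    return None

def satisfy(premises, facts, theta):
    """Yield every theta that satisfies all premises against the fact set."""
    if not premises:
        yield theta
        return
    for fact in facts:
        t = match(premises[0], fact, theta)
        if t is not None:
            yield from satisfy(premises[1:], facts, t)

def forward_chain(facts, rules):
    facts = set(facts)
    while True:
        new = {subst(theta, conclusion)
               for premises, conclusion in rules
               for theta in satisfy(premises, facts, {})} - facts
        if not new:              # fixed point: no new facts were added
            return facts
        facts |= new

rules = [
    ([('American', 'x'), ('Weapon', 'y'), ('Sells', 'x', 'y', 'z'),
      ('Hostile', 'z')], ('Criminal', 'x')),                       # rule 1
    ([('Missile', 'x'), ('Owns', 'Nono', 'x')],
     ('Sells', 'West', 'x', 'Nono')),                              # rule 4
    ([('Missile', 'x')], ('Weapon', 'x')),                         # rule 5
    ([('Enemy', 'x', 'America')], ('Hostile', 'x')),               # rule 6
]
facts = [('Owns', 'Nono', 'M1'), ('Missile', 'M1'),
         ('American', 'West'), ('Enemy', 'Nono', 'America')]

print(('Criminal', 'West') in forward_chain(facts, rules))   # -> True
```

The first iteration adds Sells(West, M1, Nono), Weapon(M1) and Hostile(Nono); the second adds Criminal(West), mirroring the two-level proof tree.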



Continued...

The Proof Tree:

• Forward chaining -- from bottom to top in the diagram
• Bottom level -- initial facts
• Middle level -- facts inferred on the first iteration
• Top level -- facts inferred on the second iteration
9.4 Backward Chaining

Algorithm (FOL-BC-ASK):

function FOL-BC-ASK(KB, goals, θ) returns a set of substitutions
    local variables: answers, a set of substitutions, initially empty
    if goals is empty then return {θ}
    q′ ← SUBST(θ, FIRST(goals))
    for each sentence r in KB where STANDARDIZE-APART(r) = (p1 Ʌ … Ʌ pn ⇒ q)
            and θ′ ← UNIFY(q, q′) succeeds
        new_goals ← [p1, …, pn | REST(goals)]
        answers ← FOL-BC-ASK(KB, new_goals, COMPOSE(θ′, θ)) ∪ answers
    return answers


9.4 Backward Chaining

Explanation:

• The algorithm uses compositions of substitutions.

• COMPOSE(θ1, θ2) is the substitution whose effect is identical to the effect of applying each substitution in turn. That is,
SUBST(COMPOSE(θ1, θ2), p) = SUBST(θ2, SUBST(θ1, p))

• In the algorithm, the current variable bindings, which are stored in θ, are composed with the bindings resulting from unifying the goal with the clause head, giving a new set of current bindings for the recursive call.
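The defining identity of COMPOSE can be checked with a toy implementation (dicts over nested-tuple terms; an illustration, not the textbook's code): apply θ2 to the bindings of θ1, then add the θ2 bindings for variables θ1 does not mention.

```python
def subst(theta, term):
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

def compose(t1, t2):
    """A substitution equivalent to applying t1 first, then t2."""
    out = {v: subst(t2, t) for v, t in t1.items()}        # push t2 through t1
    out.update({v: t for v, t in t2.items() if v not in t1})
    return out

t1, t2 = {'x': 'y'}, {'y': 'John'}
p = ('Knows', 'x', 'y')
assert subst(compose(t1, t2), p) == subst(t2, subst(t1, p))
print(compose(t1, t2))    # -> {'x': 'John', 'y': 'John'}
```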



Continued...

Proof Tree: To prove that ‘West is Criminal’.



Continued...

• Backward chaining -- from top to bottom in the diagram
• Middle and bottom levels -- read from left to right
• To prove Criminal(West) -- prove the four conjuncts in the middle level
• To prove any middle-level goal -- go down to its next lower level
General Examples for Forward and Backward Chaining:
Consider the KB:
1. A Ʌ B Ʌ C ⇒ Z
2. M ⇒ K
3. A
4. A Ʌ K ⇒ B
5. Y ⇒ M
6. Y
7. C
Prove Z.

Forward Chaining:
A (given, 3) and Y (given, 6) are proved.
Y ⇒ M (5), so M is proved; M ⇒ K (2), so K is proved.
Combine A with K and use 4: A Ʌ K ⇒ B, so B is proved.
C (given, 7) is proved.
Now use 1: A Ʌ B Ʌ C ⇒ Z, so Z is proved.

Backward Chaining:
Z ⇐ A Ʌ B Ʌ C (1); A is given (3), leaving B Ʌ C.
B ⇐ A Ʌ K (4); A is given (3), leaving K.
K ⇐ M (2); M ⇐ Y (5); Y is given (6).
C is given (7). Hence Z is proved.
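The backward-chaining half of this example takes only a few lines of Python, since the KB is propositional (a sketch; rules map each conclusion to its alternative premise lists):

```python
rules = {               # conclusion -> alternative premise lists
    'Z': [['A', 'B', 'C']],
    'K': [['M']],
    'B': [['A', 'K']],
    'M': [['Y']],
}
facts = {'A', 'Y', 'C'}

def prove(goal):
    """Backward chaining: a goal holds if it is a fact or some rule body holds."""
    if goal in facts:
        return True
    return any(all(prove(p) for p in body) for body in rules.get(goal, []))

print(prove('Z'))   # -> True
```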



9.5 Resolution

Conjunctive Normal Form (CNF):


As in the propositional case, first-order resolution requires that sentences be in CNF -- that is, a conjunction of clauses, where each clause is a disjunction of literals.
Literals can contain variables, which are assumed to be universally quantified.

For example, ∀x American(x) Ʌ Weapon(y) Ʌ Sells(x, y, z) Ʌ Hostile(z) ⇒ Criminal(x)

becomes, in CNF,
¬American(x) V ¬Weapon(y) V ¬Sells(x, y, z) V ¬Hostile(z) V Criminal(x)

Every sentence of first-order logic can be converted into an inferentially equivalent CNF sentence.
Procedure:
Ex: Everyone who loves all animals is loved by someone:
∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [Ǝy Loves(y, x)]


Continued...

Steps:
1. Eliminate Implications:
∀x [∀y ¬Animal(y) V Loves(x, y)] ⇒ [Ǝy Loves(y, x)]
∀x ¬[∀y ¬Animal(y) V Loves(x, y)] V [Ǝy Loves(y, x)]

2. Move ¬ inwards:

We have:
¬∀x p ≡ Ǝx ¬p
¬Ǝx p ≡ ∀x ¬p
¬(p V q) ≡ ¬p Ʌ ¬q
¬(p Ʌ q) ≡ ¬p V ¬q
¬¬p ≡ p

Our sentence goes through the following transformations:
∀x [Ǝy ¬(¬Animal(y) V Loves(x, y))] V [Ǝy Loves(y, x)]
∀x [Ǝy ¬¬Animal(y) Ʌ ¬Loves(x, y)] V [Ǝy Loves(y, x)]
∀x [Ǝy Animal(y) Ʌ ¬Loves(x, y)] V [Ǝy Loves(y, x)]


Continued...

3. Standardize variables:
Sentences like (∀x P(x)) V (Ǝx Q(x)) can be written as (∀x P(x)) V (Ǝy Q(y)).
Thus we have:
∀x [Ǝy Animal(y) Ʌ ¬Loves(x, y)] V [Ǝz Loves(z, x)]

4. Skolemize:
Skolemization is the process of removing existential quantifiers by elimination.
Ǝx P(x) can be written as P(A), where A is a new constant.
So, our sentence becomes:
∀x [Animal(A) Ʌ ¬Loves(x, A)] V Loves(B, x)
which has the wrong meaning entirely: it says that
'everyone either fails to love a particular animal A or is loved by some particular entity B'.

In fact, our original sentence allows each person to fail to love a different animal or to be loved by a different person.



Continued...

Thus, we want the Skolem entities to depend on x. A better way is:

∀x [Animal(F(x)) Ʌ ¬Loves(x, F(x))] V Loves(G(x), x)
(F and G are Skolem Functions)
General Rule: The arguments of the Skolem function are all the universally quantified variables in whose scope the existential quantifier appears.

5. Drop Universal quantifiers:
At this point, all remaining variables must be universally quantified, so we can drop the universal quantifiers:
[Animal(F(x)) Ʌ ¬Loves(x, F(x))] V Loves(G(x), x)

6. Distribute Ʌ over V:
[Animal(F(x)) V Loves(G(x), x)] Ʌ [¬Loves(x, F(x)) V Loves(G(x), x)]

The sentence is now in CNF and consists of two clauses.



Continued...

The Resolution Inference Rule:

We have:

    l1 V … V lk,    m1 V … V mn
    ───────────────────────────────────────────────────────────────────────
    SUBST(θ, l1 V … V li−1 V li+1 V … V lk V m1 V … V mj−1 V mj+1 V … V mn)

where UNIFY(li, ¬mj) = θ.

Ex: Consider the two clauses
[Animal(F(x)) V Loves(G(x), x)] and [¬Loves(u, v) V ¬Kills(u, v)]

Now the unifier is: θ = { u / G(x), v / x }

and the resolvent is: [Animal(F(x)) V ¬Kills(G(x), x)]



Continued...

The sentences (in Slides 17 & 18 ) :

1. American(x) Ʌ Weapon(y) Ʌ Sells(x, y, z) Ʌ Hostile(z) ⇒ Criminal(x)


2. Owns(Nono, M1 )
3. Missile(M1)
4. Missile(x) Ʌ Owns(Nono, x) ⇒ Sells(West, x, Nono)
5. Missile(x) ⇒ Weapon(x)
6. Enemy(x, America) ⇒ Hostile(x)
7. American(West)
8. Enemy(Nono, America)



Continued...

The sentences in Clause Form:

1. ¬American(x) V ¬Weapon(y) V ¬Sells(x, y, z) V ¬Hostile(z) V Criminal(x)

2. Owns(Nono, M1)
3. Missile(M1)

4. ¬Missile(x) V ¬Owns(Nono, x) V Sells(West, x, Nono)

5. ¬Missile(x) V Weapon(x)
6. ¬Enemy(x, America) V Hostile(x)

7. American(West)
8. Enemy(Nono, America)
9. ¬Criminal(West) (Goal is negated [refutation], converted to a clause and added)



Continued...

Proof: (each step resolves the current clause with the numbered clause shown)

¬Criminal(West) with 1, { x / West }:
    ¬American(West) V ¬Weapon(y) V ¬Sells(West, y, z) V ¬Hostile(z)
with 7 (American(West)):
    ¬Weapon(y) V ¬Sells(West, y, z) V ¬Hostile(z)
with 5, { x / y }:
    ¬Missile(y) V ¬Sells(West, y, z) V ¬Hostile(z)
with 3, { y / M1 }:
    ¬Sells(West, M1, z) V ¬Hostile(z)



Continued...

Proof Continued:

¬Sells(West, M1, z) V ¬Hostile(z) with 4, { x / M1, z / Nono }:
    ¬Missile(M1) V ¬Owns(Nono, M1) V ¬Hostile(Nono)
with 3 (Missile(M1)):
    ¬Owns(Nono, M1) V ¬Hostile(Nono)
with 2 (Owns(Nono, M1)):
    ¬Hostile(Nono)
with 6, { x / Nono }:
    ¬Enemy(Nono, America)
with 8 (Enemy(Nono, America)):
    the empty clause -- Contradiction

Hence, 'West is Criminal' is proved.
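Because every substitution in this refutation is ground, the whole proof can be replayed with a purely propositional resolution loop over the ground instances of clauses 1-9 (a sketch; the literal strings are an assumed encoding, with a leading '-' marking negation):

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('-') else '-' + lit

def resolve(c1, c2):
    """All resolvents of two propositional clauses (frozensets of literals)."""
    return [(c1 - {l}) | (c2 - {negate(l)}) for l in c1 if negate(l) in c2]

def refute(clauses):
    """Saturate under resolution; True iff the empty clause is derived."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:                     # empty clause: contradiction
                    return True
                new.add(r)
        if new <= clauses:                    # saturated, no contradiction
            return False
        clauses |= new

kb = [
    {'-American(West)', '-Weapon(M1)', '-Sells(West,M1,Nono)',
     '-Hostile(Nono)', 'Criminal(West)'},                       # 1, grounded
    {'Owns(Nono,M1)'},                                          # 2
    {'Missile(M1)'},                                            # 3
    {'-Missile(M1)', '-Owns(Nono,M1)', 'Sells(West,M1,Nono)'},  # 4, grounded
    {'-Missile(M1)', 'Weapon(M1)'},                             # 5, grounded
    {'-Enemy(Nono,America)', 'Hostile(Nono)'},                  # 6, grounded
    {'American(West)'},                                         # 7
    {'Enemy(Nono,America)'},                                    # 8
    {'-Criminal(West)'},                                        # 9: negated goal
]
print(refute(kb))   # -> True
```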
Continued...

Problem 1: From Exercise of E. Rich and Kevin Knight


Consider the following sentences:
• John likes all kinds of food.

• Apples are food.

• Chicken is food.

• Anything anyone eats and isn't killed by is food.

• Bill eats peanuts and is still alive.

• Sue eats everything Bill eats.



Continued...

(a) Translate these sentences into formulas in FOL (Predicate logic):

1. ∀x1 Food(x1) ⇒ Likes(John, x1)

2. Food(Apples)

3. Food(Chicken)

4. ∀x2, y Eats(x2, y) Ʌ Alive(x2) ⇒ Food(y)

5. Eats(Bill, Peanuts) Ʌ Alive(Bill)

6. ∀x3 Eats(Bill, x3) ⇒ Eats(Sue, x3)



Continued...

(b) Prove that 'John likes peanuts' using backward chaining:

Likes(John, Peanuts)
    reduces to Food(Peanuts)                    (by 1, { x1 / Peanuts })
    reduces to Eats(x2, Peanuts) Ʌ Alive(x2)    (by 4, { y / Peanuts })
    reduces to Nil                              (by 5, { x2 / Bill }: both conjuncts are given)



Continued...

(c) Convert the formulas of part (a) into Clause form


1. ¬Food(x1) V Likes(John, x1)

2. Food(Apples)

3. Food(Chicken)

4. ¬Eats(x2, y) V ¬Alive(x2) V Food(y)

5. Eats(Bill, Peanuts)

6. Alive(Bill)

7. ¬Eats(Bill, x3) V Eats(Sue, x3)



Continued...

(d) Prove that 'John likes peanuts' using resolution:

¬Likes(John, Peanuts) with 1, { x1 / Peanuts }:
    ¬Food(Peanuts)
with 4, { y / Peanuts }:
    ¬Eats(x2, Peanuts) V ¬Alive(x2)
with 5, { x2 / Bill }:
    ¬Alive(Bill)
with 6 (Alive(Bill)):
    the empty clause -- Contradiction

So, 'John likes peanuts' is proved.



Continued...

Problem 2:
Given the following information for a database:
1. If x is on top of y, y supports x.
2. If x is above y and they are touching each other, x is on top of y.
3. A cup is above a book.
4. A cup is touching a book.
Translate these statements into clausal form. Show that ‘Supports(Book, Cup)’ is true using resolution.
Problem 3:
Represent the following facts in first-order logic and convert them into clause form.
Use resolution to find that ‘Ravi is the spy’.
1. One of Raman, Ravi, Raghu and Ramesh is the spy.
2. Raman is not the spy.
3. Spies wear light coloured dresses and do not attract the attention of others.
4. Raghu was wearing a dark coloured suit.
5. Ramesh was the centre of attention on that evening.
