
Chapter 9

Inference in First-Order Logic
CS361 Artificial Intelligence
Dr. Khaled Wassif
Spring 2024

(These are the instructor’s notes; students must read the textbook for the complete material.)
Chapter Outline
◼ Inference Rules for Quantifiers
◼ Reducing First-Order Inference to Propositional
Inference
◼ Unification Inference Rules
◼ Forward Chaining and Its Applications
◼ Backward Chaining and Logic Programming
Systems
◼ Resolution-based Theorem Proving
AI: A Modern Approach © 2010 S. Russell and P. Norvig — slides by Dr. Khaled Wassif
Necessary Algorithms
◼ We already know enough to implement TELL
(although maybe not efficiently)
◼ But how do we implement ASK?
◼ Recall 3 cases
– Direct matching
– Finding a proof (inference)
– Finding a set of bindings (unification)

Inference with Quantifiers
◼ Universal Instantiation:
– Given ∀x Person(x) ⇒ Likes(x, McDonalds)
– Infer Person(John) ⇒ Likes(John, McDonalds)

◼ Existential Instantiation:
– Given ∃x Likes(x, McDonalds)
– Infer Likes(S1, McDonalds)
– S1 is a “Skolem constant” that does not appear anywhere else in the KB and refers to (one of) the individuals that like McDonalds.
Universal Instantiation (UI)
◼ Every instantiation of a universally quantified sentence can be inferred:
– Infer any sentence obtained by substituting a ground term g for the variable v.
» A ground term is a term without variables.
◼ Example:
– ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields:
» King(John) ∧ Greedy(John) ⇒ Evil(John)
» King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
» King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))
» …
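Instantiation is just substitution applied to a sentence. A minimal Python sketch (the nested-tuple representation and the names `subst`/`is_var` are my own assumptions, not from the slides):

```python
# Atoms and terms as nested tuples: ("King", "x"), ("Father", "John"), ...
# Convention (assumed here): variables are lowercase strings, constants capitalized.

def is_var(t):
    return isinstance(t, str) and t[0].islower()

def subst(theta, sentence):
    """Apply substitution theta (a dict var -> term) to a term or sentence."""
    if is_var(sentence):
        return theta.get(sentence, sentence)
    if isinstance(sentence, tuple):
        return tuple(subst(theta, part) for part in sentence)
    return sentence  # a constant or predicate symbol

rule = ("Implies", ("And", ("King", "x"), ("Greedy", "x")), ("Evil", "x"))
print(subst({"x": "John"}, rule))
# ('Implies', ('And', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))
```

Applying `subst({"x": "John"}, …)` to the rule is exactly the first UI instantiation above.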
Existential Instantiation (EI)
◼ An existentially quantified sentence can be instantiated:
– Infer the sentence obtained by substituting, for the variable v, a constant k that does not appear elsewhere in the KB.
◼ Example:
– ∃x Crown(x) ∧ OnHead(x, John) yields:
» Crown(C1) ∧ OnHead(C1, John)
» provided C1 is a new Skolem constant

Inference with Quantifiers
◼ UI can be applied several times to add many new consequence sentences.
– The new KB is logically equivalent to the old.
◼ EI can be applied once to replace the existential sentence.
– The new KB is not logically equivalent to the old.
– But the new KB is satisfiable iff the old KB was satisfiable (inferentially equivalent).

Reduction to Propositional Inference
◼ Once non-quantified sentences can be inferred from quantified ones, first-order inference can be reduced to propositional inference.
– Use the instantiation rules to create a relevant KB containing propositional sentences.
– Then apply propositional reasoning to the created KB.
◼ Problems:
– When the knowledge base includes a function symbol, the set of possible ground-term substitutions is infinite!
– Many irrelevant, unproven propositional sentences may be generated along the way!
Reduction to Propositional Inference
◼ Suppose the KB contains the following sentences:
∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
Greedy(John)
Brother(Richard, John)
◼ Instantiating the universal sentence in all possible ways…
King(John) ∧ Greedy(John) ⇒ Evil(John)
King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
King(John)
Greedy(John)
Brother(Richard, John)
◼ The new KB is propositionalized:
– Propositional symbols are King(John), Greedy(John), Evil(John), King(Richard), etc.
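The instantiation step can be mechanized. A toy sketch (my own assumptions: one variable per rule and no function symbols, so the set of instantiations is finite; the rule strings are just the slide’s sentences):

```python
# Plug every known constant into each universally quantified rule template.
constants = ["John", "Richard"]
rules = ["King({x}) ∧ Greedy({x}) ⇒ Evil({x})"]

def propositionalize(rules, constants):
    ground = []
    for rule in rules:
        for c in constants:          # one substitution per constant
            ground.append(rule.format(x=c))
    return ground

for sentence in propositionalize(rules, constants):
    print(sentence)
# King(John) ∧ Greedy(John) ⇒ Evil(John)
# King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
```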
Problems with Propositionalization
◼ Propositionalization tends to generate lots of irrelevant sentences.
◼ Example:
∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
King(John)
∀y Greedy(y)
Brother(Richard, John)
– It is obvious that Evil(John) is true, but the fact Greedy(Richard) is irrelevant.
◼ With p k-ary predicates and n constants, there are p·n^k instantiations!
Unification
◼ Unification is the process of finding substitutions that make different logical expressions look identical.
– A key component of all first-order inference algorithms.
– The UNIFY algorithm takes two sentences and returns a unifier for them if one exists:
UNIFY(p, q) = θ where SUBST(θ, p) = SUBST(θ, q)
– Example: answering the query Knows(John, x):
p                q                    θ
Knows(John, x)   Knows(John, Jane)    {x/Jane}
Knows(John, x)   Knows(y, Bill)       {x/Bill, y/John}
Knows(John, x)   Knows(y, Mother(y))  {y/John, x/Mother(John)}
Knows(John, x)   Knows(x, Elizabeth)  fail
Knows(John, x)   Knows(xx, Elizabeth) {xx/John, x/Elizabeth}
Unification
◼ Why UNIFY(Knows(John, x), Knows(x, Elizabeth)) = fail:
– Because x cannot take on two values at the same time.
– But “everyone knows Elizabeth”, so the query should not fail!
– We must standardize apart one of the two sentences to eliminate reuse of the same variable name.
◼ Now the inference goes through immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y):
– θ = {x/John, y/John}
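The behavior in the unification table can be reproduced with a small unifier. A Python sketch in the style of AIMA’s UNIFY, including the occurs check (the flat-tuple representation and lowercase-variable convention are my own assumptions):

```python
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def occurs(var, t):
    return t == var or (isinstance(t, tuple) and any(occurs(var, a) for a in t))

def unify(x, y, theta):
    """Return a substitution unifying x and y (extending theta), or None."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, t, theta):
    if var in theta:
        return unify(theta[var], t, theta)
    if is_var(t) and t in theta:
        return unify(var, theta[t], theta)
    if occurs(var, t):
        return None                  # occurs check
    return {**theta, var: t}

print(unify(("Knows", "John", "x"), ("Knows", "John", "Jane"), {}))    # {'x': 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "x", "Elizabeth"), {}))  # None (fail)
```

Note that for the Mother(y) row this unifier returns {y: John, x: Mother(y)}; applying SUBST, which chases bindings, yields the table’s {y/John, x/Mother(John)}.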
Unification
(The slide’s figure is not included in this export.)
Generalized Modus Ponens
◼ A generalized version of the Modus Ponens inference rule for first-order logic that does not require instantiation:
– For atomic sentences pi, pi′, and q, where there is a substitution θ such that SUBST(θ, pi′) = SUBST(θ, pi) for all i:

p1′, p2′, …, pn′,  (p1 ∧ p2 ∧ … ∧ pn ⇒ q)
⊢ SUBST(θ, q)

» This rule has n+1 premises: the n atomic sentences pi′ and one implication.
» The conclusion is the result of applying the substitution θ to the consequent q.
– Example:
» King(John), ∀y Greedy(y), ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
» Conclude Evil(John) by the substitution {x/John, y/John}.
Generalized Modus Ponens
◼ In CS terms:
– Given a rule containing variables:
– if there is a consistent set of bindings for all of the variables on the left side of the rule (before the arrow),
– then you can derive the result of substituting those same bindings into the right side of the rule.
◼ Example:
– Given:
» ∀x,y,z Parent(x, y) ∧ Parent(y, z) ⇒ GrandParent(x, z)
» Parent(James, John), Parent(James, Richard), Parent(Harry, James)
– We can derive:
» GrandParent(Harry, John) with bindings {x/Harry, y/James, z/John}
» GrandParent(Harry, Richard) with bindings {x/Harry, y/James, z/Richard}
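The derivation can be checked mechanically by searching for a consistent binding of y across the two premises (a toy sketch; the relation tuples mirror the slide’s facts):

```python
facts = {("Parent", "James", "John"),
         ("Parent", "James", "Richard"),
         ("Parent", "Harry", "James")}

# Rule: Parent(x, y) ∧ Parent(y, z) ⇒ GrandParent(x, z)
derived = set()
for (_, x, y1) in facts:
    for (_, y2, z) in facts:
        if y1 == y2:                       # consistent binding for y
            derived.add(("GrandParent", x, z))

print(sorted(derived))
# [('GrandParent', 'Harry', 'John'), ('GrandParent', 'Harry', 'Richard')]
```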
First-order Definite Clauses
◼ Closely resemble propositional definite clauses.
◼ A first-order definite clause is either an atomic sentence or an implication whose premise is a conjunction of positive literals and whose conclusion is a single positive literal.
– The following are first-order definite clauses:
King(x) ∧ Greedy(x) ⇒ Evil(x)    King(John)    Greedy(y)
◼ Unlike propositional literals, first-order literals can
include variables, in which case those variables are
assumed to be universally quantified.
– We omit universal quantifiers when writing definite clauses.
◼ Not every knowledge base can be converted into a set
of definite clauses because of the single-positive-literal
restriction, but many can.
Example Knowledge Base
The law says that it is a crime for an
American to sell weapons to hostile nations.
The country Nono, an enemy of America,
has some missiles, and all of its missiles
were sold to it by Colonel West, who is
American.

◼ Prove that:
Colonel West is a criminal
Example Knowledge Base
… it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Hostile(z) ∧ Sells(x, y, z) ⇒ Criminal(x)
… The country Nono, an enemy of America: Enemy(Nono, America)
… An enemy of America counts as “hostile”:
Enemy(x, America) ⇒ Hostile(x)
… Nono … has some missiles: ∃x Missile(x) ∧ Owns(Nono, x)
» By Existential Instantiation: Missile(M1), Owns(Nono, M1)

… Missiles are weapons: Missile(x) ⇒ Weapon(x)


… all of its missiles were sold to it by Colonel West:
Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
… West, who is American: American(West)

Answering Questions in FOL
◼ More or less anything can be stated in first-order logic.
◼ It is important to have algorithms that can answer any
answerable question stated in first-order logic.
◼ Three major families of first-order inference algorithms:
– Forward chaining and its applications to deductive databases
and production systems
– Backward chaining and logic programming systems
– Resolution-based theorem-proving systems

Forward Chaining
◼ Data-driven
◼ Main idea:
– Start with the atomic sentences (facts) in the knowledge base.
– Apply Modus Ponens in the forward direction, triggering all rules whose premises are satisfied.
– Add the conclusions of the satisfied rules to the known facts.
– Repeat until the query is answered (assuming just one answer is required) or no new facts are added.
◼ Applies efficiently to first-order definite clauses.
◼ Especially useful for systems that make inferences in response to newly arrived information.
◼ Such reasoning can be more efficient than resolution theorem proving.
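The loop above can be sketched concretely. This toy forward chainer (my own representation: facts as tuples, variables as lowercase strings, no function symbols or indexing) derives Criminal(West) from the crime KB:

```python
def is_var(t):
    return t[0].islower()            # variables are lowercase strings

def match(pattern, fact, theta):
    """Extend theta so that pattern matches fact, or return None."""
    if len(pattern) != len(fact):
        return None
    theta = dict(theta)
    for p, f in zip(pattern, fact):
        if is_var(p):
            if theta.get(p, f) != f:
                return None
            theta[p] = f
        elif p != f:
            return None
    return theta

def forward_chain(facts, rules):
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            thetas = [{}]
            for prem in premises:    # all ways to match every premise
                thetas = [t2 for t in thetas for f in facts
                          if (t2 := match(prem, f, t)) is not None]
            for theta in thetas:
                concl = tuple(theta.get(t, t) for t in conclusion)
                if concl not in facts:
                    new.add(concl)
        if not new:
            return facts             # fixed point reached
        facts |= new

facts = {("American", "West"), ("Missile", "M1"),
         ("Owns", "Nono", "M1"), ("Enemy", "Nono", "America")}
rules = [
    ([("Missile", "m")], ("Weapon", "m")),
    ([("Missile", "m"), ("Owns", "Nono", "m")], ("Sells", "West", "m", "Nono")),
    ([("Enemy", "e", "America")], ("Hostile", "e")),
    ([("American", "a"), ("Weapon", "w"), ("Sells", "a", "w", "n"),
      ("Hostile", "n")], ("Criminal", "a")),
]
print(("Criminal", "West") in forward_chain(facts, rules))  # True
```

The first iteration adds Weapon(M1), Sells(West, M1, Nono), and Hostile(Nono); the second adds Criminal(West); the third adds nothing, so the loop stops.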
Forward Chaining
(The slide’s figure, the forward-chaining algorithm, is not included in this export.)
FC: Example Knowledge Base
(The step-by-step forward-chaining proof-tree figures are not included in this export. The rules applied are:)
Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
Missile(x) ⇒ Weapon(x)
Enemy(x, America) ⇒ Hostile(x)
American(x) ∧ Weapon(y) ∧ Hostile(z) ∧ Sells(x, y, z) ⇒ Criminal(x)
Forward Chaining
◼ Properties:
– Sound, because every inference is an application of Generalized Modus Ponens.
– Complete for first-order definite clauses.
– Terminates for Datalog KBs in a finite number of iterations.
» Datalog = first-order definite clauses without function symbols.
– May not terminate in general if the desired fact α is not entailed.
» This is unavoidable: entailment with definite clauses is semidecidable.
◼ There are three possible sources of inefficiency:
– Pattern matching can be very expensive.
» Finding all possible matches between rule premises and suitable sets of facts in the knowledge base.
– Every rule is rechecked on every iteration to see whether its premises are satisfied, even if very few additions were made in that iteration.
– Many facts irrelevant to the goal may be generated.
Efficient Forward Chaining
◼ Expensive matching of rules against known facts
– An indexed knowledge base allows O(1) retrieval of facts.
» e.g., the query Missile(x) retrieves Missile(M1)
– Appropriate conjunct ordering
» Find an ordering in which to solve the conjuncts of the rule premise so that the total cost is minimized.
» Starting with the most constrained variables (fewest possible values) is a good heuristic.
» e.g., Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
◼ Redundant rule matching
– A new fact should depend on at least one newly generated fact.
– Incremental forward chaining: no need to check a rule on iteration k unless one of its premises was added on iteration k−1.
» Only match rules whose premise contains a newly added positive literal.
Efficient Forward Chaining
◼ Irrelevant facts
– One way to avoid them is to use backward chaining.
– Another is to restrict forward chaining to a selected subset of rules.
– A third approach emerged in the field of deductive databases:
» Large-scale databases, like relational databases, but using forward chaining as the standard inference tool rather than SQL queries.
» Rewrite the rule set, using goal information, so that only relevant variable bindings (the magic set) are considered during inference.
» e.g., if the goal is Criminal(West):
◼ The rule that concludes Criminal(x) is rewritten to include an extra conjunct that constrains the value of x:
Magic(x) ∧ American(x) ∧ Weapon(y) ∧ Sells(x, y, z) ∧ Hostile(z) ⇒ Criminal(x)
◼ The fact Magic(West) is also added to the KB.
» Basic idea: perform a sort of “generic” backward inference from the goal to work out which variable bindings need to be constrained.
Backward Chaining
◼ Goal-driven
◼ Main idea:
– Consider the item to be proved as a goal.
– Find a rule whose head unifies with the goal, recording the bindings.
– Apply the bindings to the body of that rule and try to prove each conjunct of the body as a subgoal in turn.
– If you prove all the subgoals, extending the binding set as you go, you have proved the goal.
◼ Properties:
– Depth-first recursive proof search: space is linear in the size of the proof.
– Incomplete due to infinite loops.
– Inefficient due to repeated subgoals.
◼ Widely used in logic programming (Prolog).
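A Prolog-flavored backward chainer for the crime KB can be sketched as follows (my own representation: flat atoms as tuples, lowercase variables, facts as rules with empty premises; standardizing apart is skipped, so each rule below uses its own distinct variable names):

```python
def is_var(t):
    return t[0].islower()

def deref(t, theta):
    """Chase variable bindings until a non-variable or unbound variable."""
    while is_var(t) and t in theta:
        t = theta[t]
    return t

def unify(a, b, theta):
    """Unify two flat atoms; return an extended substitution or None."""
    if len(a) != len(b):
        return None
    theta = dict(theta)
    for s, t in zip(a, b):
        s, t = deref(s, theta), deref(t, theta)
        if s == t:
            continue
        if is_var(s):
            theta[s] = t
        elif is_var(t):
            theta[t] = s
        else:
            return None
    return theta

def backward_chain(goals, theta, rules):
    """Yield every substitution that proves all goals (depth-first)."""
    if not goals:
        yield theta
        return
    goal = tuple(deref(t, theta) for t in goals[0])
    for premises, conclusion in rules:
        t2 = unify(goal, conclusion, theta)
        if t2 is not None:
            yield from backward_chain(list(premises) + goals[1:], t2, rules)

rules = [
    ([], ("American", "West")),
    ([], ("Missile", "M1")),
    ([], ("Owns", "Nono", "M1")),
    ([], ("Enemy", "Nono", "America")),
    ([("Missile", "m")], ("Weapon", "m")),
    ([("Missile", "s"), ("Owns", "Nono", "s")], ("Sells", "West", "s", "Nono")),
    ([("Enemy", "h", "America")], ("Hostile", "h")),
    ([("American", "a"), ("Weapon", "w"), ("Sells", "a", "w", "n"),
      ("Hostile", "n")], ("Criminal", "a")),
]

for theta in backward_chain([("Criminal", "c")], {}, rules):
    print("c =", deref("c", theta))  # c = West
```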
Backward Chaining
(The slide’s figure, the backward-chaining algorithm, is not included in this export.)
BC: Example Knowledge Base
(The step-by-step backward-chaining proof-tree figures are not included in this export. The clauses used are:)
American(x) ∧ Weapon(y) ∧ Hostile(z) ∧ Sells(x, y, z) ⇒ Criminal(x)
Missile(x) ⇒ Weapon(x)
Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
Enemy(x, America) ⇒ Hostile(x)
Logic Programming
◼ Declarative: computation as inference on logical KBs.
◼ Logic Programming                   ◼ Ordinary Programming
– Identify problem                    – Identify problem
– Assemble information                – Assemble information
– Tea break                           – Figure out solution
– Encode information in KB            – Program solution
– Encode problem instance as facts    – Encode problem instance as data
– Ask queries                         – Apply program to data
– Find false facts                    – Debug procedural errors
Logic Programming: Prolog
◼ Basis: backward chaining with Horn clauses + lots of bells
and whistles
– Widely used in Europe and Japan
– Basis of 5th Generation Languages and Projects
◼ Program = set of clauses = head :- literal1, literal2, … , literaln
criminal(X) :- american(X), weapon(Y), sells(X, Y, Z), hostile(Z).
◼ Efficient unification by open coding
◼ Efficient retrieval of matching clauses by direct linking
◼ Depth-first, left-to-right backward chaining
◼ Built-in predicates for arithmetic etc., e.g., X is Y * Z + 3
◼ Closed-world assumption (“negation as failure”)
e.g., Given alive(X) :- not dead(X).
alive(joe) succeeds if dead(joe) fails
Logic Programming: Prolog
◼ Example: Appending two lists to produce a third one.
append([], Y, Y).
append([X|L], Y, [X|Z]) :- append( L, Y, Z).
– query: append( [1, 2], [3, 4, 5], NL)?.
– answer:
» NL = [1, 2, 3, 4, 5]
– query: append( [1, 2], L, [1, 2, 3, 4, 5])?.
– answer:
» L = [3, 4, 5]
– query: append( A, B, [1,2])?.
– answers:
» A = [] B = [1,2];
» A = [1] B = [2];
» A = [1,2] B = []
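The last query runs append “backwards”, enumerating every way to split the list. In Python terms (a loose analogy I am adding, not a translation of Prolog’s search):

```python
def splits(lst):
    """Enumerate pairs (A, B) with A + B == lst, mirroring append(A, B, lst)."""
    for i in range(len(lst) + 1):
        yield lst[:i], lst[i:]

print(list(splits([1, 2])))
# [([], [1, 2]), ([1], [2]), ([1, 2], [])]
```

Prolog gets this behavior for free from unification and backtracking; in Python the enumeration has to be written explicitly.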
Inference is Expensive!
◼ You start with:
– A large collection of facts (predicates) in the knowledge base.
– A large collection of possible transformations (rules).
◼ Some of these rules apply to:
– A single fact to yield a new fact.
– A pair of facts to yield a new fact.
◼ So at every step you must:
– Choose some rule to apply
– Choose one or two facts to which you may apply the rule
– If there are n facts in the knowledge base
» There are n potential ways to apply a single-operand rule
» There are n * (n - 1) potential ways to apply a two-operand rule
– Add the new fact to the ever-expanding knowledge base.
◼ The search space is huge!
The Magic of Resolution
◼ Here’s how resolution works:
– Transform each of your facts into a particular form, called
a clause.
– Apply a single rule, the resolution principle, to a pair of
clauses.
» Clauses are closed with respect to resolution – that is, when you resolve two clauses, you get a new clause.
» Add the new clause to the knowledge base.
◼ So the number of facts you have grows linearly:
– You still have to choose a pair of facts to resolve.
– You never have to choose a rule, because there’s only one.
Propositional Resolution: Review
◼ Resolution allows a complete inference mechanism (search-based) using only one rule of inference.
◼ Resolution rule:
– Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn and ¬P1 ∨ Q1 ∨ … ∨ Qm
– Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm
The complementary literals P1 and ¬P1 “cancel out”.
◼ To prove a proposition S by resolution:
– Start with ¬S.
– Resolve it with a clause from the knowledge base (one that contains S).
– Repeat until all literals have been eliminated, i.e. the empty clause is derived.
– If this can be done, a contradiction has been derived and the original proposition S must be true.
Propositional Resolution Example
◼ Rules:
– Cold ∧ Precipitation ⇒ Snow
¬Cold ∨ ¬Precipitation ∨ Snow
– January ⇒ Cold
¬January ∨ Cold
– Clouds ⇒ Precipitation
¬Clouds ∨ Precipitation
◼ Facts:
– January, Clouds
◼ Prove:
– Snow
Propositional Resolution Example
¬Snow                        ¬Cold ∨ ¬Precipitation ∨ Snow

¬Cold ∨ ¬Precipitation       ¬January ∨ Cold

¬January ∨ ¬Precipitation    ¬Clouds ∨ Precipitation

¬January ∨ ¬Clouds           January

¬Clouds                      Clouds

□ (the empty clause: a contradiction, so Snow must be true)
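This refutation can be automated with a naive saturation loop (a toy sketch; clauses are frozensets of string literals with "~" for negation, and all names here are my own):

```python
from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (clauses are frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def refute(clauses):
    """True iff the clause set is unsatisfiable (empty clause derivable)."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:            # derived the empty clause
                    return True
                new.add(frozenset(r))
        if new <= clauses:
            return False             # saturated with no contradiction
        clauses |= new

kb = [{"~Cold", "~Precipitation", "Snow"},
      {"~January", "Cold"},
      {"~Clouds", "Precipitation"},
      {"January"}, {"Clouds"}]
print(refute(kb + [{"~Snow"}]))  # True: Snow is entailed
```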

FOL Resolution Theorem Proving
◼ Convert everything in the FOL knowledge base to
Conjunctive Normal Form (CNF).
– As in the propositional case, first-order resolution requires
that sentences be in CNF.
◼ Resolve, with unification of variables.
– Save bindings as you go!
◼ If resolution is successful, proof succeeds.
◼ If there was a variable in the item to prove, return
variable’s value from the unification bindings.

Converting FOL to CNF
1. Eliminate implications:
– ∀x P(x) ⇒ Q(x) is equivalent to ∀x ¬P(x) ∨ Q(x)
2. Move ¬ “inwards”:
– ¬∀x P(x) is equivalent to ∃x ¬P(x)
– ¬∃x P(x) is equivalent to ∀x ¬P(x)
3. Standardize variables:
– ∀x P(x) ∨ ∃x Q(x) becomes ∀x P(x) ∨ ∃y Q(y)
4. Skolemize:
– ∃x P(x) becomes P(A), or P(F(x)) using Skolem functions
5. Drop universal quantifiers:
– Since all remaining quantifiers are ∀, we don’t need them.
6. Apply the distribution law (∨ over ∧).
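Steps 1-2 can be sketched for the quantifier-free core: eliminate ⇒ and push ¬ inwards with De Morgan’s laws (a minimal sketch; the tuple AST and the name `nnf` are my own assumptions):

```python
# Formulas: ("=>", a, b), ("and", a, b), ("or", a, b), ("not", a), or a string atom.

def nnf(f):
    """Eliminate => and push negations down to the atoms."""
    if isinstance(f, str):
        return f
    op = f[0]
    if op == "=>":                       # step 1: a => b  becomes  ~a | b
        return nnf(("or", ("not", f[1]), f[2]))
    if op == "not":
        g = f[1]
        if isinstance(g, str):
            return f                     # negated atom: already in NNF
        if g[0] == "not":                # double negation
            return nnf(g[1])
        if g[0] == "and":                # De Morgan
            return ("or", nnf(("not", g[1])), nnf(("not", g[2])))
        if g[0] == "or":
            return ("and", nnf(("not", g[1])), nnf(("not", g[2])))
        if g[0] == "=>":                 # eliminate => first, then negate
            return nnf(("not", nnf(g)))
    return (op, nnf(f[1]), nnf(f[2]))

print(nnf(("not", ("=>", "P", "Q"))))
# ('and', 'P', ('not', 'Q'))
```

A full converter would also rename quantified variables and Skolemize (steps 3-5), which needs an environment tracking the enclosing universal variables.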
Converting FOL to CNF
◼ “Everyone who loves all animals is loved by someone”
∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]
1. Eliminate implications
– ∀x [∀y ¬Animal(y) ∨ Loves(x, y)] ⇒ [∃y Loves(y, x)]
– ∀x ¬[∀y ¬Animal(y) ∨ Loves(x, y)] ∨ [∃y Loves(y, x)]
2. Move ¬ “inwards”
– ∀x [∃y ¬(¬Animal(y) ∨ Loves(x, y))] ∨ [∃y Loves(y, x)]
– ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)]
3. Standardize variables
– ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃z Loves(z, x)]
Converting FOL to CNF
◼ “Everyone who loves all animals is loved by someone”
∀x [∀y Animal(y) ⇒ Loves(x, y)] ⇒ [∃y Loves(y, x)]
4. Skolemize
– ∀x [Animal(A) ∧ ¬Loves(x, A)] ∨ Loves(B, x)   (incorrect: the Skolem term must depend on x)
– ∀x [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
5. Drop universal quantifiers
– [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ Loves(G(x), x)
6. Apply the distribution law (∨ over ∧)
– [Animal(F(x)) ∨ Loves(G(x), x)] ∧ [¬Loves(x, F(x)) ∨ Loves(G(x), x)]
Resolution with Unification
◼ The resolution rule for first-order logic is simply a
lifted version of the propositional resolution rule:
– Two clauses, which are assumed to share no variables, can
be resolved if they contain complementary literals.
– First-order literals are complementary if one unifies with
the negation of the other.
– We have:
l1 ∨ … ∨ lk,   m1 ∨ … ∨ mn
⊢ SUBST(θ, l1 ∨ … ∨ li−1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj−1 ∨ mj+1 ∨ … ∨ mn)
where UNIFY(li, ¬mj) = θ


Resolution with Unification
◼ Example:
– We can resolve the following two clauses:
[Animal(F(x)) ∨ Loves(G(x), x)] and [¬Loves(u, v) ∨ ¬Kills(u, v)]
– By eliminating the complementary literals:
Loves(G(x), x) and ¬Loves(u, v)
– With unifier: θ = {u/G(x), v/x}
– To produce the resolvent clause: [Animal(F(x)) ∨ ¬Kills(G(x), x)]
◼ This rule is called the binary resolution rule because
it resolves exactly two literals.
– The full resolution rule resolves subsets of literals in each
clause that are unifiable.
Resolution: Example Knowledge Base
(The slide’s figure is not included in this export.)
Convert to FOL, Then to CNF
1. John likes all kinds of food
2. Apples are food.
3. Chicken is food.
4. Anything that anyone eats and isn’t killed by is
food.
5. Bill eats peanuts and is still alive.
6. Sue eats everything Bill eats.
Prove Using Resolution
1. John likes peanuts.

2. Sue eats peanuts.

3. Sue eats apples.

4. What does Sue eat?

• Translate to Sue eats X

• Result is a valid binding for X in the proof

Another Example
◼ Steve only likes easy courses
◼ Science courses are hard
◼ All the courses in the basket weaving
department are easy
◼ BK301 is a basket weaving course
◼ What course would Steve like?

Thoughts on Resolution
◼ Resolution is sound and complete.
◼ Strategies (heuristics) for efficient resolution include:
– Unit preference: may be incomplete
» Prefer resolutions in which one of the sentences is a unit clause.
» Favors inferences that produce shorter clauses.
– Set of support: may be incomplete
» Identify “useful” resolutions and ignore the rest.
» Every resolution must involve at least one element of a special set of clauses (the set of support).
– Input resolution: complete for Horn KBs
» Every resolution combines one of the input sentences (from the KB or the query) with some other sentence – used in the last example.
» In Horn KBs, Modus Ponens is a form of this strategy.
– Subsumption: complete
» Eliminate all sentences that are subsumed by (i.e., more specific than) an existing sentence in the KB.
» Keeps the KB small and thus helps keep the search space small.
Resolution Summary
◼ Resolution is a single inference rule that is both sound and complete.
◼ Every sentence in the KB is represented in clause form;
– any sentence in FOL can be reduced to this clause form.
◼ Two sentences can be resolved if one contains a positive literal and the other contains a matching negative literal:
– the result is a new sentence that is the disjunction of the remaining literals of both.
◼ Resolution can be used as a relatively efficient theorem prover by adding to the KB the negation of the sentence to be proved and attempting to derive a contradiction.
SUMMARY
◼ Inference is the process of adding information to the
knowledge base.
◼ We want inference to be:
– Sound: what we add is true if the KB is true.
– Complete: if the KB entails a sentence we can derive it.
◼ Unification identifies appropriate substitutions for variables in first-order proofs, making the process more efficient in many cases.
◼ Forward chaining, backward chaining, and resolution
are typical inference mechanisms for first order logic.