Unit 4 Logical Reasoning

First-order logic

Some examples of first-order logic


Ajay and Vijay both know arithmetic:
Knows(Ajay, arithmetic) ∧ Knows(Vijay, arithmetic)

All students know arithmetic:
∀x Student(x) → Knows(x, arithmetic)
Quantifiers
Universal quantification (∀):
Think conjunction: ∀x P(x) is like P(A) ∧ P(B) ∧ · · ·
Existential quantification (∃):
Think disjunction: ∃x P(x) is like P(A) ∨ P(B) ∨ · · ·
Some properties:
¬∀x P(x) equivalent to ∃x ¬P(x)
∀x ∃y Knows(x, y) different from ∃y ∀x Knows(x, y)
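These properties can be checked mechanically over a finite domain. The sketch below (the domain, the predicate P, and the Knows relation are made-up examples for illustration; real FOL domains are generally infinite) verifies the negation duality and shows that the two nested-quantifier sentences can disagree:

```python
# A finite-domain illustration of the quantifier properties above.
domain = ["A", "B", "C"]
P = {"A": True, "B": False, "C": True}

# ¬∀x P(x) is equivalent to ∃x ¬P(x)
lhs = not all(P[x] for x in domain)
rhs = any(not P[x] for x in domain)
assert lhs == rhs

# ∀x ∃y Knows(x, y) differs from ∃y ∀x Knows(x, y)
Knows = {("A", "B"), ("B", "C"), ("C", "A")}  # everyone knows someone
forall_exists = all(any((x, y) in Knows for y in domain) for x in domain)
exists_forall = any(all((x, y) in Knows for x in domain) for y in domain)
print(forall_exists, exists_forall)  # True False: no one is known by all
```

Note how ∀ maps onto `all` (conjunction) and ∃ onto `any` (disjunction), exactly as the slide suggests.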
Natural language quantifiers
Universal quantification (∀):
Every student knows arithmetic.
∀x Student(x)→Knows(x, arithmetic)

Existential quantification (∃):


Some student knows arithmetic.
∃x Student(x)∧Knows(x, arithmetic)
Some examples of first-order logic
There is some course that every student has
taken.
∃y Course(y) ∧ [∀x Student(x) → Takes(x, y)]

Every even integer greater than 2 is the sum of two primes:
∀x (EvenInt(x) ∧ Greater(x, 2)) → ∃y ∃z (Equals(x, Sum(y, z)) ∧ Prime(y) ∧ Prime(z))
Some examples of first-order logic
If a student takes a course and the course covers
a concept, then the student knows that
concept.
∀x ∀y ∀z (Student(x) ∧ Takes(x, y) ∧ Course(y) ∧ Covers(y, z)) → Knows(x, z)
Inference in first-order logic
• Inference in First-Order Logic is used to deduce new facts
or sentences from existing sentences.

Substitution:
• Substitution is a fundamental operation performed on terms and
formulas, and it occurs in all inference systems in first-order logic.
Substitution becomes more involved in the presence of quantifiers.
• If we write F[a/x], it denotes the result of substituting the
constant "a" for the variable "x" in F.
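A minimal sketch of F[a/x] in code, assuming terms are represented as strings (constants or variables) or tuples (symbol plus arguments); this representation is an illustration chosen for the sketch, not a standard API:

```python
# A minimal sketch of substitution on FOL terms: strings are constants
# or variables, tuples are (function/predicate symbol, arg1, ...).
def substitute(term, theta):
    """Apply the substitution theta ({var: value}) to a term."""
    if isinstance(term, str):
        return theta.get(term, term)
    symbol, *args = term
    return (symbol, *(substitute(a, theta) for a in args))

# F[John/x] where F = Knows(x, arithmetic):
f = ("Knows", "x", "arithmetic")
print(substitute(f, {"x": "John"}))  # ('Knows', 'John', 'arithmetic')
```

Because the function recurses through nested tuples, it also handles compound terms such as `("Knows", ("Father", "x"), "arithmetic")`.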
Inference in first-order logic
Equality:
• First-order logic does not only use predicates and terms to make
atomic sentences; it also provides equality. The equality symbol
specifies that two terms refer to the same object.
Example: Brother(John) = Smith.
• The equality symbol can also be used with negation to state that
two terms are not the same object.
Example: ¬(x = y), which is usually written x ≠ y.
FOL inference rules for quantifier

As in propositional logic, there are inference rules in first-order
logic. The following are the basic inference rules for quantifiers:
• Universal Generalization
• Universal Instantiation
• Existential Instantiation
• Existential Introduction
FOL inference rules for quantifier
1. Universal Generalization:
Universal generalization is a valid inference rule which states that if
premise P(c) is true for an arbitrary element c of the universe of
discourse, then we can conclude ∀x P(x).

It can be represented as: P(c) ⊢ ∀x P(x) (for arbitrary c)

This rule is used when we want to show that every element has the
same property.

The restriction is that c must be arbitrary: no extra assumptions may
have been made about the particular element c.

Example: Let P(c) be "a byte contains 8 bits". Since this holds for an
arbitrary byte c, ∀x P(x), "all bytes contain 8 bits", is also true.
FOL inference rules for quantifier
2. Universal Instantiation:
• Universal instantiation (also called universal elimination or UI) is
a valid inference rule. It can be applied multiple times to add new
sentences.
• The new KB is logically equivalent to the previous KB.
• By UI, we can infer any sentence obtained by substituting a ground
term for the universally quantified variable.
• The rule states that from ∀x P(x) we can infer P(c) for any ground
term c (any object in the universe of discourse).

• It can be represented as: ∀x P(x) ⊢ P(c)

FOL inference rules for quantifier
Example:
Let's take a famous example:
"All kings who are greedy are evil." Our knowledge base contains
this detail in the form of FOL:
∀x King(x) ∧ Greedy(x) → Evil(x)
From this, Universal Instantiation lets us infer any of the following:
King(John) ∧ Greedy(John) → Evil(John)
King(Richard) ∧ Greedy(Richard) → Evil(Richard)
King(Father(John)) ∧ Greedy(Father(John)) → Evil(Father(John))
FOL inference rules for quantifier
3. Existential Instantiation:
• Existential instantiation (also called existential elimination) is a
valid inference rule in first-order logic.
• It can be applied only once, to replace an existential sentence.
• The new KB is not logically equivalent to the old KB, but it is
satisfiable whenever the old KB was satisfiable.
• The rule states that from ∃x P(x) we can infer P(c) for a new
constant symbol c.
• The restriction is that the constant c must be a new term that does
not appear elsewhere in the knowledge base.

• It can be represented as: ∃x P(x) ⊢ P(c) (c a new constant)
FOL inference rules for quantifier
Example:
From the sentence ∃x Crown(x) ∧ OnHead(x, John)
we can infer Crown(K) ∧ OnHead(K, John), as long as K does not
appear elsewhere in the knowledge base.

The constant symbol K used above is called a Skolem constant.

Existential instantiation is a special case of the Skolemization
process.
FOL inference rules for quantifier
4. Existential Introduction:
• Existential introduction (also known as existential generalization)
is a valid inference rule in first-order logic.
• The rule states that if some element c of the universe of discourse
has property P, then we can infer that there exists something in the
universe which has property P.

• It can be represented as: P(c) ⊢ ∃x P(x)

Example:
"Priyanka got good marks in English."
"Therefore, someone got good marks in English."
Generalized Modus Ponens Rule
For the inference process in FOL, we have a single inference rule
called Generalized Modus Ponens. It is a lifted version of Modus
Ponens.

Modus Ponens can be summarized as: "P implies Q, and P is asserted
to be true; therefore Q must be true."

Generalized Modus Ponens applies to atomic sentences pi, pi′, and q
where there is a substitution θ such that SUBST(θ, pi′) = SUBST(θ, pi)
for every i. It can be represented as:

p1′, p2′, …, pn′, (p1 ∧ p2 ∧ … ∧ pn ⇒ q) ⊢ SUBST(θ, q)
Generalized Modus Ponens Rule
Example:
We will use this rule for "greedy kings are evil": we find some x such
that x is a king and x is greedy, so we can infer that x is evil.

Here, let us say:
p1′ is King(John)        p1 is King(x)
p2′ is Greedy(y)         p2 is Greedy(x)
θ is {x/John, y/John}    q is Evil(x)
SUBST(θ, q) is Evil(John)
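The SUBST computation in this example can be sketched in code. The `subst` helper below only replaces bound variables in atoms represented as tuples, which is enough for the ground substitution θ = {x/John, y/John} (the representation is this sketch's assumption):

```python
# The SUBST step of Generalized Modus Ponens for the example above.
def subst(atom, theta):
    """Apply substitution theta to an atom (predicate, arg1, ...)."""
    pred, *args = atom
    return (pred, *(theta.get(a, a) for a in args))

theta = {"x": "John", "y": "John"}
p1, p1_prime = ("King", "x"), ("King", "John")
p2, p2_prime = ("Greedy", "x"), ("Greedy", "y")
q = ("Evil", "x")

# SUBST(θ, pi′) = SUBST(θ, pi) for every premise, so the rule fires:
assert subst(p1_prime, theta) == subst(p1, theta)
assert subst(p2_prime, theta) == subst(p2, theta)
print(subst(q, theta))  # ('Evil', 'John')
```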
Unification in FOL
• Unification is the process of making two different logical atomic
expressions identical by finding a substitution. Unification depends
on the substitution process.
• It takes two literals as input and makes them identical using a
substitution.
• Let α and β be two atomic sentences and θ be a unifier such that
αθ = βθ; then the call is written UNIFY(α, β).
Example:
Find the MGU (most general unifier) for Unify{King(x), King(John)}.
Let α = King(x) and β = King(John).
The substitution θ = {John/x} is a unifier for these atoms: applying
it makes both expressions identical.
Unification in FOL
Conditions for Unification:
Following are some basic conditions for unification:
• The predicate symbols must be the same; atoms or expressions with
different predicate symbols can never be unified.
• The number of arguments in both expressions must be identical.
• Unification fails if a variable occurs inside the term it is being
unified with (the occurs check), e.g. x cannot unify with f(x).
Unification in FOL
1. Find the MGU of {p(f(a), g(Y)) and p(X, X)}
Sol: S0 => Here, α = p(f(a), g(Y)) and β = p(X, X)
SUBST θ = {f(a)/X}
S1 => α = p(f(a), g(Y)) and β = p(f(a), f(a))
Now g(Y) must unify with f(a), but the function symbols g and f
differ, so unification fails.

2. Find the MGU of Ψ1 = p(b, X, f(g(Z))) and Ψ2 = p(Z, f(Y), f(Y))

S0 => { p(b, X, f(g(Z))); p(Z, f(Y), f(Y))}
SUBST θ = {b/Z}
Unification in FOL
S1 => { p(b, X, f(g(b))); p(b, f(Y), f(Y))}
SUBST θ = {f(Y)/X}
S2 => { p(b, f(Y), f(g(b))); p(b, f(Y), f(Y))}
SUBST θ = {g(b)/Y}
S3 => { p(b, f(g(b)), f(g(b))); p(b, f(g(b)), f(g(b)))} Unified successfully.
And Unifier = { b/Z, f(Y)/X, g(b)/Y }.

3. UNIFY(knows(Richard, x), knows(Richard, John))
Here, Ψ1 = knows(Richard, x) and Ψ2 = knows(Richard, John)
S0 => { knows(Richard, x); knows(Richard, John)}
SUBST θ = {John/x}
S1 => { knows(Richard, John); knows(Richard, John)}, successfully
unified.
Unifier: {John/x}.
The unification algorithm. The arguments x and y can be any expression: a constant or variable, a
compound expression such as a complex sentence or term, or a list of expressions. The argument θ is a
substitution, initially the empty substitution, with {var/val} pairs added to it as we recurse through
the inputs, comparing the expressions element by element. In a compound expression such as F(A,B),
the OP(x) field picks out the function symbol F and the ARGS(x) field picks out the argument list (A,B).
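A compact Python version of this algorithm, under two conventions chosen for this sketch (not part of the original algorithm): variables are single lowercase letters ("x", "y", …), and a compound expression is a tuple whose first element is its OP symbol and whose remaining elements are its ARGS. Failure is reported as None.

```python
# A compact unification algorithm with an occurs check.
def is_var(t):
    return isinstance(t, str) and len(t) == 1 and t.islower()

def unify(x, y, theta):
    """Unify x with y, extending substitution theta; None on failure."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        # unify the OP symbols, then the argument lists element by element
        return unify(x[1:], y[1:], unify(x[0], y[0], theta))
    return None

def unify_var(var, val, theta):
    if var in theta:
        return unify(theta[var], val, theta)
    if occurs(var, val, theta):
        return None  # occurs check: x cannot unify with f(x)
    return {**theta, var: val}

def occurs(var, t, theta):
    if t == var:
        return True
    if is_var(t) and t in theta:
        return occurs(var, theta[t], theta)
    return isinstance(t, tuple) and any(occurs(var, a, theta) for a in t)

print(unify(("knows", "Richard", "x"), ("knows", "Richard", "John"), {}))
# {'x': 'John'}
```

The printed result matches example 3 above, UNIFY(knows(Richard, x), knows(Richard, John)) with unifier {John/x}.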
Example
• The law says that it is a crime for an American
to sell weapons to hostile nations. The
country Nono, an enemy of America, has
some missiles, and all of its missiles were sold
to it by Colonel West, who is American.

• Prove that Colonel West is a criminal


... it is a crime for an American to sell weapons to hostile nations:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
Nono … has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
Owns(Nono,M1) and Missile(M1)
… all of its missiles were sold to it by Colonel West:
Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
Missiles are weapons:
Missile(x) ⇒ Weapon(x)
An enemy of America counts as "hostile":
Enemy(x,America) ⇒ Hostile(x)
West, who is American …
American(West)
The country Nono, an enemy of America …
Enemy(Nono,America)
Forward Chaining
• Forward chaining is also known as forward deduction or forward
reasoning when using an inference engine.
• Forward chaining is a form of reasoning which starts with the
atomic sentences in the knowledge base and applies inference rules
(Modus Ponens) in the forward direction to extract more data until
a goal is reached.
• The forward-chaining algorithm starts from known facts, triggers
all rules whose premises are satisfied, and adds their conclusions to
the known facts. This process repeats until the problem is solved.
Properties of Forward-Chaining
• It is a bottom-up approach, as it moves from the facts up to the
conclusions.
• It is a process of reaching a conclusion from known facts or data,
starting from the initial state and working toward the goal state.
• The forward-chaining approach is also called data-driven, as we
reach the goal using the available data.
• The forward-chaining approach is commonly used in expert
systems, business rule systems, and production rule systems.
Forward Chaining Algorithm
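The algorithm can be sketched for the crime example. The full first-order algorithm matches rule premises by unification; the simplified version below grounds the rules by hand to the single missile M1 (an assumption of this sketch), so a plain set-containment loop suffices:

```python
# A simple forward chainer over ground Horn rules (crime example,
# grounded by hand to M1; the first-order version would unify instead).
rules = [
    ({"Missile(M1)"}, "Weapon(M1)"),
    ({"Missile(M1)", "Owns(Nono,M1)"}, "Sells(West,M1,Nono)"),
    ({"Enemy(Nono,America)"}, "Hostile(Nono)"),
    ({"American(West)", "Weapon(M1)", "Sells(West,M1,Nono)",
      "Hostile(Nono)"}, "Criminal(West)"),
]
facts = {"American(West)", "Missile(M1)", "Owns(Nono,M1)",
         "Enemy(Nono,America)"}

changed = True
while changed:  # trigger all satisfied rules until nothing new is added
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("Criminal(West)" in facts)  # True
```

The loop mirrors the slide's description: start from known facts, fire every rule whose premises are satisfied, add the conclusions, and repeat until a fixed point is reached.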
Forward Chaining Example

Starting from the known facts, the first round of rule firing applies:
Missile(x) ⇒ Weapon(x)    Enemy(x,America) ⇒ Hostile(x)
Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)

The second round then fires:
American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
Backward Chaining
• Backward chaining is also known as backward deduction or
backward reasoning when using an inference engine.
• A backward-chaining algorithm is a form of reasoning which
starts with the goal and works backward, chaining through rules to
find known facts that support the goal.
Properties of Backward Chaining
• It is known as a top-down approach.
• Backward chaining is based on the Modus Ponens inference rule.
• In backward chaining, the goal is broken into sub-goals to prove
the facts true.
• It is called a goal-driven approach, as the list of goals decides
which rules are selected and used.
• The backward-chaining algorithm is used in game theory,
automated theorem provers, inference engines, proof assistants, and
various AI applications.
• The backward-chaining method mostly uses a depth-first search
strategy for proofs.
Backward Chaining Algorithm
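A goal-driven, depth-first sketch of the algorithm over the same hand-grounded crime rules (the encoding of rules and facts is this sketch's assumption, not the full first-order algorithm with unification): to prove a goal, either find it among the facts, or find a rule concluding it and recursively prove each premise.

```python
# A depth-first backward chainer over ground Horn rules.
rules = {
    "Weapon(M1)": [{"Missile(M1)"}],
    "Sells(West,M1,Nono)": [{"Missile(M1)", "Owns(Nono,M1)"}],
    "Hostile(Nono)": [{"Enemy(Nono,America)"}],
    "Criminal(West)": [{"American(West)", "Weapon(M1)",
                        "Sells(West,M1,Nono)", "Hostile(Nono)"}],
}
facts = {"American(West)", "Missile(M1)", "Owns(Nono,M1)",
         "Enemy(Nono,America)"}

def prove(goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:  # guard against looping on repeated sub-goals
        return False
    return any(all(prove(p, seen | {goal}) for p in premises)
               for premises in rules.get(goal, []))

print(prove("Criminal(West)"))  # True
```

Note how the goal Criminal(West) is broken into the sub-goals Weapon(M1), Sells(West,M1,Nono), and Hostile(Nono), matching the goal-driven description above.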
Backward Chaining Example
About Backward Chaining
• Uses composition of substitutions
• A depth-first search (DFS) algorithm
– Space linear in the size of the proof
– Suffers from repeated states and incompleteness
Resolution

• Resolution proves that KB ⊨ α by proving that KB ∧ ¬α is
unsatisfiable, that is, by deriving the empty clause.
• Resolution is a theorem-proving technique that proceeds by
building refutation proofs, i.e., proofs by contradiction. It was
invented by the mathematician John Alan Robinson in 1965.
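Refutation by resolution can be illustrated at the propositional level (first-order resolution additionally unifies complementary literals, which this sketch omits). Clauses are sets of literals, with "-" marking negation:

```python
# A sketch of propositional resolution by refutation.
def resolve(c1, c2):
    """All resolvents of two clauses (sets of literals like 'P', '-P')."""
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("-") else "-" + lit
        if comp in c2:
            out.append((c1 - {lit}) | (c2 - {comp}))
    return out

def refutes(clauses):
    """True if the clause set is unsatisfiable (empty clause derivable)."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True  # empty clause derived
                    new.add(frozenset(r))
        if new <= clauses:
            return False  # no new resolvents: satisfiable
        clauses |= new

# KB: P ⇒ Q (i.e. ¬P ∨ Q) and P; negated query ¬Q.
# KB ∧ ¬Q is unsatisfiable, so KB ⊨ Q.
print(refutes([{"-P", "Q"}, {"P"}, {"-Q"}]))  # True
```

Deriving the empty clause from KB ∧ ¬α is exactly the criterion stated above for KB ⊨ α.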
Resolution
The first step is to convert sentences to conjunctive
normal form (CNF)—that is, a conjunction of clauses,
where each clause is a disjunction of literals.
In CNF, literals can contain variables, which are assumed
to be universally quantified
• For example, the sentence
∀x, y, z American(x)∧Weapon (y)∧Sells(x,y, z)∧Hostile(z) ⇒
Criminal(x)
becomes, in CNF,
¬American(x)∨¬Weapon(y)∨¬Sells(x,y,z) ∨ ¬ Hostile(z)
∨ Criminal(x)
Steps
Eliminate implications
Replace P ⇒Q with ¬P∨Q
Move ¬ inwards
¬∀x p becomes ∃x ¬p
¬∃x p becomes ∀x ¬p
Standardize variables: For sentences like (∃xP(x)) ∨
(∃xQ(x)) that use the same variable name twice,
change the name of one of the variables. This avoids
confusion later when we drop the quantifiers. Thus,
we have
∀x [∃y Animal(y)∧¬Loves(x,y)]∨[∃z Loves(z,x)]
Steps
• Skolemize: Skolemization is the process of removing existential
quantifiers by replacing each existentially quantified variable with a
Skolem function of the enclosing universally quantified variables
(a Skolem constant when there are none).
• Drop universal quantifiers: At this point, all remaining variables
must be universally quantified. Therefore, we don't lose any
information if we drop the quantifier:
[Animal(F(x))∧¬Loves(x,F(x))]∨Loves(G(x),x)
• Distribute ∨ over ∧:
[Animal(F(x))∨Loves(G(x),x)]∧[¬Loves(x,F(x))∨Loves(G(x),x)]
The sentences in CNF are:
(west is criminal example revisited)
¬American(x)∨ ¬Weapon(y)∨ ¬Sells(x,y,z)∨ ¬Hostile(z)∨Criminal(x)
¬Missile(x)∨¬Owns(Nono,x)∨Sells(West,x,Nono)
¬Enemy(x,America)∨ Hostile(x)
¬Missile(x)∨ Weapon(x)
Owns(Nono,M1)
Missile(M1)
American(West)
Enemy(Nono,America).
Resolution proof (each step resolves the previous resolvent with a KB
clause; the unified literals cancel):

¬Criminal(West)                                        [negated goal]
with ¬American(x)∨¬Weapon(y)∨¬Sells(x,y,z)∨¬Hostile(z)∨Criminal(x)
  => ¬American(West)∨¬Weapon(y)∨¬Sells(West,y,z)∨¬Hostile(z)
with American(West)
  => ¬Weapon(y)∨¬Sells(West,y,z)∨¬Hostile(z)
with ¬Missile(x)∨Weapon(x)
  => ¬Missile(y)∨¬Sells(West,y,z)∨¬Hostile(z)
with Missile(M1)
  => ¬Sells(West,M1,z)∨¬Hostile(z)
with ¬Missile(x)∨¬Owns(Nono,x)∨Sells(West,x,Nono)
  => ¬Missile(M1)∨¬Owns(Nono,M1)∨¬Hostile(Nono)
with Missile(M1)
  => ¬Owns(Nono,M1)∨¬Hostile(Nono)
with Owns(Nono,M1)
  => ¬Hostile(Nono)
with ¬Enemy(x,America)∨Hostile(x)
  => ¬Enemy(Nono,America)
with Enemy(Nono,America)
  => empty clause

A resolution proof that West is a criminal. At each step, the literals
that unify cancel.
Example 2
Everyone who loves all animals is loved by
someone. Anyone who kills an animal is
loved by no one. Jack loves all animals. Either
Jack or Curiosity killed the cat, who is named
Tuna. Did Curiosity kill the cat?
First, we express the original sentences, some
background knowledge, and the negated goal G in first-
order logic
1) ∀x [∀y Animal(y) ⇒ Loves(x,y)] ⇒ [∃y Loves(y,x)]
2) ∀x [∃z Animal(z)∧Kills(x,z)] ⇒ [∀y ¬Loves(y,x)]
3) ∀x Animal(x) ⇒ Loves(Jack,x)
4) Kills(Jack,Tuna)∨Kills(Curiosity,Tuna)
5) Cat(Tuna)
6) ∀x Cat(x) ⇒ Animal(x)
7) ¬G: ¬Kills(Curiosity,Tuna)
Now we apply the conversion procedure to
convert each sentence to CNF:
1) Animal(F(x))∨Loves(G(x),x)
2) ¬Loves(x,F(x))∨Loves(G(x),x)
3) ¬Loves(y,x)∨ ¬Animal(z)∨ ¬Kills(x,z)
4) ¬Animal(x)∨Loves(Jack,x)
5) Kills(Jack,Tuna)∨Kills(Curiosity,Tuna)
6) Cat(Tuna)
7) ¬Cat(x)∨Animal(x)
8) ¬Kills(Curiosity,Tuna)
In English, the proof could be
paraphrased as follows:
Suppose Curiosity did not kill Tuna. We know
that either Jack or Curiosity did; thus Jack must
have. Now, Tuna is a cat and cats are animals, so
Tuna is an animal. Because anyone who kills an
animal is loved by no one, we know that no one
loves Jack. On the other hand, Jack loves all
animals, so someone loves him; so we have a
contradiction. Therefore, Curiosity killed the cat.
Resolution Tree
