
Logical Agents

CS335 Introduction to AI
Logical Agents

Francisco Iacobelli

Department of Computer Science


Northeastern Illinois University

July 21, 2021



Logical Agents
Example

Minesweeper

Hunt the Wumpus: a slightly modified version of the original (Gregory Yob, circa 1973)



Basic Actions
Ask/Tell

A knowledge base keeps track of what the agent knows, as a set of sentences


We can TELL it facts or ASK for inference
For example:
TELL: Father of John is Bob
TELL: Jane is John’s sister
TELL: John’s Father is the same as John’s sister’s father
ASK: Who’s Jane’s Father?

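A toy sketch of this interface (my own encoding, not a real knowledge-base system): facts are stored as (relation, subject, object) triples, and the third TELL above, that John's father is also his sister's father, is hard-coded as the rule inside ask_father().

class ToyKB:
    def __init__(self):
        self.facts = set()                    # (relation, subject, object) triples

    def tell(self, relation, subj, obj):
        self.facts.add((relation, subj, obj))

    def ask_father(self, person):
        for rel, s, o in self.facts:          # direct fact?
            if rel == "father" and s == person:
                return o
        for rel, s, o in self.facts:          # rule: my father is my sibling's father
            if rel == "sister" and s == person:
                return self.ask_father(o)
        return None

kb = ToyKB()
kb.tell("father", "John", "Bob")      # TELL: the father of John is Bob
kb.tell("sister", "Jane", "John")     # TELL: Jane is John's sister
print(kb.ask_father("Jane"))          # ASK: who is Jane's father?  ->  Bob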


Knowledge Based Agents
Concepts

Knowledge Base
Knowledge representation language
Inference
Have background knowledge about the world



Knowledge Based Agents
Concepts

At every step, the agent:
Constructs a sentence asserting what it has perceived (TELL)
Constructs a sentence asking what action to do next (ASK)
Constructs a sentence asserting that the chosen action was taken (TELL)

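A skeleton of this loop (modeled on the generic knowledge-based agent; the make_* helpers and the kb object with tell/ask methods are placeholders, not a concrete implementation):

def kb_agent_step(kb, percept, t, make_percept_sentence, make_action_query, make_action_sentence):
    """One step of a generic knowledge-based agent at time t."""
    kb.tell(make_percept_sentence(percept, t))   # assert what was perceived
    action = kb.ask(make_action_query(t))        # ask which action to take next
    kb.tell(make_action_sentence(action, t))     # assert that the action was taken
    return action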


Wumpus World
A Dangerous Grid

Adjacent rooms are connected (horizontally or vertically)


Lurking in the cave is the Wumpus
Player can smell the Wumpus (stench) from an adjacent room
Player feels a breeze if a pit is in an adjacent room
Player can shoot ONE arrow at (and kill) the Wumpus
Some rooms contain pits that will trap the player
One room contains a pot of gold (Yay!)





Wumpus World
Formulation

PEAS:
Performance measure: +1000 for climbing out with the gold; -1000 for
dying; -1 for each action, -10 for using the arrow
Environment: a 4 × 4 grid. The agent starts at [1,1]; the gold and the pits are
distributed randomly, etc.
Actuators: the agent can move forward, turn left or right, grab, shoot
Sensors: [Stench, Breeze, Glitter, Bump, Scream]



Wumpus: Inference
Starting State

[None,None,None,None,None]



Wumpus: Inference
Next State

[None,Breeze,None,None,None]



Wumpus: Inference
Third move’s state

[Stench,None,None,None,None]



Wumpus: Inference
Fifth move’s state

[Stench,Breeze,Glitter,None,None]



Logic
The Language we use to solve these kinds of problems

A logic has a syntax, which specifies the well-formed sentences: e.g. x + 4 = 6; 4x = 6 + 7


Semantics define the truth of a sentence in each possible world
A model describes a possible world: an assignment of values to the
relevant variables. M(α) is the set of all models (variable assignments)
in which sentence α is true
If sentence α is true in model m, we say that m satisfies α



Example

Consider the Simpsons family: Homer (H) is married to Marge (M) and
they have three children: Lisa (L), Bart (B) and Maggie (G).

Say α is the sentence Xi daughterOf Xj, where Xi and Xj range over the Simpsons.

The possible models (ordered pairs) are:
{(H, M), (H, L), (H, B), (H, G), (M, H), (M, B), . . . , (L, H), (L, M), (L, L), (L, B), . . . , (G, G)}

Example models m:
m1 = (H, M); under m1, α reads H daughterOf M (false).
m2 = (L, H); under m2, α reads L daughterOf H (true).
The models that satisfy α (A.K.A. make α true): M(α) = {(L, H), (L, M), (G, H), (G, M)}

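A small sketch of this model enumeration (my own encoding: models are ordered pairs and alpha() is the sentence "Xi daughterOf Xj"):

from itertools import product

simpsons = ["H", "M", "L", "B", "G"]     # Homer, Marge, Lisa, Bart, Maggie
daughters = {"L", "G"}                   # Lisa and Maggie are daughters
parents = {"H", "M"}                     # Homer and Marge are the parents

models = list(product(simpsons, repeat=2))   # all 25 possible ordered pairs (Xi, Xj)

def alpha(m):
    """Xi daughterOf Xj: Xi is a daughter and Xj is one of her parents."""
    xi, xj = m
    return xi in daughters and xj in parents

M_alpha = [m for m in models if alpha(m)]
print(M_alpha)   # [('L', 'H'), ('L', 'M'), ('G', 'H'), ('G', 'M')]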



Logic
Entailment

Entailment: a sentence β follows from (is entailed by) a sentence α, written α |= β


α |= β iff in every model where α is true, β is also true.
α |= β iff M(α) ⊆ M(β).
examples:
1 (x = 0) |= (xy = 0)
2 (p = True) |= (p ∨ q)
3 (p ∧ q) |= (p ∨ q)
4 ((q ⇒ p) ∨ r ) |= (q ⇒ p)
5 (q ⇒ p) |= ((q ⇒ p) ∨ r )?
6 x, y s.t. x, y ∈ {The Office} ∧ x ≠ y |= x, y s.t. x is a coworker of y
7 False |= True

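A quick machine check of example 3 above, (p ∧ q) |= (p ∨ q), by enumerating all models (a minimal sketch; the helper entails() and the lambda encoding are my own):

from itertools import product

def entails(alpha, beta, symbols):
    """alpha |= beta iff beta is true in every model in which alpha is true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if alpha(model) and not beta(model):
            return False
    return True

alpha = lambda m: m["p"] and m["q"]          # p ∧ q
beta  = lambda m: m["p"] or m["q"]           # p ∨ q
print(entails(alpha, beta, ["p", "q"]))      # True:  (p ∧ q) |= (p ∨ q)
print(entails(beta, alpha, ["p", "q"]))      # False: (p ∨ q) does not entail (p ∧ q)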


Using Entailment to Get Answers
Is Mary a good roommate?

Mary can be described by cleanliness and dependability, each scored from 1 (very) to 3 (not at all).1

Friends told me that Mary is not messy and that she is always dependable. I also
know that good roommates score 1 or 2 on both cleanliness and dependability.
Does KB (what I know about Mary) |= α1 (Mary is a good roommate)?

[Figure: M(KB) = {(1,1), (2,1)} is contained in M(α1) = {(1,1), (1,2), (2,1), (2,2)}, so KB |= α1.]

1 For example, a 1 on cleanliness means she is very clean; a 3 means she is not clean at all. (a, b) means a score of a on cleanliness and b on dependability.
Using Entailment to Get Answers
Is John a good roommate?

John can also be described by cleanliness and dependability, each scored from 1 (very) to 3 (not at all).2

Through friends, I know that John is not messy and that he is not always dependable. I
also know that good roommates score 1 or 2 on both cleanliness and dependability.
Does KB (what I know about John) |= α1 (John is a good roommate)?

[Figure: M(KB) = {(1,2), (1,3), (2,2), (2,3)} is not contained in M(α1) = {(1,1), (1,2), (2,1), (2,2)}, so KB does not entail α1.]

2 For example, a 1 on cleanliness means he is very clean; a 3 means he is not clean at all. (a, b) means a score of a on cleanliness and b on dependability.
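Both questions above reduce to checking whether M(KB) is contained in M(α1). A small sketch (my own encoding of the scores: "not messy" as cleanliness 1 or 2, "always dependable" as dependability 1, "not always dependable" as dependability 2 or 3):

from itertools import product

scores = [1, 2, 3]
all_models = set(product(scores, scores))        # (cleanliness, dependability) pairs

good_roommate = {(c, d) for c, d in all_models if c <= 2 and d <= 2}   # M(α1)

mary = {(c, d) for c, d in all_models if c <= 2 and d == 1}            # M(KB) for Mary
john = {(c, d) for c, d in all_models if c <= 2 and d >= 2}            # M(KB) for John

print(mary <= good_roommate)   # True:  KB |= α1, Mary is a good roommate
print(john <= good_roommate)   # False: KB does not entail α1 for John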
Logic
Answer me this: is it true that there are no pits in [1,2]?

Model the presence of pits in squares [1,2], [2,2] and [3,1]. α1 is the sentence
"there is no pit in [1,2]". Then, does KB |= α1?

In every model in which KB is true, α1 is also true, so KB |= α1


Logic
Answer me this: is it true that there are no pits in [2,2]?

Model the presence of pits in squares [1,2], [2,2] and [3,1], with α2 the sentence
"there is no pit in [2,2]". Does KB |= α2?

In every model in which KB is true, α2 is not necessarily true, so KB does not entail α2.



Propositional Logic
Syntax

Let S = Sentence, AS = Atomic Sentence, CS = Complex Sentence


S  → AS | CS
AS → True | False | P | Q | R | . . .
CS → (S) | [S]
   | ¬S
   | S ∧ S
   | S ∨ S
   | S ⇒ S
   | S ⇔ S

Operator precedence3: ¬, ∧, ∨, ⇒, ⇔
Is the sentence (P ∨ Q) ∧ R valid?

3 Otherwise the grammar is ambiguous

Syntax
Using The Grammar, generation

One derivation, expanding nonterminals step by step:

CS
S ∧ S
(S) ∧ AS
(CS) ∧ R
(S ∨ S) ∧ R
(AS ∨ AS) ∧ R
(P ∨ Q) ∧ R

Reading off the leaves of the corresponding parse tree we get (P ∨ Q) ∧ R



Syntax
Using the Grammar, validation

Is P ∧ Q ∨ ¬R a valid sentence?

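A minimal sketch of a checker for this grammar (my own code: the tokenizer, the parse_* helpers and the left-to-right grouping of repeated binary operators are implementation choices, not part of the slides). It parses a string using the precedence ¬, ∧, ∨, ⇒, ⇔, raises an error for sentences that are not well formed, and shows how P ∧ Q ∨ ¬R groups as (P ∧ Q) ∨ (¬R).

import re

TOKEN = re.compile(r"\s*(True|False|[A-Z][0-9,]*|[()\[\]]|¬|∧|∨|⇒|⇔)")

def tokenize(text):
    text = text.strip()
    pos, tokens = 0, []
    while pos < len(text):
        m = TOKEN.match(text, pos)
        if not m:
            raise ValueError(f"not a valid sentence at: {text[pos:]!r}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def parse(text):
    """Return the parse tree as nested tuples, or raise ValueError."""
    tree, rest = parse_iff(tokenize(text))
    if rest:
        raise ValueError(f"trailing tokens: {rest}")
    return tree

def parse_binary(tokens, op, next_level):
    # one precedence level: a chain of `op` over the next (tighter) level
    left, rest = next_level(tokens)
    while rest and rest[0] == op:
        right, rest = next_level(rest[1:])
        left = (op, left, right)
    return left, rest

def parse_iff(tokens):     return parse_binary(tokens, "⇔", parse_implies)
def parse_implies(tokens): return parse_binary(tokens, "⇒", parse_or)
def parse_or(tokens):      return parse_binary(tokens, "∨", parse_and)
def parse_and(tokens):     return parse_binary(tokens, "∧", parse_not)

def parse_not(tokens):
    if tokens and tokens[0] == "¬":
        inner, rest = parse_not(tokens[1:])
        return ("¬", inner), rest
    return parse_atom(tokens)

def parse_atom(tokens):
    if not tokens:
        raise ValueError("unexpected end of sentence")
    tok, rest = tokens[0], tokens[1:]
    if tok in ("(", "["):                      # (S) or [S]
        inner, rest = parse_iff(rest)
        if not rest or rest[0] != {"(": ")", "[": "]"}[tok]:
            raise ValueError("unbalanced brackets")
        return inner, rest[1:]
    if tok in (")", "]", "¬", "∧", "∨", "⇒", "⇔"):
        raise ValueError(f"unexpected token {tok!r}")
    return tok, rest                           # True, False or a proposition symbol

print(parse("P ∧ Q ∨ ¬R"))    # ('∨', ('∧', 'P', 'Q'), ('¬', 'R')): valid, groups as (P ∧ Q) ∨ ¬R
print(parse("(P ∨ Q) ∧ R"))   # ('∧', ('∨', 'P', 'Q'), 'R')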


Exercise
Are these valid sentences in the previous grammar?

That is, can you find a parse tree that generates these sentences?
R ∧ True ↔ P
R ∧ PQ ∨ True
([P ∧ ¬Q] ∧ R) ∨ ¬Q
True
¬P ∧ ¬¬Q



Semantics

Sentences have a Truth value with respect to a model


For example: m = {P1,2 = false, P2,2 = false, P2,1 = true}
Or: m = {P1,2 = false, P2,2 = true, P2,1 = false}
P1,2 is just a symbol. It can mean anything.
Truth value is computed recursively according to...



Semantics
Basic Rules

¬P is true if P is false in m (negation)


P ∧ Q is true iff both P and Q are true in m (conjunction)
P ∨ Q is true iff either P or Q are true in m (disjunction)
P ⇒ Q is true unless P is true and Q is false (implication)4
P ↔ Q is true iff P and Q are both true or both false5
(biconditional)

4 If P is true, I claim that Q is true; otherwise, no claim.
5 If P is true, I claim that Q is true; if P is false, I claim that Q is false; otherwise, no claim.
Semantics
Can be expressed as Truth tables

Example:
P   Q   P ∧ Q
T   T     T
T   F     F
F   T     F
F   F     F



Semantics
Evaluating

In the model m = {P1,2 = false, P2,2 = false, P2,1 = true}:

Evaluate ¬P1,2 ∧ P2,2 ∨ P2,1

Evaluate it for m = {P1,2 = true, P2,2 = true, P2,1 = false}

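A minimal sketch of this recursive evaluation (my own nested-tuple encoding of sentences; the symbol P1,2 is written "P12"):

def pl_true(sentence, model):
    """Recursively compute the truth value of a sentence in a model."""
    if isinstance(sentence, str):                # a proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == "or":
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == "implies":                          # true unless premise true and conclusion false
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown operator {op!r}")

# ¬P1,2 ∧ P2,2 ∨ P2,1, grouped by precedence as ((¬P1,2) ∧ P2,2) ∨ P2,1
sentence = ("or", ("and", ("not", "P12"), "P22"), "P21")

print(pl_true(sentence, {"P12": False, "P22": False, "P21": True}))    # True
print(pl_true(sentence, {"P12": True,  "P22": True,  "P21": False}))   # False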


A Simple KB
Definitions

Px,y is true if there’s a pit in [x, y ]


Wx,y is true if there is a Wumpus in [x, y ]
Bx,y is true if the agent perceives a breeze in [x, y ]
Sx,y is true if the agent perceives a stench in [x, y ]



A simple KB
Rules

For the Wumpus world in general.


R1 : ¬P1,1
R2 : B1,1 ↔ (P1,2 ∨ P2,1 )
R3 : B2,1 ↔ (P1,1 ∨ P2,2 ∨ P3,1 )

Now, after visiting [1,1]; [1,2] and [2,1]


R4 : ¬B1,1
R5 : B2,1

KB = R1 ∧ R2 ∧ R3 ∧ R4 ∧ R5



Inference
Goal

I want to find whether my KB says there’s no pit in [1,2]

That is, does KB |= ¬P1,2 ?

We say that ¬P1,2 is a sentence α

Main goal: decide whether KB |= α

α can be a much more complex query



Inference
Simple Method

enumerate the models


for each model, check that:
if KB is true in it, α also has to be true

In the Wumpus world: 7 relevant symbols:


B1,1 , B2,1 , P1,1 , P1,2 , P2,1 , P2,2 , P3,1
2^7 = 128 models; KB is true in only 3 of them



Inference
All Possible Models

Truth tables for inference


B1,1 B2,1 P1,1 P1,2 P2,1 P2,2 P3,1 R1 R2 R3 R4 R5 KB
false false false false false false false true true true true false false
false false false false false false true true true false true false false
... ... ... ... ... ... ... ... ... ... ... ... ...
false true false false false false false true true false true true false
false true false false false false true true true true true true true
false true false false false true false true true true true true true
false true false false false true true true true true true true true
false true false false true false false true false false true true false
... ... ... ... ... ... ... ... ... ... ... ... ...
true true true true true true true false true true false true false

Enumerate rows (different assignments to symbols),


if KB is true in a row, check that α is too



Inference
All Possible Models

Using the same truth table: does KB |= ¬P1,1 ?

Enumerate rows (different assignments to symbols);
if KB is true in a row, check that α is too


Inference
TT-Entails

function TT-Entails(KB,q) //q is the query in prop. logic


symbols=list of the proposition symbols in KB and q
return TT-Check-All(KB,q,symbols,{})

function TT-Check-All(KB,q,symbols,model)
if isEmpty(symbols)
if PL-True(KB,model)
return PL-True(q,model)
else
return true // if KB is false, always return true
else do
P=First(symbols)
rest=Rest(symbols)
return (TT-Check-All(KB,q,rest,model + {P=true}) AND
        TT-Check-All(KB,q,rest,model + {P=false}))

function PL-True(sentence, model)


//returns true if sentence holds within the model

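A runnable sketch of the same procedure (my own encoding: it enumerates models with itertools instead of recursing, uses a compact n-ary variant of the pl_true evaluator sketched earlier, and spells R1 to R5 from the simple KB above as nested tuples):

from itertools import product

def pl_true(s, m):
    if isinstance(s, str):
        return m[s]
    op, *args = s
    if op == "not": return not pl_true(args[0], m)
    if op == "and": return all(pl_true(x, m) for x in args)
    if op == "or":  return any(pl_true(x, m) for x in args)
    if op == "iff": return pl_true(args[0], m) == pl_true(args[1], m)
    raise ValueError(op)

def tt_entails(kb, query, symbols):
    """KB |= query iff query is true in every model that makes KB true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if pl_true(kb, model) and not pl_true(query, model):
            return False
    return True

# R1..R5: the situation after perceiving no breeze in [1,1] and a breeze in [2,1]
KB = ("and",
      ("not", "P11"),                               # R1
      ("iff", "B11", ("or", "P12", "P21")),         # R2
      ("iff", "B21", ("or", "P11", "P22", "P31")),  # R3
      ("not", "B11"),                               # R4
      "B21")                                        # R5

symbols = ["B11", "B21", "P11", "P12", "P21", "P22", "P31"]
print(tt_entails(KB, ("not", "P12"), symbols))   # True:  KB |= ¬P1,2
print(tt_entails(KB, ("not", "P22"), symbols))   # False: KB does not entail ¬P2,2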


Inference
Model Checking Complexity

if KB and α contain n binary symbols in all:

Time complexity: O(2^n)

Space complexity: O(n) because it is depth first.

Propositional entailment is co-NP-complete (so probably no easier than
NP-complete problems)



Inference
Logical Equivalences 6
Two sentences are logically equivalent iff true in same models:
α ≡ β if and only if α |= β and β |= α

(α ∧ β) ≡ (β ∧ α) commutativity of ∧
(α ∨ β) ≡ (β ∨ α) commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ)) associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ)) associativity of ∨
¬(¬α) ≡ α double-negation elimination
(α ⇒ β) ≡ (¬β ⇒ ¬α) contraposition
(α ⇒ β) ≡ (¬α ∨ β) implication elimination
(α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α)) biconditional elimination
¬(α ∧ β) ≡ (¬α ∨ ¬β) De Morgan
¬(α ∨ β) ≡ (¬α ∧ ¬β) De Morgan
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ)) distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ)) distributivity of ∨ over ∧

6 There are many more, but these are the main ones
Inference By Theorem Proving
Concepts

Logical Equivalence: α ≡ β iff α |= β and β |= α


Validity: a valid sentence (tautology) is true in all models, e.g. P ∨ ¬P
Deduction theorem: α |= β iff the sentence (α ⇒ β) is valid
Satisfiability: a sentence is satisfiable if some model makes it true



Inference By Theorem Proving
Proofs

Modus Ponens

α ⇒ β, α
β
If α implies β and α is true, then β is true

And Elimination

α∧β
α



Inference by Theorem Proving
Proofs

Other rules can also be inference rules

α ⇐⇒ β
(α ⇒ β) ∧ (β ⇒ α)

(α ⇒ β) ∧ (β ⇒ α)
α ⇐⇒ β



Inference
In our Wumpus World: Is there a pit in 1,2?

R1 : ¬P1,1
R2 : B1,1 ⇐⇒ (P1,2 ∨ P2,1 )
R3 : B2,1 ⇐⇒ (P1,1 ∨ P2,2 ∨ P3,1 )
R4 : ¬B1,1
R5 : B2,1




Inference
Applied to the Wumpus World

We have KB = R1 ∧ R2 ∧ R3 ∧ R4 ∧ R5 . We want to prove ¬P1,2

R6 : (B1,1 ⇒ (P1,2 ∨ P2,1 )) ∧ ((P1,2 ∨ P2,1 ) ⇒ B1,1 ) by biconditional elimination on R2
R7 : ((P1,2 ∨ P2,1 ) ⇒ B1,1 ) by And-Elimination on R6
R8 : (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1 )) by contraposition
R9 : ¬(P1,2 ∨ P2,1 ) by Modus Ponens with R8 and R4
R10 : ¬P1,2 ∧ ¬P2,1 by De Morgan on R9
That is: neither [1,2] nor [2,1] contains a pit.



Inference
As Search

Initial state: the initial knowledge base


Actions: all the inference rules, applied to all sentences
that match the top half of a rule
Result: add the sentence in the bottom half of the inference rule
Goal: a state that contains the sentence we want to prove



Inference
By Resolution

Let’s say the agent returns to [1,1] from [2,1] and then goes to [1,2]

We add:

R11 : ¬B1,2
R12 : B1,2 ⇐⇒ (P1,1 ∨ P2,2 ∨ P1,3 )



Inference
By Resolution

We can continue using the same process as before.


R13 : ¬P2,2 (from R11 and R12: biconditional elimination, And-Elimination, contraposition, Modus Ponens)
R14 : ¬P1,3 (same as above)
R15 : P1,1 ∨ P2,2 ∨ P3,1 (biconditional elimination on R3 and Modus Ponens with R5)
And the literal ¬P2,2 in R13 resolves with P2,2 in R15 to give the
resolvent
R16 : P1,1 ∨ P3,1
more generally...
A ∨ B, ¬A ∨ C
B∨C
Anything else that resolves?




Resolution
Conjunctive Normal Form (CNF)

Every sentence in propositional logic can be expressed as a
conjunction of disjunctions of literals.

e.g. (A ∨ B) ∧ (¬C ∨ D ∨ ¬E) ∧ . . .

B1,1 ⇐⇒ (P1,2 ∨ P2,1 ) in CNF?


Eliminate ⇐⇒ replacing α ⇐⇒ β with (α ⇒ β) ∧ (β ⇒ α)
(B1,1 ⇒ (P1,2 ∨ P2,1 )) ∧ ((P1,2 ∨ P2,1 ) ⇒ B1,1 )
Eliminate ⇒ by replacing α ⇒ β with ¬α ∨ β
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬(P1,2 ∨ P2,1 ) ∨ B1,1 )
Symbol ¬ should appear next to each literal: DeMorgan
¬(α ∨ β) ≡ ¬α ∧ ¬β and ¬(α ∧ β) ≡ ¬α ∨ ¬β
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ ((¬P1,2 ∧ ¬P2,1 ) ∨ B1,1 )
Distribute ∨ over ∧ and flatten
(¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬P1,2 ∨ B1,1 ) ∧ (¬P2,1 ∨ B1,1 )
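The conversion can be sanity-checked mechanically, for example with sympy's to_cnf (assuming sympy is installed; B11, P12, P21 stand in for B1,1, P1,2, P2,1):

from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")
sentence = Equivalent(B11, P12 | P21)     # B1,1 ⇔ (P1,2 ∨ P2,1)
print(to_cnf(sentence))                   # the same three clauses as above (ordering may differ)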
Resolution
An algorithm

The algorithm works by proof by contradiction (refutation).

To show KB |= α we show that KB ∧ ¬α is not satisfiable:

apply resolution to KB ∧ ¬α in CNF,

resolving pairs of clauses with complementary literals


l1 ∨ ... ∨ lk ,    m1 ∨ ... ∨ mn
l1 ∨ ... ∨ li−1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj−1 ∨ mj+1 ∨ ... ∨ mn
if li and mj are complementary literals

and adding the new clauses,

until either:
there are no new clauses to be added (then KB does not entail α), or
two clauses resolve to the empty clause (then KB |= α)
Resolution
An algorithm

function PL-Resolution(KB,q)
// KB, the knowledge base. a sentence in prop logic.
// q, the query, a sentence in prop logic
clauses= contra(KB,q) //CNF representation of KB ∧ ¬q
new = {}
do
for each pair of clauses Ci,Cj in clauses
resolvents=PL-Resolve(Ci,Cj)
if resolvents contains the empty clause
return true
new = new + resolvents
if new is subset of clauses
return false
clauses = clauses + new

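A runnable sketch of this procedure (my own encoding: clauses are frozensets of string literals, with "~" marking negation; it checks the no-breeze-in-[1,1] example of the next slide):

from itertools import combinations

def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """All resolvents of two clauses (possibly including the empty clause)."""
    resolvents = set()
    for lit in ci:
        if complement(lit) in cj:
            resolvents.add(frozenset((ci - {lit}) | (cj - {complement(lit)})))
    return resolvents

def pl_resolution(clauses):
    """True iff the clause set is unsatisfiable (the empty clause is derivable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            resolvents = resolve(ci, cj)
            if frozenset() in resolvents:
                return True
            new |= resolvents
        if new <= clauses:        # no new clauses: resolution cannot refute the set
            return False
        clauses |= new

# KB ∧ ¬α: the CNF of R2, plus R4, plus the negated query ¬(¬P1,2) = P1,2
clauses = [frozenset(c) for c in (
    {"~B11", "P12", "P21"}, {"~P12", "B11"}, {"~P21", "B11"},   # CNF of R2
    {"~B11"},                                                    # R4
    {"P12"},                                                     # ¬α
)]
print(pl_resolution(clauses))   # True, so KB |= ¬P1,2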


Resolution
Algorithm

Say the agent is in [1,1] and perceives no breeze, so there should be no pit in the adjacent squares.


α = ¬P1,2
KB = R2 ∧ R4
KB = (B1,1 ⇐⇒ (P1,2 ∨ P2,1 )) ∧ ¬B1,1
KB ∧ ¬α = (¬P2,1 ∨ B1,1 ) ∧ (¬B1,1 ∨ P1,2 ∨ P2,1 ) ∧ (¬P1,2 ∨ B1,1 ) ∧ (¬B1,1 ) ∧ (P1,2 )



Inference
Forward and Backward Chaining

Horn Form
KB conjunction of Horn clauses
Horn Clause (at most one literal is Positive7 )
For example: (¬P ∨ ¬Q ∨ V ) is a Horn Clause.
so is (¬P ∨ ¬W ). But, (¬P ∨ Q ∨ V ) is not.
Definite Clauses: exactly one literal is positive.

Horn clauses can be re-written as implications


proposition symbol (fact) or
conjunction of symbols (body or premise) ⇒ symbol (head)
Example: (¬C ∨ ¬B ∨ A) becomes (C ∧ B ⇒ A)
Modus ponens for Horn KB:
α1 . . . αn , α1 ∧ . . . αn ⇒ β
β
7 Not negated
Inference
Forward Chaining

function PL-FC-Entails(KB,q)
// KB, the knowledge base, a prop. sentence
// q, the query, a prop. sentence
count = a table //count[c] is the number of symbols in c's premise
inferred = a table //inferred[s] initially false for all s
agenda = a queue of symbols //Init w/symbols that are true

while agenda is not empty


p = pop(agenda)
if p=q then return true
if inferred[p]=false
inferred[p]=true
for each clause c in KB that contains p in premise
decrement count[c]
if count[c]=0
add c.conclusion to agenda
return false



Inference
Forward Chaining

A set of Horn clauses:
P ⇒ Q
L ∧ M ⇒ P
B ∧ L ⇒ M
A ∧ P ⇒ L
A ∧ B ⇒ L
A
B

[Figure 7.16: (a) the set of Horn clauses above; (b) the corresponding AND-OR graph.]
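A runnable sketch of forward chaining on this example KB (my own encoding: rules as (premises, conclusion) pairs; it follows the count / inferred / agenda idea of PL-FC-Entails above, but it is not the slides' code):

from collections import deque

def fc_entails(rules, facts, query):
    """True if the definite-clause KB (rules plus known facts) entails query."""
    count = {i: len(premises) for i, (premises, _) in enumerate(rules)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1                 # one fewer premise left to satisfy
                if count[i] == 0:
                    agenda.append(conclusion)
    return False

rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(fc_entails(rules, ["A", "B"], "Q"))   # True
print(fc_entails(rules, ["A"], "Q"))        # False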


Inference
Forward Chaining
Forward chaining example

[Figure: successive frames of forward chaining on the AND-OR graph above. Starting
with A and B known, the premise counts on the clauses are decremented as symbols are
inferred: L (from A ∧ B ⇒ L), then M (from B ∧ L ⇒ M), then P (from L ∧ M ⇒ P),
and finally Q (from P ⇒ Q).]


Inference
Backward Chaining (B.C.)

Work backwards from the query q

To prove q by B.C.:
check if q is already known, or
prove by B.C. all the premises of some rule concluding q
Avoid loops: check whether a new subgoal is already on the goal stack
Avoid repeated work: check whether a new subgoal

has already been proved true, or
has already failed

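A minimal sketch of backward chaining on the same example KB (my own encoding; the goal stack guards against loops, while the caching of already-proved or already-failed subgoals mentioned above is omitted for brevity):

def bc_entails(rules, facts, query, stack=()):
    """Prove query by proving all premises of some rule that concludes it."""
    if query in facts:
        return True
    if query in stack:                  # subgoal already on the goal stack: avoid looping
        return False
    for premises, conclusion in rules:
        if conclusion == query and all(
                bc_entails(rules, facts, p, stack + (query,)) for p in premises):
            return True
    return False

rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(bc_entails(rules, {"A", "B"}, "Q"))   # True
print(bc_entails(rules, {"A"}, "Q"))        # False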


Inference
Backward Chaining
Backward chaining example

[Figure: successive frames of backward chaining on the same AND-OR graph, working
back from the query Q to P, then to its premises L and M, and down to the known
facts A and B.]


Inference
Backward Chaining
Backward chaining example

A B

[email protected] (@neiu.edu) AI July 21, 2021 69 / 70


Forward and Backward Chaining
Discussion

FC is data driven, e.g. object recognition, routine decisions


FC may do a lot of work that is irrelevant to the goal
BC is goal driven, appropriate for problem solving, e.g. Where is
home? What is the value of x?
The complexity of BC can be much less than linear in the size of the KB

