
Artificial Intelligence

Hai Thi Tuyet Nguyen

Outline
CHAPTER 1: INTRODUCTION (CHAPTER 1)
CHAPTER 2: INTELLIGENT AGENTS (CHAPTER 2)
CHAPTER 3: SOLVING PROBLEMS BY SEARCHING (CHAPTER 3)
CHAPTER 4: INFORMED SEARCH (CHAPTER 3)
CHAPTER 5: LOGICAL AGENT (CHAPTER 7)
CHAPTER 6: FIRST-ORDER LOGIC (CHAPTER 8, 9)
CHAPTER 7: QUANTIFYING UNCERTAINTY (CHAPTER 13)
CHAPTER 8: PROBABILISTIC REASONING (CHAPTER 14)
CHAPTER 9: LEARNING FROM EXAMPLES (CHAPTER 18)
CHAPTER 5: LOGICAL AGENT

5.1 Knowledge-Based Agents
5.2 The Wumpus World
5.3 Logic
5.4 Propositional Logic
5.5 Propositional Theorem Proving
5.6 Inference Rules, Theorem Proving
5.1 Knowledge-Based Agents
● Knowledge base (KB) = a set of sentences in a formal language (i.e., knowledge
representation language)
● Declarative approach to building an agent:
○ TELL it what it needs to know
○ ASK it what to do - answers should follow from the KB

5.1 A simple knowledge-based agent.
● The agent takes a percept as input and returns an action; it maintains a knowledge base
● How it works:
○ TELLs the knowledge base what it perceives.
○ ASKs the knowledge base what action it should perform.
○ TELLs the knowledge base which action was chosen, and executes the action.

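The TELL/ASK loop above can be sketched in Python. This is a minimal, assumed design: the toy `KB` class and the `("safe", (2, 1))` sentence are hypothetical stand-ins, and ASK is reduced to a membership test rather than real entailment checking.

```python
class KB:
    """Toy knowledge base: sentences are hashable tuples."""
    def __init__(self):
        self.sentences = set()

    def tell(self, sentence):
        """TELL: add a sentence to the knowledge base."""
        self.sentences.add(sentence)

    def ask(self, query):
        """ASK: does the query follow from the KB?
        Placeholder: membership test instead of real inference."""
        return query in self.sentences

def kb_agent(kb, percept, t):
    kb.tell(("percept", tuple(percept), t))   # 1. TELL the KB what it perceives
    # 2. ASK the KB what to do; ("safe", (2, 1)) is a hypothetical sentence
    action = "Forward" if kb.ask(("safe", (2, 1))) else "NoOp"
    kb.tell(("action", action, t))            # 3. TELL the KB which action was chosen
    return action
```

For instance, after `kb.tell(("safe", (2, 1)))` the agent chooses Forward; with an empty KB it returns NoOp.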
5.2 Wumpus World PEAS description
● Performance measure:
○ +1000: climbing out of the cave with the gold
○ –1000: falling into a pit or being eaten by the wumpus
○ –1: each action taken
○ –10: using up the arrow
○ The game ends when the agent dies or climbs out of the cave
● Environment: A 4 × 4 grid of rooms.
○ Start location of the agent: the square labeled [1,1]
○ Locations of the gold and the wumpus: random
● Actuators: Move Forward, Turn Left, Turn Right, Grab, Climb, Shoot
● Sensors: the agent will perceive
○ Stench: in the square containing the monster (called wumpus) and in the directly adjacent squares
○ Breeze: in the squares directly adjacent to a pit
○ Glitter: in the square where the gold is
○ Bump: into a wall
○ Scream: anywhere in the cave when the wumpus is killed
5.2 Wumpus World PEAS description
● The first percept is [None,None,None,None,None] => the agent can conclude that its
neighboring squares, [1,2] and [2,1], are OK.

5.2 Wumpus World PEAS description
The agent decides to move forward to [2,1].

5.2 Wumpus World PEAS description
● The agent perceives a breeze (denoted by “B”) in [2,1] => there must be a pit in a
neighboring square.
● The pit cannot be in [1,1] => so there must be a pit in [2,2] or [3,1] or both.

5.2 Wumpus World PEAS description
The agent will turn around, go back to [1,1], and then proceed to [1,2].

5.2 Wumpus World PEAS description
● The agent perceives a stench in [1,2] => there must be a wumpus nearby, in [2,2] or [1,3]
● The lack of stench when the agent was in [2,1] => the wumpus cannot be in [2,2] => it must be in [1,3]
● The lack of a breeze in [1,2] => there is no pit in [2,2]
=> [2,2] is safe (OK)

5.2 Wumpus World PEAS description
When the agent draws a conclusion from the available information,
that conclusion is guaranteed to be correct if the available information is correct.

5.3 Logic
● Logics are formal languages for representing information
● Syntax defines the sentences in the language
E.g., “x + y = 4” is a well-formed sentence, whereas “x4y+ =” is not

● Semantics defines the “meaning” of sentences
○ The semantics defines the truth of each sentence with respect to each possible world (i.e., model)
E.g., the sentence “x + y = 4” is true in a world where x is 2 and y is 2,
but false in a world where x is 1 and y is 1

5.3 Entailment
● Entailment means that one thing follows from another:
● Knowledge base KB entails sentence α if and only if α is true in all worlds
where KB is true
KB |= α
● E.g., KB containing “the Giants won” and “the Reds won” entails “Either the Giants won
or the Reds won”

5.3 Models
● Models are formally structured worlds with respect to which truth can be evaluated
● We say m is a model of a sentence α if α is true in m
● M(α) is the set of all models of α
● Then KB |= α if and only if M(KB) ⊆ M(α)

E.g., KB = “the Giants won and the Reds won”
α = “the Giants won”
Then M(KB) ⊆ M(α), so KB |= α
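The subset test M(KB) ⊆ M(α) can be checked directly by enumerating models. A small sketch for the Giants/Reds example; the symbol names `G` and `R` are my own shorthand:

```python
from itertools import product

symbols = ["G", "R"]          # G = "the Giants won", R = "the Reds won"

def models(sentence):
    """Set of assignments (models) in which `sentence` is true."""
    return {vals for vals in product([True, False], repeat=len(symbols))
            if sentence(dict(zip(symbols, vals)))}

kb    = lambda m: m["G"] and m["R"]   # KB = Giants won ∧ Reds won
alpha = lambda m: m["G"]              # α = Giants won

entails = models(kb) <= models(alpha)   # KB |= α iff M(KB) ⊆ M(α)
print(entails)                          # True
```

Note the converse fails: M(α) is not a subset of M(KB), so α does not entail KB.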
5.3 Inference and Entailment
● An inference algorithm is a procedure for deriving a sentence from the KB
● If an inference algorithm i can derive α from KB, we write
KB ⊢i α
which is pronounced “α is derived from KB by i” or “i derives α from KB”
OR: the sentence α is inferred from KB using algorithm i.

5.4 Propositional logic: Syntax
● Propositional logic is the simplest logic - illustrates basic ideas
● The proposition symbols P1, P2, … are sentences
● If S is a sentence, ¬S is a sentence (negation)
● If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
● If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
● If S1 and S2 are sentences, S1 ⇒ S2 is a sentence (implication)
● If S1 and S2 are sentences, S1 ⇔ S2 is a sentence (biconditional)

5.4 Propositional logic: Semantics
● Each model specifies true/false for each proposition symbol
E.g., P1,2 = true, P2,2 = true, P3,1 = false

● Rules for evaluating truth with respect to a model m:

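These evaluation rules amount to a short recursive procedure. The sketch below assumes sentences encoded as nested tuples (an encoding of my own choosing, not from the slides), with a model as a dict from symbol to True/False:

```python
def pl_true(s, m):
    """Evaluate sentence s (a symbol or nested tuple) in model m (dict)."""
    if isinstance(s, str):                  # proposition symbol
        return m[s]
    op, *args = s
    if op == "not":
        return not pl_true(args[0], m)
    if op == "and":
        return all(pl_true(a, m) for a in args)
    if op == "or":
        return any(pl_true(a, m) for a in args)
    if op == "=>":                          # implication
        return (not pl_true(args[0], m)) or pl_true(args[1], m)
    if op == "<=>":                         # biconditional
        return pl_true(args[0], m) == pl_true(args[1], m)
    raise ValueError(f"unknown operator {op!r}")

m = {"P12": True, "P22": True, "P31": False, "P21": False, "B11": True}
print(pl_true(("<=>", "B11", ("or", "P12", "P21")), m))   # True
```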
5.4 Truth tables for connectives

5.4 A simple knowledge base - Wumpus world sentences
Px,y is true if there is a pit in [x, y].

Bx,y is true if the agent perceives a breeze in [x, y].

There is no pit in [1,1]: R1 : ¬P1,1 .

A square is breezy if and only if there is a pit in a neighboring square.


R2: B1,1 ⇔ (P1,2 ∨ P2,1).
R3 : B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1).
The breeze percepts for the first two squares the agent visits:
R4 : ¬B1,1.
R5 : B2,1.

5.4 A simple knowledge base - Wumpus world sentences
● Goal: to decide whether KB |= α for some sentence α
E.g., take α = ¬P1,2 and prove KB |= ¬P1,2
● A model-checking approach:
○ enumerate the models
○ check that α is true in every model in which KB is true

5.4 A simple knowledge base - Wumpus world sentences
With 7 symbols, there are 2^7 = 128 possible models; in 3 of these, KB is true.
In all 3 of those models ¬P1,2 is true, i.e., there is no pit in [1,2].

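This count can be verified by brute force: enumerate all 2^7 assignments, keep those satisfying R1-R5, and check ¬P1,2 in each. A sketch, where names like `P11` are shorthand for P1,1:

```python
from itertools import product

symbols = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]
iff = lambda a, b: a == b

def kb_true(m):
    return (not m["P11"]                                          # R1
            and iff(m["B11"], m["P12"] or m["P21"])               # R2
            and iff(m["B21"], m["P11"] or m["P22"] or m["P31"])   # R3
            and not m["B11"]                                      # R4
            and m["B21"])                                         # R5

kb_models = [dict(zip(symbols, v))
             for v in product([True, False], repeat=7)
             if kb_true(dict(zip(symbols, v)))]

print(len(kb_models))                          # 3 models satisfy the KB
print(all(not m["P12"] for m in kb_models))    # True: KB |= ¬P1,2
```

The three surviving models differ only in whether the pit is in [2,2], [3,1], or both.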
5.5 Propositional Theorem Proving
● Determine entailment by theorem proving: applying rules of inference directly to the
sentences in our knowledge base.
● Some additional concepts related to entailment:
○ Logical equivalence
○ Validity
○ Satisfiability

5.5 Logical equivalence
Two sentences are logically equivalent iff true in same models:
α ≡ β if and only if α |= β and β |= α

5.5 Validity and satisfiability
● A sentence is valid if it is true in all models, e.g., True, A∨¬A, A ⇒ A
○ Valid sentences are also known as tautologies
○ Validity is connected to inference:
KB |= α if and only if (KB ⇒ α) is valid

● A sentence is satisfiable if it is true in some models, e.g., A ∨ B, C


○ A sentence is unsatisfiable if it is true in no models
E.g., A ∧ ¬A
○ Satisfiability is connected to inference:
KB |= α if and only if (KB ∧ ¬α) is unsatisfiable

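Both connections can be demonstrated on a toy KB. The sketch below uses KB = A ∧ (A ⇒ B) and α = B, an example of my own choosing (KB |= α by Modus Ponens):

```python
from itertools import product

symbols = ["A", "B"]
kb    = lambda m: m["A"] and ((not m["A"]) or m["B"])   # KB = A ∧ (A ⇒ B)
alpha = lambda m: m["B"]                                # α = B

def all_models():
    return (dict(zip(symbols, v)) for v in product([True, False], repeat=2))

# Validity route: KB |= α iff (KB ⇒ α) is true in every model
valid = all((not kb(m)) or alpha(m) for m in all_models())

# Satisfiability route: KB |= α iff (KB ∧ ¬α) is true in no model
unsat = not any(kb(m) and not alpha(m) for m in all_models())

print(valid, unsat)   # True True
```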
5.5 Inference and proofs
● Inference rules: Modus Ponens, And-Elimination, logical equivalences
● E.g. Wumpus world
R1 : ¬P1,1
R2 : B1,1 ⇔ (P1,2 ∨ P2,1)
R3 : B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
R4 : ¬B1,1
R5 : B2,1
Prove ¬P1,2:
Apply biconditional elimination to R2 to obtain
R6 : (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
Apply And-Elimination to R6 to obtain
R7 : ((P1,2 ∨ P2,1) ⇒ B1,1)
Logical equivalence for contrapositives gives
R8 : (¬B1,1 ⇒ ¬(P1,2 ∨ P2,1))
Apply Modus Ponens with R8 and the percept R4 to obtain
R9 : ¬(P1,2 ∨ P2,1)
Apply De Morgan’s rule, giving the conclusion
R10 : ¬P1,2 ∧ ¬P2,1
5.5 Proof by resolution
● Conjunctive Normal Form (CNF—universal)
conjunction of clauses (i.e., disjunctions of literals)
E.g., (A ∨ ¬B) ∧ (B ∨ ¬C ∨ ¬D)

Convert R2 : B1,1 ⇔ (P1,2 ∨ P2,1) into CNF


1. Eliminate⇔
(B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1) .

2. Eliminate ⇒
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1) .

3. Apply logical equivalences

(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1) .

4. Apply the distributivity law
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
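The four steps can be sketched as a small converter over tuple-encoded sentences (an illustrative implementation, not the only way to do it). It also checks, by truth table, that the CNF of R2 is equivalent to the original:

```python
from itertools import product

def eliminate_iff(s):
    """Step 1: a <=> b becomes (a => b) ∧ (b => a)."""
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_iff(a) for a in args]
    if op == "<=>":
        a, b = args
        return ("and", ("=>", a, b), ("=>", b, a))
    return (op, *args)

def eliminate_implies(s):
    """Step 2: a => b becomes ¬a ∨ b."""
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_implies(a) for a in args]
    if op == "=>":
        a, b = args
        return ("or", ("not", a), b)
    return (op, *args)

def push_not(s):
    """Step 3: move ¬ inward (double negation, De Morgan)."""
    if isinstance(s, str):
        return s
    op, *args = s
    if op == "not":
        a = args[0]
        if isinstance(a, str):
            return s
        if a[0] == "not":
            return push_not(a[1])
        if a[0] == "and":
            return ("or", *[push_not(("not", x)) for x in a[1:]])
        if a[0] == "or":
            return ("and", *[push_not(("not", x)) for x in a[1:]])
    return (op, *[push_not(a) for a in args])

def distribute(s):
    """Step 4: distribute ∨ over ∧."""
    if isinstance(s, str) or s[0] == "not":
        return s
    op, *args = s
    args = [distribute(a) for a in args]
    if op == "or":
        for i, a in enumerate(args):
            if not isinstance(a, str) and a[0] == "and":
                rest = args[:i] + args[i + 1:]
                return distribute(("and", *[("or", c, *rest) for c in a[1:]]))
    return (op, *args)

def to_cnf(s):
    return distribute(push_not(eliminate_implies(eliminate_iff(s))))

def pl_true(s, m):
    """Truth-table evaluator used only for the sanity check below."""
    if isinstance(s, str):
        return m[s]
    op, *args = s
    if op == "not": return not pl_true(args[0], m)
    if op == "and": return all(pl_true(a, m) for a in args)
    if op == "or":  return any(pl_true(a, m) for a in args)
    if op == "=>":  return (not pl_true(args[0], m)) or pl_true(args[1], m)
    if op == "<=>": return pl_true(args[0], m) == pl_true(args[1], m)

# R2 : B1,1 <=> (P1,2 ∨ P2,1)
r2 = ("<=>", "B11", ("or", "P12", "P21"))
cnf = to_cnf(r2)

# Sanity check: the CNF must be true in exactly the same models as R2
syms = ["B11", "P12", "P21"]
equivalent = all(
    pl_true(r2, dict(zip(syms, v))) == pl_true(cnf, dict(zip(syms, v)))
    for v in product([True, False], repeat=3))
print(equivalent)   # True
```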
5.5 Proof by resolution

● Resolution inference rule (for CNF):
from (l1 ∨ … ∨ lk) and (m1 ∨ … ∨ mn), infer
(l1 ∨ … ∨ li−1 ∨ li+1 ∨ … ∨ lk ∨ m1 ∨ … ∨ mj−1 ∨ mj+1 ∨ … ∨ mn)
where li and mj are complementary literals (i.e., one is the negation of the other).

5.5 Proof by resolution
● Inference procedures based on a resolution algorithm uses the principle of proof by
contradiction
KB |= α if and only if (KB ∧ ¬α) is unsatisfiable

● Steps:
1. (KB ∧ ¬α) is converted into CNF
2. The resolution rule is applied to the resulting clauses, a new clause is added to the set if it is not
already present
3. The process continues until one of two things happens:
i. there are no new clauses that can be added, in which case KB does not entail α;
ii. two clauses resolve to yield the empty clause (equivalent to False), in which case KB entails α.

5.5 Proof by resolution
E.g. Wumpus world

R1 : ¬P1,1.
R2 : B1,1 ⇔ (P1,2 ∨ P2,1).
R3 : B2,1 ⇔ (P1,1 ∨P2,2 ∨ P3,1).
R4 : ¬B1,1.
R5 : B2,1.

Prove ¬P1,2 by resolution

5.5 Proof by resolution
KB = R2 ∧ R4 = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1
α = ¬P1,2
1. convert (KB ∧ ¬α) to CNF, noting ¬α = P1,2:
(¬P1,2 ∨ B1,1) ∧ (¬B1,1 ∨ P1,2 ∨ P2,1)
∧ (¬P2,1 ∨ B1,1) ∧ ¬B1,1 ∧ P1,2
2. resolve pairs
(¬P1,2 ∨ B1,1), ¬B1,1: ¬P1,2
(¬P2,1 ∨ B1,1), ¬B1,1: ¬P2,1
(resolvents such as (¬P1,2 ∨ B1,1) with (¬B1,1 ∨ P1,2 ∨ P2,1) are tautologies and can be discarded)
3. resolve pairs
¬P1,2, P1,2: empty clause
Result: KB ∧ ¬α is unsatisfiable, so KB |= ¬P1,2
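The same refutation can be run mechanically. In this sketch a clause is a frozenset of literal strings, `~` marks negation, and the negated query P1,2 is added to the CNF clauses of KB = R2 ∧ R4:

```python
from itertools import combinations

def negate(lit):
    """~P <-> P"""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(ci, cj):
    """All non-tautological resolvents of two clauses."""
    out = []
    for lit in ci:
        if negate(lit) in cj:
            resolvent = (ci - {lit}) | (cj - {negate(lit)})
            if not any(negate(l) in resolvent for l in resolvent):
                out.append(frozenset(resolvent))
    return out

def pl_resolution(clauses):
    """True iff the clause set is unsatisfiable (empty clause derivable)."""
    clauses = set(clauses)
    while True:
        new = set()
        for ci, cj in combinations(clauses, 2):
            for r in resolve(ci, cj):
                if not r:
                    return True            # empty clause: contradiction found
                new.add(r)
        if new <= clauses:
            return False                   # nothing new: no contradiction
        clauses |= new

# CNF of KB = R2 ∧ R4, plus the negated query ¬α = P1,2
clauses = [frozenset({"~P12", "B11"}),
           frozenset({"~B11", "P12", "P21"}),
           frozenset({"~P21", "B11"}),
           frozenset({"~B11"}),
           frozenset({"P12"})]
print(pl_resolution(clauses))   # True, so KB |= ¬P1,2
```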
5.5 Proof by resolution
Proof by contradiction, i.e., show KB ∧ ¬α unsatisfiable
KB = (B1,1 ⇔ (P1,2 ∨ P2,1)) ∧ ¬B1,1
α = ¬P1,2
5.5 Horn clauses and definite clauses
● Definite clause: a disjunction of literals of which exactly one is positive.
E.g., (¬L1,1 ∨ ¬Breeze ∨ B1,1) is a definite clause
● Horn clause: a disjunction of literals of which at most one is positive
● Goal clauses: clauses with no positive literals

Figure 7.14 A grammar for conjunctive normal form, Horn clauses, and definite clauses.
5.5 Forward chaining
Idea: fire any rule whose premises are satisfied in the KB,
add its conclusion to the KB, until query is found or no further inferences can be made.

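This idea is commonly implemented by tracking, for each rule, a count of premises not yet known to be true. The sketch below runs it on a small Horn KB of my own choosing (P⇒Q, L∧M⇒P, B∧L⇒M, A∧P⇒L, A∧B⇒L, with facts A and B):

```python
from collections import deque

def fc_entails(rules, facts, query):
    """Forward chaining for definite clauses.
    rules: list of (set_of_premises, conclusion); facts: known symbols."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (prem, concl) in enumerate(rules):
            if p in prem:
                count[i] -= 1
                if count[i] == 0:          # all premises satisfied: fire the rule
                    agenda.append(concl)
    return False

rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(fc_entails(rules, ["A", "B"], "Q"))   # True
```

With only fact A, no rule can fire and the query fails, illustrating that FC stops when no further inferences can be made.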
5.5 Forward chaining
● In AND–OR graphs,
○ multiple links joined by an arc indicate a conjunction
○ multiple links without an arc indicate a disjunction
● How the graphs work:
○ The known leaves are set; inference propagates up the graph as far as possible.
○ Where a conjunction appears, the propagation waits until all the conjuncts are known before proceeding.

5.5 Backward chaining
● Idea: work backwards from the query q:
○ check if q is known already, or
○ prove by BC all premises of some rule concluding q

● Avoid loops: check if new subgoal is already on the goal stack


● Avoid repeated work: check if new subgoal
○ 1) has already been proved true, or
○ 2) has already failed

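A recursive sketch of backward chaining over the same style of definite-clause KB (rules are (premises, conclusion) pairs; the example KB is my own). The goal stack guards against loops; a fuller version would also memoize proved and failed subgoals to avoid repeated work:

```python
def bc_entails(rules, facts, query, stack=frozenset()):
    """Backward chaining for definite clauses.
    rules: list of (set_of_premises, conclusion); facts: known symbols."""
    if query in facts:
        return True
    if query in stack:                     # loop check: goal already on the stack
        return False
    for prem, concl in rules:
        if concl == query:                 # try each rule concluding the query
            if all(bc_entails(rules, facts, p, stack | {query}) for p in prem):
                return True
    return False

rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(bc_entails(rules, {"A", "B"}, "Q"))   # True
```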
5.5 Forward vs. backward chaining
● FC is data-driven, cf. automatic, unconscious processing,
○ e.g., object recognition, routine decisions
○ May do lots of work that is irrelevant to the goal

● BC is goal-driven, appropriate for problem-solving
○ e.g., Where are my keys? How do I get into a PhD program?
